I've ended up building my workflow engine directly in Python, despite YAML being the default choice for LLMs.
I found that YAML had some drawbacks:
* LLMs don't have an inherent understanding of YAML conventions. They tend to be overly verbose. Python code solved this because "good" code is generally as short as you need.
* YAML isn't really composable. Yes, you can technically compose it, but you'll be fighting the LLM the entire time. Python solved this because the LLM knows how to decouple code.
* I want _some_ things to remain programmatic. Having Python solves that.
* Pretty much any programming language would do. Python just feels like the default for LLM-centric code.
There used to be a project called Benthos (acquired and rebranded by Redpanda in 2024) that was amazing; you might want to draw some inspiration from it.
However, durable workflows have also gained popular acceptance as functional design reaches a wider audience.
While Temporal is the most popular choice when it comes to durable workflows, DBOS (cofounded by the father of PostgreSQL) is my personal favorite.
At the moment, orchestration in DBOS has certain gaps - you might very well consider spending your effort on closing those gaps. The value there would be phenomenal!
That said, DBOS really makes durable workflows accessible and approachable. Having already used Temporal, I think you'll really appreciate how quickly you can get started with DBOS. I forget whether they support SQLite, but if you have a PostgreSQL server set up, you really don't need anything else to write your first few DBOS durable workflows (vs. needing a Temporal server or cluster).
Let me know if this gets you interested in trying it out. I first learned about Temporal from Mitchell Hashimoto, as they were using it for HashiCorp Cloud. Eventually I discovered DBOS, and now all my personal projects are on DBOS.
There’s something about seeing the ground truth, in full, in one place, when you’re trying to understand it, or troubleshoot it.
P.S. I'm the author of a similar solution:
There's no need for humans to write DAGs anymore, yet they remain human-readable. I truly believe this is the future of workflow orchestration.
That is a pretty bold claim for a repo that has existed for a few days and has 0 issues, PRs, etc...
Right??
There was a study recently showing that LLMs prefer resumes written by LLMs over those written by humans. Stands to reason they would prefer APIs written by LLMs too.
This is probably the early days of such intentionally simplified agentic semantic primitives like “DAG Workflow” where the answer for why not Temporal is that LLMs prefer different things than humans.
If you want to roll your own, you build a dependency graph (a dict) of the functions you want to call, Python already has graphlib.TopologicalSorter built in that can do this for you. Throw in logging and the tenacity library for retries and you’re set.
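A minimal sketch of that roll-your-own approach, using only the standard library. The task functions and the dependency dict are made up for illustration; for retries you'd wrap each task with tenacity's `@retry` decorator:

```python
import logging
from graphlib import TopologicalSorter

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mini-dag")

# Hypothetical tasks; in a real pipeline these would do actual work.
def extract():   log.info("extract")
def transform(): log.info("transform")
def load():      log.info("load")
def report():    log.info("report")

# Dependency graph: each key maps to the set of tasks it depends on.
graph = {
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

tasks = {"extract": extract, "transform": transform,
         "load": load, "report": report}

# static_order() yields task names in a valid execution order.
for name in TopologicalSorter(graph).static_order():
    tasks[name]()  # wrap with tenacity's @retry here if you need retries
```

`TopologicalSorter` also has a `prepare()`/`get_ready()`/`done()` API if you later want to dispatch independent tasks in parallel.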
I've used Dagster but I can't compare it to Airflow. In terms of DX, though, I've found Dagster pretty easy to use. Instead of writing their own DSL, they have a Python library that lets you tag your pre-made methods as @ops and string them together into a DAG.
That being said, that's not this project.
I want something that uses BPML for actual business workflows.
Just seeing YAML used for workflows in this age makes me automatically nope out.
But that's only the start. There are a lot of other things I would expect of a new workflow orchestrator in 2026 so if you are not comparing yourself to the competition you probably don't know what you're getting yourself into.
I'm mainly focusing on the portability aspect of it (e.g. use TS/Python/etc. to define the workflow/steps, or just a simple YAML file).
I recommend checking out https://github.com/peterkelly/rex and also my PhD thesis on the topic https://www.pmkelly.net/publications/thesis.pdf.
The gap in flexibility between DAG-only and a full language designed for the task is a significant one.
That's kind of my (not the project's) vision for PRQL - a general LINQ type embeddable data transformation language.
Unfortunately no time to work on it these days.
https://insitro.github.io/redun/
Fun fact: a DAG, after topological sorting, is a list
Many people need the efficiency of running things in parallel. But if you don’t (like if you’re running reporting/ETL stuff overnight), you can skip a lot of the complexity and just run a list of tasks in the right order.
Or put another way, before you adopt a DAG orchestrator (and all the time evaluating your options), remember you can just run the same steps as a list and get something shipped, and the DAG stuff is an optimization you can tackle in phase 2.
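Put concretely, the phase-1 version can be nothing more than a list of callables run in order. The step names here are illustrative:

```python
results = []

# Hypothetical overnight ETL steps; each is just a function.
def pull_orders():  results.append("pull")
def clean_orders(): results.append("clean")
def build_report(): results.append("report")

# The "DAG": just a list, already written down in dependency order.
steps = [pull_orders, clean_orders, build_report]

for step in steps:
    step()  # runs sequentially; no orchestrator needed

print(results)  # ['pull', 'clean', 'report']
```

When you later need parallelism, the list becomes the dependency dict from the graphlib approach above, and nothing about the step functions themselves has to change.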