Dagu
About this project
Dagu is a lightweight workflow engine with a modern Web UI. Workflows are defined in a simple, declarative YAML format and can be executed on a schedule. It supports shell commands, remote execution via SSH, and Docker images, and comes with error notifications and logging out of the box.
For a quick feel of how it works, take a look at the examples.
Motivation
Legacy systems often have complex, implicit dependencies between jobs. When hundreds of cron jobs run on a server, it becomes difficult to track those dependencies and to determine which job to rerun when one fails. It is also a hassle to SSH into a server to view logs and manually rerun shell scripts one by one. Dagu aims to solve these problems by letting you explicitly define and visualize pipeline dependencies as a DAG, and by providing a Web UI for checking dependencies, execution status, and logs, and for rerunning or stopping jobs with a single click.
Why Not Use an Existing Workflow Scheduler Like Airflow?
There are many existing tools such as Airflow, but most require you to write code in a programming language like Python to define your DAG. For systems that have been in operation for a long time, there may already be complex jobs with hundreds of thousands of lines of code written in languages like Perl or shell script. Adding another layer of complexity on top of this code can reduce maintainability. Dagu was designed to be easy to use, self-contained, and free of coding requirements, making it ideal for small projects.
How it Works
Dagu executes your workflows defined in a simple, declarative YAML format.
For example, a simple sequential DAG:
```yaml
schedule: "0 0 * * *" # Runs at 00:00 every day
steps:
  - echo "Hello, dagu!"
  - echo "This is a second step"
```
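Steps can also declare explicit dependencies to form a DAG rather than a simple sequence. A minimal sketch using named steps and the `depends` field (the step names and commands here are illustrative, not from the original):

```yaml
steps:
  - name: fetch
    command: echo "fetch data"      # runs first; has no dependencies
  - name: transform
    command: echo "transform data"
    depends:
      - fetch                       # waits for "fetch" to succeed
  - name: load
    command: echo "load data"
    depends:
      - transform                   # waits for "transform" to succeed
```

With this definition, `fetch` runs first, then `transform`, then `load`; if any step fails, downstream steps are not executed.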
Highlights
- Install by placing a single binary with zero dependencies
- Run without a DBMS or cloud services