Visualize Cron Schedules in Databricks and Apache Airflow
Databricks Jobs use a Quartz-compatible cron format, while Apache Airflow uses standard 5-field cron with additional @daily and similar macros. Both run in UTC by default. Paste your expression into our free crontab visualizer (after adapting Databricks Quartz format) to verify next run times before deploying.
This guide covers cron scheduling for data engineers working with Databricks, Airflow, dbt, and Prefect.
Cron Scheduling in Apache Airflow
Airflow DAGs use the schedule_interval (Airflow 1.x) or schedule (Airflow 2.4+) parameter. It accepts standard 5-field cron expressions and preset strings:
```python
from airflow import DAG
from datetime import datetime

dag = DAG(
    "my_pipeline",
    schedule="0 6 * * 1-5",  # 6 AM UTC weekdays
    start_date=datetime(2026, 1, 1),
    catchup=False,
)
```
Airflow preset macros:
| Preset | Cron equivalent | English |
|---|---|---|
| @once | — | Run once, then never again |
| @hourly | 0 * * * * | Every hour at :00 |
| @daily | 0 0 * * * | Every day at midnight UTC |
| @weekly | 0 0 * * 0 | Every Sunday at midnight UTC |
| @monthly | 0 0 1 * * | 1st of the month at midnight UTC |
| @yearly | 0 0 1 1 * | January 1st at midnight UTC |
Critical Airflow timing behavior: Airflow's execution date is the start of the interval, not when the job actually runs. A daily DAG with schedule='@daily' and start_date=2026-01-01 doesn't execute the 2026-01-01 run until midnight on January 2nd. This "T+1" behavior surprises many data engineers. Set catchup=False to avoid backfilling all historical runs on first deployment.
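The T+1 rule can be sketched in plain Python. The `airflow_run_time` helper below is illustrative only, not an Airflow API — it just encodes the rule that a run fires at the end of its data interval:

```python
from datetime import datetime, timedelta

def airflow_run_time(logical_date: datetime, interval: timedelta) -> datetime:
    """Airflow fires a scheduled run at the END of its data interval."""
    return logical_date + interval

# The @daily run with logical date 2026-01-01 fires at midnight on Jan 2
fires_at = airflow_run_time(datetime(2026, 1, 1), timedelta(days=1))
```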
Cron Scheduling in Databricks Jobs
Databricks Jobs use Quartz cron syntax (6 fields: seconds minutes hours day-of-month month day-of-week). Configure via the UI, the Jobs REST API, or Terraform:
Databricks Jobs API — `quartz_cron_expression`:

```json
{
  "name": "Daily ETL",
  "schedule": {
    "quartz_cron_expression": "0 0 6 * * ?",
    "timezone_id": "UTC"
  }
}
```
The Quartz format adds seconds as the first field. 0 0 6 * * ? means: seconds=0, minutes=0, hours=6, every day (*), every month (*), day-of-week=? (no specific day). In standard 5-field cron: 0 6 * * * — every day at 6 AM.
To use the visualizer with Databricks expressions: drop the first field (seconds), replace ? with *, and paste the remaining 5 fields.
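The conversion is mechanical, so a small helper (a sketch, not part of any platform's API) can do it for you:

```python
def quartz_to_five_field(expr: str) -> str:
    """Convert a Databricks Quartz cron expression to standard 5-field cron:
    drop the leading seconds field and map '?' to '*'."""
    fields = expr.split()
    if len(fields) not in (6, 7):  # Quartz allows an optional 7th (year) field
        raise ValueError(f"expected 6 or 7 Quartz fields, got {len(fields)}")
    return " ".join("*" if f == "?" else f for f in fields[1:6])

quartz_to_five_field("0 0 6 * * ?")        # → "0 6 * * *"
quartz_to_five_field("0 0 8 ? * MON-FRI")  # → "0 8 * * MON-FRI"
```

One caveat: Quartz-only tokens such as L, W, and # have no 5-field equivalent, so expressions using them can't be converted this way.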
Databricks timezone support: The timezone_id field accepts IANA timezone names (America/New_York, Asia/Tokyo, etc.). Unlike GitHub Actions, you can specify a timezone — the schedule runs in that timezone, handling DST automatically.
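Python's standard zoneinfo module illustrates why this matters: 6 AM America/New_York maps to a different UTC hour in winter than in summer, which a fixed UTC cron expression cannot capture:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

eastern = ZoneInfo("America/New_York")
utc = ZoneInfo("UTC")

# A schedule pinned to 6 AM Eastern drifts in UTC across DST transitions
winter = datetime(2026, 1, 15, 6, 0, tzinfo=eastern).astimezone(utc)
summer = datetime(2026, 7, 15, 6, 0, tzinfo=eastern).astimezone(utc)
print(winter.hour, summer.hour)  # 11 in January (EST), 10 in July (EDT)
```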
Common Data Pipeline Cron Schedules for Airflow and Databricks
| Goal | 5-field (Airflow) | 6-field (Databricks Quartz) |
|---|---|---|
| Daily at 6 AM UTC | 0 6 * * * | 0 0 6 * * ? |
| Hourly | 0 * * * * | 0 0 * * * ? |
| Every 30 minutes | */30 * * * * | 0 */30 * * * ? |
| Weekdays 8 AM UTC | 0 8 * * 1-5 | 0 0 8 ? * MON-FRI |
| Weekly Sunday midnight | 0 0 * * 0 | 0 0 0 ? * SUN |
| First of month midnight | 0 0 1 * * | 0 0 0 1 * ? |
| Twice daily (6 AM, 6 PM) | 0 6,18 * * * | 0 0 6,18 * * ? |
Paste the 5-field version into the crontab visualizer to see the next 20 run times and confirm the schedule is correct before configuring the pipeline.
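If you'd rather sanity-check in code than in the browser, here is a stdlib-only sketch of the "weekdays at a fixed hour" pattern (0 H * * 1-5). It handles only that one pattern, not general cron:

```python
from datetime import datetime, timedelta

def next_weekday_run(after: datetime, hour: int = 6) -> datetime:
    """Next fire time for '0 <hour> * * 1-5' (weekdays at HH:00)."""
    candidate = after.replace(hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= after:
        candidate += timedelta(days=1)
    while candidate.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        candidate += timedelta(days=1)
    return candidate

# Saturday 2026-01-03 at noon → next run is Monday 2026-01-05 at 06:00
next_weekday_run(datetime(2026, 1, 3, 12, 0))
```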
Cron Scheduling in dbt Cloud and Prefect
dbt Cloud: Uses standard 5-field cron in the job schedule settings. All times are UTC. The dbt Cloud UI has a visual schedule builder, but if you prefer writing expressions directly, use the cron input and validate with the visualizer.
Prefect 2.x: Schedules attach to deployments rather than to the flow itself, via CronSchedule, IntervalSchedule, or RRuleSchedule objects. CronSchedule accepts standard 5-field cron:

```python
from prefect import flow
from prefect.client.schemas.schedules import CronSchedule
from prefect.deployments import Deployment

@flow(description="Daily ETL")
def my_flow():
    pass

deployment = Deployment.build_from_flow(
    flow=my_flow,
    name="daily-etl",
    schedule=CronSchedule(cron="0 6 * * 1-5", timezone="America/New_York"),
)
deployment.apply()
```
Prefect supports the timezone parameter with IANA names, handling DST automatically — similar to Databricks.
For all these platforms: validate the 5-field cron expression in the visualizer before deploying. A misconfigured schedule that runs at 3 AM instead of 6 AM won't be obvious until you check the run history.
Try It Free — No Signup Required
Runs 100% in your browser. No account, no install, no limits.
Open Free Crontab Visualizer
Frequently Asked Questions
Does Airflow use UTC for cron schedules?
By default, yes — Airflow runs schedules in UTC. You can set a cluster-wide default timezone in airflow.cfg (the default_timezone setting), and an individual DAG becomes timezone-aware when you give it a timezone-aware start_date (typically created with pendulum). Without explicit configuration, treat all Airflow schedules as UTC.
What is the quartz_cron_expression format in Databricks?
Databricks uses Quartz cron format, which is a 6-field extension of standard cron: "seconds minute hour day-of-month month day-of-week." The seconds field (always 0 for minute-level scheduling) is the first field. Use ? in either day-of-month or day-of-week (not both). Example: "0 0 6 * * ?" means every day at 6:00:00 AM.
How do I schedule an Airflow DAG to run every weekday at a specific local time?
Convert your local time to UTC (since Airflow defaults to UTC), then write the cron expression: for 9 AM EST, use "0 14 * * 1-5" in winter or "0 13 * * 1-5" in summer (EDT). To avoid manual UTC conversion, give the DAG a timezone-aware start_date: DAG(schedule="0 9 * * 1-5", start_date=pendulum.datetime(2026, 1, 1, tz="America/New_York"), ...) — the schedule then follows that timezone, and Airflow handles DST automatically.

