| name | description | license |
|---|---|---|
| new-airflow-rule | This skill should be used when the user asks to "add a new airflow rule", "create an airflow lint rule", "implement an airflow inspection", "new AIR rule", or discusses creating Ruff linter rules in the airflow category. | Apache-2.0 |
This skill guides creating new Airflow-specific lint rules (AIR prefix) in the Ruff codebase.
Before writing any code, ask the user the following questions (if not already answered in their request):

- Airflow version target: Which version of Airflow does this rule target?
  - Airflow 2 only
  - Airflow 3 onward
  - Both Airflow 2 and 3
- DAG API targeting (for general best-practice rules AIR001-099 only): Airflow DAGs can be written using the TaskFlow API (decorators like `@task.branch`) or the standard operator API (`BranchPythonOperator`). Both are equally supported. Should this rule target:
  - TaskFlow API only (decorator-based)
  - Operator API only (operator-based)
  - Both (recommended — the rule should be DAG implementation-agnostic)

  If both: the rule will need two entry points — a statement-level dispatch for decorated functions and an expression-level dispatch for operator calls. Use a shared helper for the core analysis logic, and a `Kind` enum on the violation struct to produce context-specific diagnostic messages for each form.
- Local Airflow repository for validation: Do you have a local clone of the Airflow repository? If so, provide the path (e.g., `~/repositories/airflow`). If a local Airflow repo is available, use it after implementation to validate the rule against real-world code and check for false positives (see "Post-Implementation: Validate Against Airflow" below).
Code prefix validation: If a rule targets Airflow 3 (onward), its code MUST start with AIR3## (e.g., AIR301, AIR302, AIR311). If the user specified a code that doesn't follow this pattern, WARN them before proceeding.
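A minimal sketch of that prefix check in Python (hypothetical — the skill performs this validation conversationally, not in code):

```python
import re

def is_valid_air3_code(code: str) -> bool:
    """Check that an Airflow 3 rule code matches the AIR3## pattern (AIR3 + two digits)."""
    return re.fullmatch(r"AIR3\d{2}", code) is not None
```

For example, `is_valid_air3_code("AIR311")` holds, while a user-specified `"AIR211"` for an Airflow 3 rule should prompt a warning.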
Important distinction for AIR3xx rules: AIR3xx rules are migration rules that flag old-style (Airflow 2) imports/patterns to help migrate to Airflow 3. They should only match deprecated import paths — NOT the new airflow.sdk paths (which are the correct replacements). However, shared helpers that check context (e.g., "is this function decorated with @task?") should match both old and new paths, since deprecated patterns inside a task function need to be flagged regardless of which import style the decorator uses.
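The distinction can be made concrete with a Python sketch (hypothetical helper names; the real rules express this in Rust over `qualified_name.segments()`):

```python
def is_deprecated_task_path(segments: tuple) -> bool:
    """An AIR3xx migration rule matches only the old Airflow 2 path."""
    return segments == ("airflow", "decorators", "task")

def is_any_task_path(segments: tuple) -> bool:
    """A shared context helper matches both the old path and the airflow.sdk path."""
    return segments in (
        ("airflow", "decorators", "task"),
        ("airflow", "sdk", "task"),
    )
```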
Understand which category your rule belongs to before picking a code:
| Range | Category | Description |
|---|---|---|
| AIR001-099 | General best-practice | Style, readability, common mistakes (not version-specific) |
| AIR301 | Removed in 3.0 | Symbols/args fully removed in Airflow 3.0 with no compat layer |
| AIR302 | Moved to provider in 3.0 | Symbols moved to external provider packages (required migration) |
| AIR303 | Signature change in 3.0 | Function/method signatures changed (args renamed, reordered, etc.) |
| AIR311 | Suggested update for 3.0 | Deprecated with compat layer — still works but will break later |
| AIR312 | Suggested provider move for 3.0 | Deprecated compat layer for provider migrations |
| AIR321 | Moved in 3.1 | Symbols moved/deprecated in Airflow 3.1 |
Rules that target both Airflow 2 and 3 must handle both old (deprecated) and new (`airflow.sdk`) import paths. Match against both in `qualified_name.segments()`:

| Old Import Path (Deprecated) | New Import Path (`airflow.sdk`) |
|---|---|
| `airflow.decorators.dag` | `airflow.sdk.dag` |
| `airflow.decorators.task` | `airflow.sdk.task` |
| `airflow.decorators.task_group` | `airflow.sdk.task_group` |
| `airflow.decorators.setup` | `airflow.sdk.setup` |
| `airflow.decorators.teardown` | `airflow.sdk.teardown` |
| `airflow.models.dag.DAG` | `airflow.sdk.DAG` |
| `airflow.models.baseoperator.BaseOperator` | `airflow.sdk.BaseOperator` |
| `airflow.models.param.Param` | `airflow.sdk.Param` |
| `airflow.models.param.ParamsDict` | `airflow.sdk.ParamsDict` |
| `airflow.models.baseoperatorlink.BaseOperatorLink` | `airflow.sdk.BaseOperatorLink` |
| `airflow.sensors.base.BaseSensorOperator` | `airflow.sdk.BaseSensorOperator` |
| `airflow.hooks.base.BaseHook` | `airflow.sdk.BaseHook` |
| `airflow.notifications.basenotifier.BaseNotifier` | `airflow.sdk.BaseNotifier` |
| `airflow.utils.task_group.TaskGroup` | `airflow.sdk.TaskGroup` |
| `airflow.utils.context.Context` | `airflow.sdk.Context` |
| `airflow.datasets.Dataset` | `airflow.sdk.Asset` |
| `airflow.datasets.DatasetAlias` | `airflow.sdk.AssetAlias` |
| `airflow.datasets.DatasetAll` | `airflow.sdk.AssetAll` |
| `airflow.datasets.DatasetAny` | `airflow.sdk.AssetAny` |
| `airflow.models.connection.Connection` | `airflow.sdk.Connection` |
| `airflow.models.variable.Variable` | `airflow.sdk.Variable` |
| `airflow.io.*` | `airflow.sdk.io.*` |
Example pattern for matching both paths:

```rust
match qualified_name.segments() {
    // Match both old and new import paths
    ["airflow", "decorators", "task"] | ["airflow", "sdk", "task"] => { /* ... */ }
    ["airflow", "models", "dag", "DAG"] | ["airflow", "sdk", "DAG"] => { /* ... */ }
    _ => return,
}
```

Follow these steps in order. Each step is mandatory.
Before writing rule logic, search for existing utilities that can be reused:
- `ruff_python_semantic`: Check `SemanticModel` methods (e.g., `resolve_qualified_name`, `match_builtin_expr`, `match_typing_expr`) and `analyze/visibility.rs` for decorator-checking patterns.
- `ruff_python_ast`: Check `helpers.rs` for AST traversal utilities (e.g., `map_callable`, `ReturnStatementVisitor`).
- `crate::rules::airflow::helpers`: Check for existing airflow-specific helpers (e.g., `is_guarded_by_try_except`, `is_airflow_builtin_or_provider`, `is_method_in_subclass`, `generate_import_edit`).
- Existing airflow rules: Check rules like AIR301 (`removal_in_3.rs`) for patterns that may already exist or could be extracted.
If a pattern would be useful in multiple rules, extract it into `helpers.rs` as a shared utility rather than duplicating code. For example, AIR301 contains `is_airflow_task()` — if another rule needs the same check, move it to `helpers.rs`.
- Check existing codes in `crates/ruff_linter/src/codes.rs` under the `// airflow` section.
- Pick the next available code in the appropriate range.
- Name the struct following the convention: the name should make sense as "allow ${name}". For example, `TaskBranchAsShortCircuit` reads as "allow task branch as short circuit".
- Do NOT use prefixes like `Disallow` or `Banned`.
Create `crates/ruff_linter/src/rules/airflow/rules/<snake_case_name>.rs`. Choose the template below based on rule category.
````rust
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::{self as ast};
use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;

use crate::Violation;
use crate::checkers::ast::Checker;

/// ## What it does
/// <One-line description of what the rule checks for.>
///
/// ## Why is this bad?
/// <Explanation of why the flagged pattern is problematic.>
///
/// ## Example
/// ```python
/// <Python code that triggers the violation>
/// ```
///
/// Use instead:
/// ```python
/// <Corrected Python code>
/// ```
#[derive(ViolationMetadata)]
#[violation_metadata(preview_since = "NEXT_RUFF_VERSION")]
pub(crate) struct MyRuleName;

impl Violation for MyRuleName {
    #[derive_message_formats]
    fn message(&self) -> String {
        format!("<Diagnostic message shown to the user>")
    }
}

/// AIRxxx
pub(crate) fn my_rule_name(checker: &Checker, /* appropriate AST node */) {
    if !checker.semantic().seen_module(Modules::AIRFLOW) {
        return;
    }

    // Rule logic here...
    checker.report_diagnostic(MyRuleName, node.range());
}
````

For rules that flag removed/moved/renamed symbols, use the existing infrastructure in `helpers.rs`:
```rust
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::{self as ast, Expr};
use ruff_python_semantic::Modules;
use ruff_text_size::{Ranged, TextRange};

use crate::{FixAvailability, Violation};
use crate::checkers::ast::Checker;
use crate::rules::airflow::helpers::{Replacement, is_guarded_by_try_except};

/// ## What it does
/// Checks for uses of deprecated Airflow symbols removed in Airflow X.Y.
///
/// ## Why is this bad?
/// These symbols were removed/moved in Airflow X.Y and will cause runtime errors.
///
/// ## Example / Use instead blocks...
#[derive(ViolationMetadata)]
#[violation_metadata(preview_since = "NEXT_RUFF_VERSION")]
pub(crate) struct MyMigrationRule {
    deprecated: String,
    replacement: String,
}

impl Violation for MyMigrationRule {
    const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;

    #[derive_message_formats]
    fn message(&self) -> String {
        let MyMigrationRule { deprecated, replacement } = self;
        format!("`{deprecated}` is removed in Airflow X.Y; use `{replacement}` instead")
    }

    fn fix_title(&self) -> Option<String> {
        let MyMigrationRule { replacement, .. } = self;
        Some(format!("Use `{replacement}`"))
    }
}

pub(crate) fn my_migration_rule(checker: &Checker, expr: &Expr) {
    if !checker.semantic().seen_module(Modules::AIRFLOW) {
        return;
    }

    // Dispatch based on expression type:
    match expr {
        Expr::Attribute(ast::ExprAttribute { attr, .. }) => {
            check_name(checker, expr, attr.range());
        }
        Expr::Name(_) => {
            check_name(checker, expr, expr.range());
        }
        _ => {}
    }
}

fn check_name(checker: &Checker, expr: &Expr, ranged: TextRange) {
    let Some(qualified_name) = checker.semantic().resolve_qualified_name(expr) else {
        return;
    };

    let (replacement, module, name) = match qualified_name.segments() {
        ["airflow", "old_module", "OldName"] => (
            Replacement::Rename { module: "airflow.new_module", name: "NewName" },
            "airflow.old_module",
            "OldName",
        ),
        _ => return,
    };

    // Skip if guarded by try-except (conditional import):
    if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
        return;
    }

    let mut diagnostic = checker.report_diagnostic(
        MyMigrationRule {
            deprecated: name.to_string(),
            replacement: replacement.to_string(),
        },
        ranged,
    );

    // Optionally generate a fix:
    // if let Some(fix) = generate_import_edit(...) {
    //     diagnostic.set_fix(fix);
    // }
}
```

In `crates/ruff_linter/src/codes.rs`, add an entry under the `// airflow` section:
```rust
(Airflow, "xxx") => rules::airflow::rules::MyRuleName,
```

Keep the entries sorted by code number.
In `crates/ruff_linter/src/rules/airflow/rules/mod.rs`:

- Add `pub(crate) use my_rule_name::*;` in the use-declarations block (alphabetical order).
- Add `mod my_rule_name;` in the mod-declarations block (alphabetical order).

Do NOT add a duplicate module declaration in `crates/ruff_linter/src/rules/airflow/mod.rs` — only `rules/mod.rs` needs it.
In `crates/ruff_linter/src/checkers/ast/analyze/`:

- For statement-based rules (function defs, assignments, class defs): add to `statement.rs`
- For expression-based rules (function calls, attribute access): add to `expression.rs`

Pattern:

```rust
if checker.is_rule_enabled(Rule::MyRuleName) {
    airflow::rules::my_rule_name(checker, node);
}
```

Place the dispatch near other airflow rule dispatches for consistency.
Create `crates/ruff_linter/resources/test/fixtures/airflow/AIRxxx.py`. The fixture must include:

- Cases that SHOULD trigger the rule (with `# AIRxxx` comments)
- Cases that should NOT trigger the rule (edge cases, similar but valid patterns)
- For migration rules: cases guarded by try-except (should NOT trigger)

In `crates/ruff_linter/src/rules/airflow/mod.rs`, add a `#[test_case]` line:

```rust
#[test_case(Rule::MyRuleName, Path::new("AIRxxx.py"))]
```

Keep test cases sorted by rule code.
Always run `cargo fmt` before testing or committing. Rust formatting issues (import ordering, closure formatting, method chain line breaks) will cause CI failures.

```shell
cargo fmt -p ruff_linter
```

```shell
# Verify output manually first:
cargo run -p ruff -- check crates/ruff_linter/resources/test/fixtures/airflow/AIRxxx.py --no-cache --preview --select AIRxxx

# Run the test with RUFF_UPDATE_SCHEMA=1 to auto-update schemas (will fail the first time, generating a snapshot):
RUFF_UPDATE_SCHEMA=1 cargo nextest run -p ruff_linter -- "airflow::tests"

# Accept the snapshot:
cargo insta accept

# Verify the test passes:
RUFF_UPDATE_SCHEMA=1 cargo nextest run -p ruff_linter -- "airflow::tests"
```

```shell
cargo dev generate-all
cargo clippy -p ruff_linter --all-targets --all-features -- -D warnings
uvx prek run -a
```

If the user provided a path to a local Airflow repository during pre-implementation, run the new rule against it to check for false positives:

```shell
cargo run -p ruff -- check <airflow_repo_path> --no-cache --preview --select AIRxxx
```

Review the output:
- True positives: Violations that correctly flag the anti-pattern. Report the count and sample locations.
- False positives: Violations that flag code that is actually correct. If found:
- Identify the pattern causing the false positive.
- Update the rule logic to exclude it (e.g., add a guard clause).
- Add the false-positive pattern as a non-violation test case in the fixture.
- Re-run steps 8–11 to update snapshots and verify.
Report findings to the user before finalizing.
The `crates/ruff_linter/src/rules/airflow/helpers.rs` module provides shared utilities. Import from `crate::rules::airflow::helpers`.

Used by migration rules to describe what replaces a deprecated symbol:

- `Replacement` — for builtin/SDK moves:
  - `None` — no replacement available
  - `Message(&'static str)` — custom message, no auto-fix
  - `AttrName(&'static str)` — attribute renamed (e.g., `dataset` to `asset`)
  - `Rename { module, name }` — moved to a new module with a new name
  - `SourceModuleMoved { module, name }` — module changed, name stays
  - `SourceModuleMovedToSDK { module, name, version }` — moved to the SDK
  - `SourceModuleMovedWithMessage { module, name, message, suggest_fix }` — with a custom message
- `ProviderReplacement` — for provider migrations (AIR302/312):
  - `Rename { module, name, provider, version }` — moved to a provider with a rename
  - `SourceModuleMovedToProvider { module, name, provider, version }` — module changed
- `FunctionSignatureChange` — for AIR303:
  - `Message(&'static str)` — describes the signature change
Prevents false positives when symbols are conditionally imported:

```rust
use crate::rules::airflow::helpers::is_guarded_by_try_except;

// Skip if the usage is inside a try-except that catches ImportError/AttributeError:
if is_guarded_by_try_except(expr, "airflow.old_module", "OldName", checker.semantic()) {
    return;
}
```

This checks whether the expression is in a try-except block that:

- For imports: catches `ImportError` or `ModuleNotFoundError`, and the try block imports from the new location
- For attributes: catches `AttributeError`, and the try block accesses the new attribute
```rust
use crate::rules::airflow::helpers::{generate_import_edit, generate_remove_and_runtime_import_edit};

// When the symbol name changes (e.g., Dataset -> Asset):
if let Some(fix) = generate_import_edit(checker, stmt, "old_name", "new_module", "new_name") {
    diagnostic.set_fix(fix); // Safe edit
}

// When the module changes but the name stays (provider migration):
if let Some(fix) = generate_remove_and_runtime_import_edit(checker, stmt, "new_module", "name") {
    diagnostic.set_fix(fix); // Unsafe edit
}
```

```rust
use crate::rules::airflow::helpers::{is_airflow_builtin_or_provider, is_method_in_subclass};

// Check if a qualified name matches airflow.<module>.**.*<suffix> or a provider equivalent:
is_airflow_builtin_or_provider(segments, "operators", "Operator");
is_airflow_builtin_or_provider(segments, "secrets", "Backend");
is_airflow_builtin_or_provider(segments, "hooks", "Hook");

// Check if a method is defined in a subclass of a specific base:
is_method_in_subclass(function_def, semantic, "execute", |qn| {
    matches!(qn.segments(), ["airflow", "models" | "sdk", .., "BaseOperator"])
});
```

Note: `BaseOperator` task-execution-time methods include `execute`, `pre_execute`, and `post_execute`. All three run at task execution time (not DAG parse time).
When checking if a class inherits from a base class, use `any_qualified_base_class` from `ruff_python_semantic::analyze::class` instead of directly iterating `class_def.bases()`. This handles transitive inheritance (e.g., `class A(BaseOperator)` → `class B(A)` → `class C(B)`):

```rust
use ruff_python_semantic::analyze::class::any_qualified_base_class;

any_qualified_base_class(class_def, semantic, &|qn| {
    matches!(qn.segments(), ["airflow", "models" | "sdk", .., "BaseOperator"])
})
```

To check if a file is a Dag definition file, check for imports of `DAG` or `dag` from `airflow`. This is simpler and more reliable than checking for actual `DAG()` calls or `@dag` decorators, since types must be imported before use:

```rust
fn is_dag_file(semantic: &SemanticModel) -> bool {
    semantic.global_scope().binding_ids().any(|binding_id| {
        semantic
            .binding(binding_id)
            .as_any_import()
            .is_some_and(|import| {
                matches!(
                    import.qualified_name().segments(),
                    ["airflow", .., "DAG" | "dag"]
                )
            })
    })
}
```

When determining whether code is at module level vs inside a function (e.g., to vary diagnostic messages), prefer `semantic.current_scope().kind` over `semantic.current_statements().any(...)`. The scope approach correctly handles nested classes inside functions:
```rust
let in_function = matches!(
    checker.semantic().current_scope().kind,
    ScopeKind::Function(_) | ScopeKind::Lambda(_)
);
```

`current_scope()` returns the innermost scope, so `Variable.get()` inside a class body nested in a function returns `ScopeKind::Class` (not a function scope).
Important: Before writing a new decorator-checking function, check if one already exists in `helpers.rs` (e.g., `is_airflow_task` in AIR301). If a decorator check is needed by multiple rules, extract it to `helpers.rs` as a shared utility.
Use `map_callable` to handle both `@decorator` and `@decorator()` forms:

```rust
use ruff_python_ast::helpers::map_callable;

fn has_some_decorator(function_def: &StmtFunctionDef, checker: &Checker) -> bool {
    function_def.decorator_list.iter().any(|decorator| {
        let expr = map_callable(&decorator.expression);
        checker
            .semantic()
            .resolve_qualified_name(expr)
            .is_some_and(|qn| matches!(qn.segments(), ["airflow", "decorators", "some_name"]))
    })
}
```

For rules targeting both Airflow 2 and 3, match both old and new decorator paths:
}For rules targeting both Airflow 2 and 3, match both old and new decorator paths:
fn is_airflow_task(function_def: &StmtFunctionDef, semantic: &SemanticModel) -> bool {
function_def.decorator_list.iter().any(|decorator| {
semantic
.resolve_qualified_name(map_callable(&decorator.expression))
.is_some_and(|qn| matches!(qn.segments(),
["airflow", "decorators", "task"] | ["airflow", "sdk", "task"]
))
})
}For attribute-style decorators like @task.branch, check the Expr::Attribute and resolve the value part:
```rust
// Inside a helper that returns `bool`:
let expr = map_callable(&decorator.expression);
if let Expr::Attribute(ast::ExprAttribute { value, attr, .. }) = expr {
    if attr.as_str() == "branch" {
        return checker
            .semantic()
            .resolve_qualified_name(value)
            .is_some_and(|qn| matches!(qn.segments(), ["airflow", "decorators", "task"]));
    }
}
false
```

Also see `ruff_python_semantic::analyze::visibility` for generic decorator utilities: `is_staticmethod`, `is_classmethod`, `is_overload`, `is_abstract`, `is_property`, etc.
When a rule must detect the same anti-pattern in both `@task.<variant>`-decorated functions and operator callables (e.g., `BranchPythonOperator(python_callable=func)`), use this architecture:

- Shared analysis helper — extract the core logic into a private function that operates on a function body:

  ```rust
  fn could_be_short_circuit(body: &[Stmt]) -> bool { /* ... */ }
  ```

- Violation enum — add a `Kind` enum to produce context-specific messages:

  ```rust
  pub(crate) struct MyRule { kind: MyKind }

  #[derive(Debug, Clone, Copy, PartialEq, Eq)]
  enum MyKind { Decorator, Operator }
  ```

- Statement-level entry point (for `@task.<variant>` decorators) — dispatched from `statement.rs` on `StmtFunctionDef`:

  ```rust
  pub(crate) fn my_rule_decorator(checker: &Checker, function_def: &StmtFunctionDef) {
      // Check the decorator, then call the shared helper on function_def.body
  }
  ```

- Expression-level entry point (for operator calls) — dispatched from `expression.rs` on `ExprCall`. Resolve the `python_callable` argument to the function definition using the semantic model:

  ```rust
  pub(crate) fn my_rule_operator(checker: &Checker, call: &ExprCall) {
      // 1. Resolve call.func to a qualified name and match operator paths.
      // 2. Extract the python_callable keyword argument.
      // 3. Resolve the name to its function definition:
      let Expr::Name(name_expr) = &keyword.value else { return; };
      let Some(binding_id) = semantic.only_binding(name_expr) else { return; };
      let BindingKind::FunctionDefinition(scope_id) = semantic.binding(binding_id).kind else { return; };
      let ScopeKind::Function(function_def) = semantic.scopes[scope_id].kind else { return; };
      // 4. Call the shared helper on function_def.body.
  }
  ```

Operator import paths to match (example for `BranchPythonOperator`):

- `["airflow", "operators", "python", "BranchPythonOperator"]`
- `["airflow", "operators", "python_operator", "BranchPythonOperator"]` (legacy)
- `["airflow", "providers", "standard", "operators", "python", "BranchPythonOperator"]`

See AIR003 (`task_branch_as_short_circuit.rs`) for a complete working example.
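As a sanity check of those three paths, here is a small Python analogue (illustrative only — not part of the Ruff codebase, which does this matching in Rust):

```python
# The three BranchPythonOperator import paths listed above, as segment tuples.
BRANCH_OPERATOR_PATHS = {
    ("airflow", "operators", "python", "BranchPythonOperator"),
    ("airflow", "operators", "python_operator", "BranchPythonOperator"),  # legacy
    ("airflow", "providers", "standard", "operators", "python", "BranchPythonOperator"),
}

def is_branch_python_operator(qualified_name: str) -> bool:
    """Match a dotted qualified name against the known operator import paths."""
    return tuple(qualified_name.split(".")) in BRANCH_OPERATOR_PATHS
```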
Matching a call and its keyword arguments:

```rust
let Expr::Call(ast::ExprCall { func, arguments, .. }) = expr else { return; };

// Resolve the function being called:
let is_target = checker
    .semantic()
    .resolve_qualified_name(func)
    .is_some_and(|qn| matches!(qn.segments(), ["airflow", .., "SomeClass"]));

// Check keyword arguments:
if let Some(keyword) = arguments.find_keyword("some_arg") {
    // keyword.value is the argument value expression
}
```

Matching operators from both builtin and provider packages:

```rust
match qualified_name.segments() {
    // Builtin operators
    ["airflow", "operators", ..] => true,
    // Provider operators ("operators" must appear somewhere in the middle)
    ["airflow", "providers", rest @ ..] => {
        rest.iter().position(|&s| s == "operators")
            .is_some_and(|pos| pos + 1 < rest.len())
    }
    _ => false,
}
```

Use `ReturnStatementVisitor` to find all returns, including those in nested blocks:
```rust
use ruff_python_ast::helpers::ReturnStatementVisitor;
use ruff_python_ast::visitor::Visitor;

let mut visitor = ReturnStatementVisitor::default();
for stmt in &function_def.body {
    visitor.visit_stmt(stmt);
}
let returns = &visitor.returns;
```

Migration rules typically dispatch on expression type:
```rust
match expr {
    Expr::Call(ast::ExprCall { func, arguments, .. }) => {
        check_call_arguments(checker, func, arguments);
    }
    Expr::Attribute(ast::ExprAttribute { attr, .. }) => {
        check_name(checker, expr, attr.range());
    }
    Expr::Name(_) => {
        check_name(checker, expr, expr.range());
    }
    Expr::Subscript(ast::ExprSubscript { value, slice, .. }) => {
        check_subscript_access(checker, value, slice);
    }
    _ => {}
}
```

For migration rules, use struct fields to parameterize messages:
```rust
pub(crate) struct MyRule {
    deprecated: String,
    replacement: String,
}

impl Violation for MyRule {
    const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;

    #[derive_message_formats]
    fn message(&self) -> String {
        let MyRule { deprecated, replacement } = self;
        format!("`{deprecated}` is removed; use `{replacement}`")
    }

    fn fix_title(&self) -> Option<String> {
        let MyRule { replacement, .. } = self;
        Some(format!("Use `{replacement}`"))
    }
}
```

When adding new helper functions or methods, place the higher-level predicates or public-facing helpers above the lower-level utilities they call. That way reviewers see the intent/entry point first, followed by the supporting helpers (e.g., `in_airflow_task_function` before `is_airflow_task`).
Rule documentation (the doc comments on the rule struct) must follow these conventions. Use this exact section order (include only sections that apply):

- `## What it does` (required) — One-line description of what the rule checks for
- `## Why is this bad?` (required) — Explanation of why the pattern is problematic
- `## Example` (required) — Python code that triggers the violation
- `Use instead:` (required) — Corrected Python code
- `## Fix safety` (optional) — Notes about fix edge cases or when fixes might be unsafe
- `## Options` (optional) — Configuration settings that affect the rule
- `## References` (optional) — Links to external documentation
- Dag (capitalization):
  - Use `DAG` when referring to the class: "The `DAG()` constructor..."
  - Use `@dag` when referring to the decorator: "Functions decorated with `@dag`..."
  - Use "Dag" (proper noun, no backticks) when referring to a workflow as a general concept: "If your Dag does not have...", "the serialized Dag hash"
- Task terminology:
  - Use `@task` for the decorator
  - Use "task" (lowercase, no backticks) for the general concept
  - Use specific operator names in backticks: `PythonOperator`, `BranchPythonOperator`
- Airflow versions: Write as "Airflow 2", "Airflow 3", "Airflow 3.0", "Airflow 3.1" (capitalize, no backticks)
- Code references: Always use backticks for:
  - Class names: `DAG`, `BaseOperator`
  - Function/method names: `execute()`, `datetime.now()`
  - Decorators: `@dag()`, `@task.branch`
  - Parameters/arguments: `schedule`, `task_id`, `python_callable`
  - Module paths: `airflow.operators.python`, `airflow.sdk`
  - String literals in explanations: `` `"schedule"` ``
- Conciseness: Keep "What it does" to one sentence. Expand details in "Why is this bad?"
- Active voice: Prefer "Using X causes..." over "X may cause..." or "It is possible that X causes..."
- Imperative starts: Begin sentences in "Why is this bad?" with action verbs: "Using...", "This leads to...", "These symbols were removed..."
- Specific consequences: Don't just say "this is bad practice"—explain the actual impact (performance, compatibility, maintainability)
- Code examples: Keep them minimal but complete enough to demonstrate the issue. Include necessary imports.
Error messages (the `message()` method) should:

- Be a generic description of the problem — do NOT include context-specific suggestions in the message
- Use backticks around code symbols: `` `{deprecated}` is removed in Airflow 3.0 ``
- Avoid starting with "Checks for" or "Detects" (that phrasing is for documentation, not messages)

Fix titles (the `fix_title()` method) should:

- Contain context-specific suggestions (e.g., "Use Jinja templates" vs "Move into a `@task`-decorated function")
- Start with an imperative verb: "Use `schedule`", "Replace with...", "Remove..."
- Be very brief (2-5 words when possible)
- Even without an actual auto-fix, `fix_title()` is displayed as a separate `help: ...` line in diagnostics
Pattern: Generic message + context-specific fix title (from AIR003):

```rust
fn message(&self) -> String {
    "`Variable.get()` outside of a task".to_string() // Generic
}

fn fix_title(&self) -> Option<String> {
    if self.in_function {
        Some("Move into a `@task`-decorated function".to_string()) // Context-specific
    } else {
        Some("Use Jinja templates instead".to_string()) // Context-specific
    }
}
```

Good documentation structure (from AIR002):
````rust
/// ## What it does
/// Checks for a `DAG()` class or `@dag()` decorator without an explicit
/// `schedule` parameter.
///
/// ## Why is this bad?
/// The default value of the `schedule` parameter on Airflow 2 is
/// `timedelta(days=1)`, which is almost never what a user is looking for.
/// Airflow 3 changed the default value to `None`, which would break
/// existing Dags using the implicit default.
///
/// ## Example
/// ```python
/// from airflow import DAG
///
/// # Using the implicit default schedule.
/// dag = DAG(dag_id="my_dag")
/// ```
///
/// Use instead:
/// ```python
/// from datetime import timedelta
/// from airflow import DAG
///
/// dag = DAG(dag_id="my_dag", schedule=timedelta(days=1))
/// ```
````
Note the use of:

- "Dag" (proper noun) in prose: "would break existing Dags"
- `DAG()` and `@dag()` for the class and the decorator
- Backticks for parameters: `schedule`, `timedelta(days=1)`
- "Airflow 2" and "Airflow 3" (capitalized, no backticks)
- Use `checker.report_diagnostic(ViolationStruct, range)` — NOT `Diagnostic::new()`.
- Add `#[violation_metadata(preview_since = "NEXT_RUFF_VERSION")]` for new rules.
- Always guard with `checker.semantic().seen_module(Modules::AIRFLOW)`.
- For migration rules, use `is_guarded_by_try_except` to avoid false positives on conditional imports.
- For rules targeting both Airflow 2 and 3, match both old and new (`airflow.sdk`) import paths (see the mapping table above).
- Reuse over duplication: Before writing a utility function, search `ruff_python_semantic`, `ruff_python_ast`, and `airflow/helpers.rs` for existing implementations. If a pattern is useful in multiple rules, add it to `helpers.rs` rather than keeping it local to one rule.
- Follow early-return style (guard clauses) rather than deeply nested `if let` chains.
- Prefer let chains (`if let` combined with `&&`) over nested `if let` when possible.
- Avoid `panic!`, `unreachable!`, and `.unwrap()`.
- Use `#[expect()]` over `#[allow()]` for suppressing Clippy lints.
- For internal (non-public) functions, implementation notes (e.g., "this is similar to X but can't reuse it because...") should be `///` doc comments, not `//` comments.