dbt-cloud-repo/ # github.com/myorg/dbt-cloud-repo
├── .github/
│ └── workflows/
│ └── dbt_cloud_deploy.yml # dbt Cloud CI/CD
├── dbt_project.yml
├── profiles.yml # (or managed in dbt Cloud UI)
├── packages.yml
├── models/
│ ├── staging/
│ │ ├── stg_customers.sql
│ │ └── stg_orders.sql
│ └── marts/
│ ├── dim_customers.sql
│ └── fct_orders.sql
├── macros/
│ └── generate_schema_name.sql
├── seeds/
│ └── country_codes.csv
├── tests/
│ └── assert_positive_revenue.sql
├── snapshots/
│ └── scd_customers.sql
└── README.md
dagster-cloud-repo/ # github.com/myorg/dagster-cloud-repo
├── .github/
│ └── workflows/
│ ├── dagster_cloud_deploy.yml # Dagster Cloud CI/CD
│ └── dagster_tests.yml # Dagster PR checks
├── my_dagster_project/
│ ├── pyproject.toml
│ └── my_dagster_project/
│ ├── __init__.py
│ └── defs/
│ ├── __init__.py
│ ├── ingestion/
│ │ ├── defs.yaml # e.g., Sling or Fivetran component
│ │ └── replication.yaml
│ ├── ml_models/
│ │ ├── train_model.py
│ │ └── serve_predictions.py
│ ├── exports/
│ │ └── export_customers.py
│ └── schedules/
│ └── daily_pipeline.py
├── dagster_cloud.yaml # Dagster Cloud location config
└── README.md
You have two options for which repo becomes the "home" repo:
| Strategy | When to Use |
|---|---|
| A. dbt repo absorbs Dagster | dbt Cloud repo is the primary codebase; Dagster is the newer addition |
| B. Dagster repo absorbs dbt | Dagster Cloud repo is the primary codebase; dbt models are being brought in |
Both produce the same final structure. This guide uses Strategy A (dbt repo as the base) since it is the more common scenario. For Strategy B, simply reverse the roles.
This moves the Dagster project into the dbt Cloud repo while preserving full git history for both repos.
# 1. Clone the dbt Cloud repo (the base repo)
git clone git@github.com:myorg/dbt-cloud-repo.git merged-repo
cd merged-repo
# 2. Add the Dagster repo as a remote
git remote add dagster-origin git@github.com:myorg/dagster-cloud-repo.git
git fetch dagster-origin
# 3. Merge the Dagster repo into the dbt repo (allow unrelated histories)
git merge dagster-origin/main --allow-unrelated-histories --no-edit
If there are merge conflicts (e.g., both repos have a README.md or .github/ files), resolve them manually:
# Common conflicts to expect:
# - README.md → combine both into one
# - .github/workflows/ → keep both sets of workflows (no overlap)
# - .gitignore → merge entries from both
# After resolving:
git add .
git commit -m "Merge dagster-cloud-repo into dbt-cloud-repo"
Clean up the remote:
git remote remove dagster-origin
After the merge, both projects' files exist at the repo root. Move the dbt project into its own subdirectory so the repo has clear top-level boundaries:
# Create a subdirectory for the dbt project
mkdir dbt_project
# Move all dbt-specific files into it
git mv dbt_project.yml dbt_project/
git mv profiles.yml dbt_project/ 2>/dev/null || true # may not exist if managed by dbt Cloud
git mv packages.yml dbt_project/ 2>/dev/null || true
git mv models/ dbt_project/
git mv macros/ dbt_project/
git mv seeds/ dbt_project/
git mv tests/ dbt_project/
git mv snapshots/ dbt_project/
git mv analyses/ dbt_project/ 2>/dev/null || true
git commit -m "Move dbt project files into dbt_project/ subdirectory"
The repo now looks like this:
merged-repo/
├── .github/
│ └── workflows/
│ ├── dbt_cloud_deploy.yml # from dbt Cloud repo (update paths — see Step 5)
│ ├── dagster_cloud_deploy.yml # from Dagster Cloud repo
│ └── dagster_tests.yml # from Dagster Cloud repo
│
├── dbt_project/ # ── dbt Cloud project ──
│ ├── dbt_project.yml
│ ├── profiles.yml
│ ├── packages.yml
│ ├── models/
│ │ ├── staging/
│ │ │ ├── stg_customers.sql
│ │ │ └── stg_orders.sql
│ │ └── marts/
│ │ ├── dim_customers.sql
│ │ └── fct_orders.sql
│ ├── macros/
│ │ └── generate_schema_name.sql
│ ├── seeds/
│ │ └── country_codes.csv
│ ├── tests/
│ │ └── assert_positive_revenue.sql
│ └── snapshots/
│ └── scd_customers.sql
│
├── my_dagster_project/ # ── Dagster Cloud project ──
│ ├── pyproject.toml
│ └── my_dagster_project/
│ ├── __init__.py
│ └── defs/
│ ├── __init__.py
│ ├── ingestion/
│ │ ├── defs.yaml
│ │ └── replication.yaml
│ ├── ml_models/
│ │ ├── train_model.py
│ │ └── serve_predictions.py
│ ├── exports/
│ │ └── export_customers.py
│ └── schedules/
│ └── daily_pipeline.py
│
├── dagster_cloud.yaml # Dagster Cloud location config
├── .gitignore
└── README.md
Since dbt files moved from the repo root into dbt_project/, update dbt Cloud:
- Go to Account Settings > Projects > your project
- Change Project Subdirectory to dbt_project
This tells dbt Cloud to look for dbt_project.yml inside that folder.
Edit .github/workflows/dbt_cloud_deploy.yml to reflect the new paths:
on:
push:
branches: [main]
paths:
- "dbt_project/models/**"
- "dbt_project/macros/**"
- "dbt_project/seeds/**"
- "dbt_project/snapshots/**"
- "dbt_project/tests/**"
- "dbt_project/analyses/**"
- "dbt_project/dbt_project.yml"
- "dbt_project/packages.yml"
pull_request:
branches: [main]
paths:
- "dbt_project/models/**"
- "dbt_project/macros/**"
- "dbt_project/seeds/**"
- "dbt_project/snapshots/**"
- "dbt_project/tests/**"
- "dbt_project/analyses/**"
- "dbt_project/dbt_project.yml"
- "dbt_project/packages.yml"
Edit dagster_cloud.yaml to ensure the code location points to the correct directory:
locations:
- location_name: my-dagster-project
code_source:
package_name: my_dagster_project
working_directory: my_dagster_project
Verify everything locally before pushing:
cd my_dagster_project
# Verify Dagster can load all definitions (including dbt models)
dg list defs
# Launch the Dagster UI locally
dg dev
Check that:
- All existing Dagster assets still appear
- dbt models appear as Dagster assets (via Option A or B)
- Dependency lineage connects dbt models to downstream assets
- Schedules and sensors are intact
The final merged structure:
merged-repo/
├── .github/
│ └── workflows/
│ ├── dbt_cloud_deploy.yml # dbt Cloud CI/CD (paths updated)
│ ├── dagster_cloud_deploy.yml # Dagster Cloud CI/CD
│ └── dagster_tests.yml # Dagster PR checks
│
├── dbt_project/ # ── dbt Cloud project ──
│ ├── dbt_project.yml
│ ├── profiles.yml
│ ├── packages.yml
│ ├── models/
│ │ ├── staging/
│ │ │ ├── stg_customers.sql
│ │ │ └── stg_orders.sql
│ │ └── marts/
│ │ ├── dim_customers.sql
│ │ └── fct_orders.sql
│ ├── macros/
│ │ └── generate_schema_name.sql
│ ├── seeds/
│ │ └── country_codes.csv
│ ├── tests/
│ │ └── assert_positive_revenue.sql
│ └── snapshots/
│ └── scd_customers.sql
│
├── my_dagster_project/ # ── Dagster Cloud project ──
│ ├── pyproject.toml
│ └── my_dagster_project/
│ ├── __init__.py
│ └── defs/
│ ├── __init__.py
│ ├── my_dbt/ # dbt integration (Option A)
│ │ └── defs.yaml
│ ├── dbt_cloud_assets.py # OR dbt Cloud API (Option B)
│ ├── ingestion/
│ │ ├── defs.yaml
│ │ └── replication.yaml
│ ├── ml_models/
│ │ ├── train_model.py
│ │ └── serve_predictions.py
│ ├── exports/
│ │ └── export_customers.py
│ └── schedules/
│ └── daily_pipeline.py
│
├── dagster_cloud.yaml
├── .gitignore
└── README.md
The Dagster Cloud workflow, .github/workflows/dagster_cloud_deploy.yml:
name: "Dagster Cloud — Deploy"
on:
push:
branches: [main]
paths:
# Only trigger when Dagster project files change
- "my_dagster_project/**"
- "dagster_cloud.yaml"
pull_request:
branches: [main]
paths:
- "my_dagster_project/**"
- "dagster_cloud.yaml"
env:
DAGSTER_CLOUD_ORGANIZATION: ${{ secrets.DAGSTER_CLOUD_ORGANIZATION }}
DAGSTER_CLOUD_API_TOKEN: ${{ secrets.DAGSTER_CLOUD_API_TOKEN }}
# Dagster project location within the repo
DAGSTER_PROJECT_DIR: "my_dagster_project"
jobs:
# --------------------------------------------------------------------------
# Job 1: Test Dagster definitions load (runs on PRs and pushes)
# --------------------------------------------------------------------------
dagster-validate:
name: "Validate Dagster definitions"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.12"
- name: Install uv
uses: astral-sh/setup-uv@v4
- name: Install dependencies
working-directory: ${{ env.DAGSTER_PROJECT_DIR }}
run: uv sync
- name: Compile dbt manifest (if using colocated dbt)
working-directory: ${{ env.DAGSTER_PROJECT_DIR }}
env:
# Pass warehouse credentials for dbt parse (manifest compilation)
SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
run: |
# Only needed if using DbtProjectComponent (Option A).
# Skip this step if using dbt Cloud API integration (Option B).
uv run dg utils refresh-defs-state || echo "No dbt state to refresh (using dbt Cloud API mode)."
- name: Validate definitions load
working-directory: ${{ env.DAGSTER_PROJECT_DIR }}
run: uv run dg list defs
- name: Run Dagster asset checks (dry run)
working-directory: ${{ env.DAGSTER_PROJECT_DIR }}
run: |
uv run python -c "
from dagster import Definitions
from my_dagster_project.defs import defs
print(f'Loaded {len(list(defs.get_asset_graph().all_asset_keys))} assets')
print('All definitions validated successfully.')
"
# --------------------------------------------------------------------------
# Job 2: Run Python tests for Dagster code (runs on PRs)
# --------------------------------------------------------------------------
dagster-tests:
name: "Run Dagster tests"
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.12"
- name: Install uv
uses: astral-sh/setup-uv@v4
- name: Install dependencies (with test extras)
working-directory: ${{ env.DAGSTER_PROJECT_DIR }}
run: uv sync --extra dev
- name: Run tests
working-directory: ${{ env.DAGSTER_PROJECT_DIR }}
run: uv run pytest tests/ -v
# --------------------------------------------------------------------------
# Job 3: Deploy to Dagster Cloud — Branch Deployment (PRs)
# --------------------------------------------------------------------------
dagster-branch-deploy:
name: "Dagster Branch Deployment"
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'
needs: [dagster-validate]
steps:
- uses: actions/checkout@v4
- name: Deploy to Dagster Cloud (branch)
uses: dagster-io/dagster-cloud-action/actions/serverless_branch_deploy@v2
with:
dagster-cloud-api-token: ${{ secrets.DAGSTER_CLOUD_API_TOKEN }}
organization: ${{ secrets.DAGSTER_CLOUD_ORGANIZATION }}
location-name: "my-dagster-project"
deployment: "prod"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# --------------------------------------------------------------------------
# Job 4: Deploy to Dagster Cloud — Production (merge to main)
# --------------------------------------------------------------------------
dagster-prod-deploy:
name: "Dagster Production Deploy"
runs-on: ubuntu-latest
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
needs: [dagster-validate]
steps:
- uses: actions/checkout@v4
- name: Deploy to Dagster Cloud (production)
uses: dagster-io/dagster-cloud-action/actions/serverless_prod_deploy@v2
with:
dagster-cloud-api-token: ${{ secrets.DAGSTER_CLOUD_API_TOKEN }}
organization: ${{ secrets.DAGSTER_CLOUD_ORGANIZATION }}
location-name: "my-dagster-project"
deployment: "prod"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
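With both projects in one repo, it is worth sanity-checking that a given file change triggers only the intended workflow. A rough sketch of the two paths: filters as shell case globs (the dbt workflow's list is collapsed to a single dbt_project/* pattern for brevity; GitHub's own filter matching is the source of truth):

```shell
#!/bin/sh
# Rough approximation of the two workflows' `paths:` filters (sketch only).
set -e

triggers_dbt() {
  # Stands in for the dbt_project/** patterns in dbt_cloud_deploy.yml
  case "$1" in
    dbt_project/*) return 0 ;;
  esac
  return 1
}

triggers_dagster() {
  # Stands in for the paths list in dagster_cloud_deploy.yml
  case "$1" in
    my_dagster_project/*|dagster_cloud.yaml) return 0 ;;
  esac
  return 1
}

for p in \
  dbt_project/models/marts/fct_orders.sql \
  my_dagster_project/pyproject.toml \
  dagster_cloud.yaml \
  README.md
do
  dbt=no; dagster=no
  triggers_dbt "$p" && dbt=yes || true
  triggers_dagster "$p" && dagster=yes || true
  printf '%-45s dbt=%s dagster=%s\n' "$p" "$dbt" "$dagster"
done
```

Running it prints, for each sample path, which pipeline would fire; a README.md-only change matches neither filter, which is the desired non-overlap between the two deploys.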