
init

dbt-forge init scaffolds a new dbt project. Use it when you want a consistent starting structure instead of building folders and setup files by hand.

dbt-forge init [PROJECT_NAME] [--defaults] [--output PATH] [--dry-run] [--preset PATH] [--mesh]

The command:

  • collects configuration through interactive prompts or --defaults
  • scaffolds a dbt project directory with core files and optional setup files
  • writes a .dbt-forge.yml manifest to track generated files for future updates
  • prints the next local commands to run after the scaffold is written

PROJECT_NAME

Optional name for the generated dbt project.

  • When omitted in interactive mode, the CLI prompts for it.
  • Names are slugified to lowercase with underscores (e.g., My Project becomes my_project).

--defaults

Skip interactive prompts and use the default configuration:

Setting          Default value
Adapter          BigQuery
Marts            finance, marketing
Packages         dbt-utils, dbt-expectations
Example models   yes
SQLFluff config  yes
CI provider      GitHub Actions
Unit tests       no
MetricFlow       no
Snapshot         no
Seed             no
Exposure         no
Macro            no
Pre-commit       no
Env config       no
CODEOWNERS       no

--output PATH

Choose the directory where the project folder should be created.

dbt-forge init analytics_core --defaults --output ./sandbox

--dry-run

Show the files that would be written without creating anything on disk. Renders a tree view of the full project structure.

--preset PATH

Apply a preset YAML file to pre-fill or lock prompt selections. Accepts a local file path or an HTTPS URL.

dbt-forge init my_project --preset company-standard.yml
dbt-forge init my_project --preset https://example.com/presets/standard.yml

The preset is validated before prompts begin. If validation fails, init exits without scaffolding. See preset for the file format.

When combined with --defaults, preset values are applied but no prompts are shown. When used interactively, locked fields are skipped and default fields pre-populate the prompt selections.
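The preset file format is defined on the preset reference page. Purely as a hypothetical illustration of the locked-versus-default behavior described above (field names here are assumptions, not the documented schema), a company preset might look like:

```yaml
# Hypothetical preset sketch — see the preset reference for the real schema.
adapter: snowflake            # a locked field: its prompt is skipped entirely
packages:                     # a default field: pre-populates the prompt selection
  - dbt-utils
  - dbt-expectations
```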

--mesh

Scaffold a dbt Mesh (multi-project) setup instead of a single project. Creates multiple interconnected sub-projects, each with its own dbt_project.yml, models with access controls, and dependencies.yml for cross-project references.

dbt-forge init my_mesh --mesh
dbt-forge init my_mesh --mesh --defaults # preset: staging → transform → marts

When used interactively, prompts for:

  • Project name and adapter
  • Sub-project setup: preset layout (staging → transform → marts) or custom definitions
  • Per sub-project: name, purpose, upstream dependencies

Each sub-project gets:

  • dbt_project.yml — project configuration
  • models/_groups.yml — group definition for access control
  • dependencies.yml — cross-project refs (if upstream deps exist)
  • Example models with access levels: staging=protected, intermediate=private, marts=public
  • Public models get contract: { enforced: true }
  • profiles/profiles.yml — adapter-aware connection stub
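The generated contents vary by sub-project, but the access-control pieces use standard dbt syntax. A minimal sketch (group and model names are illustrative, not what the scaffold emits):

```yaml
# models/_groups.yml (sketch) — group definition used for access control
groups:
  - name: core_transform
    owner:
      name: Data Platform
      email: data-platform@example.com

# in a models properties file (sketch) — a public model with an enforced contract
models:
  - name: fct_orders
    access: public
    group: core_transform
    config:
      contract:
        enforced: true
    columns:
      - name: order_id
        data_type: integer
```

With contract: { enforced: true }, dbt requires each column's data_type to be declared, so downstream projects that ref() the public model get a stable, verified interface.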

The mesh root gets a README.md and a Makefile for orchestrated deps, build, test, and clean across all sub-projects.

After scaffolding, add new sub-projects with dbt-forge add project.

Interactive prompts

Without --defaults, init asks for the following in order:

Prompt                     Field            Generated files
Project name               project_name     Used as the directory name and in dbt_project.yml
Warehouse adapter          adapter          profiles/profiles.yml with adapter-specific config
Marts to scaffold          marts            models/marts/<name>/ and models/intermediate/<name>/ per mart
Starter packages           packages         Entries in packages.yml
Example models and tests   add_examples     models/staging/example_source/, tests/assert_positive_total_amount.sql, mart SQL/YAML
SQLFluff config            add_sqlfluff     .sqlfluff, .sqlfluffignore
CI providers               ci_providers     .github/workflows/dbt_ci.yml, .gitlab-ci.yml, or bitbucket-pipelines.yml
Unit test examples         add_unit_tests   tests/unit/test_stg_example.yml (only if examples enabled)
MetricFlow examples        add_metricflow   models/marts/semantic_models/sem_orders.yml
Example snapshot           add_snapshot     snapshots/example_snapshot.sql
Example seed               add_seed         seeds/example_seed.csv, seeds/_example_seed__seeds.yml
Example exposure           add_exposure     models/marts/__example__exposures.yml
Example macro              add_macro        macros/example_macro.sql
Pre-commit hooks           add_pre_commit   .pre-commit-config.yaml, .editorconfig
Environment config         add_env_config   .env.example, macros/generate_schema_name.sql
Team owner                 team_owner       CODEOWNERS with mart-based ownership mapping

Generated files

Every scaffold includes these core files:

  • dbt_project.yml — project configuration
  • pyproject.toml — Python dependencies (dbt adapter)
  • profiles/profiles.yml — adapter-aware connection profile using env_var()
  • packages.yml — selected dbt packages with pinned version ranges
  • selectors.yml — dbt selector definitions
  • .env — sets DBT_PROFILES_DIR=./profiles for local dbt commands
  • .gitignore — excludes target/, dbt_packages/, logs/, .env
  • README.md — project documentation with adapter-specific setup instructions
  • macros/README.md — placeholder for macro documentation
  • .dbt-forge.yml — manifest tracking generated files (used by dbt-forge update)
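The manifest schema is internal to dbt-forge; as a purely hypothetical illustration of what it tracks (keys here are assumptions, not the documented format):

```yaml
# .dbt-forge.yml — hypothetical shape; actual keys may differ
version: 1
project_name: analytics_core
adapter: bigquery
generated_files:
  - dbt_project.yml
  - profiles/profiles.yml
  - packages.yml
```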

Optional files depend on the prompts answered:

Feature                    Files generated
SQLFluff                   .sqlfluff, .sqlfluffignore
Pre-commit                 .pre-commit-config.yaml, .editorconfig
CI — GitHub Actions        .github/workflows/dbt_ci.yml
CI — GitLab CI             .gitlab-ci.yml
CI — Bitbucket Pipelines   bitbucket-pipelines.yml
Environment config         .env.example, macros/generate_schema_name.sql
CODEOWNERS                 CODEOWNERS
Examples                   models/staging/example_source/ (3 files), tests/assert_positive_total_amount.sql
Examples + mart            models/marts/<mart>/<mart>_orders.sql, models/marts/<mart>/__<mart>__models.yml, models/intermediate/<mart>/int_<mart>__orders_enriched.sql
Unit tests                 tests/unit/test_stg_example.yml
MetricFlow                 models/marts/semantic_models/sem_orders.yml
Snapshot                   snapshots/example_snapshot.sql
Seed                       seeds/example_seed.csv, seeds/_example_seed__seeds.yml
Exposure                   models/marts/__example__exposures.yml
Macro                      macros/example_macro.sql

Supported adapters

BigQuery, Snowflake, PostgreSQL, DuckDB, Databricks, Redshift, Trino, Spark.

Each adapter generates a different profiles/profiles.yml with the correct connection fields and env_var() references.
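For example, a BigQuery profile following standard dbt-bigquery configuration (the project name and environment variable names are illustrative, not necessarily what the scaffold emits) might look like:

```yaml
# profiles/profiles.yml — BigQuery sketch using env_var() for secrets
my_project:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      project: "{{ env_var('GCP_PROJECT') }}"
      dataset: "{{ env_var('BQ_DATASET', 'dbt_dev') }}"
      keyfile: "{{ env_var('GCP_KEYFILE_PATH') }}"
      threads: 4
```

Using env_var() keeps credentials out of version control; the second argument to env_var() supplies a fallback when the variable is unset.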

Notes

  • --dry-run resolves the full config and project path but does not write files or create the manifest.
  • The command always prints a banner and next-step hints for the generated project.
  • If the project directory already exists, files are written into it without deleting existing content.
  • Run dbt-forge --help for a quick overview of all available commands.