👤 Who is this for?

For IT Admins, BI Developers, and Platform Owners. This section covers CI/CD deployment patterns, Git integration, and step-by-step migration strategies from Synapse, ADF, Power BI Premium, and on-premises systems.

DevOps

Deployment Patterns

CI/CD, Git integration, and lifecycle management for Fabric artifacts.

Development Lifecycle

A well-structured deployment strategy ensures reliability and consistency across environments. Microsoft Fabric supports a full Dev → Test → Prod lifecycle with built-in tools.

Deployment Pipeline

💻 Develop: Code, build, and test in dev workspace
→
🔀 Git Commit: Push to feature branch, create pull request
→
✅ Test / QA: Validate in staging workspace
→
🚀 Production: Deploy via pipeline to prod workspace

Fabric Deployment Pipelines

Deployment pipelines are Fabric's built-in tool for managing the lifecycle of your artifacts (lakehouses, warehouses, reports, notebooks). They let you promote artifacts through stages with comparison and selective deployment.

Git Integration

Fabric supports direct integration with Azure DevOps and GitHub repositories. This enables version control, code review, and collaboration workflows for Fabric artifacts.
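Binding a workspace to a repository can likewise be automated. The sketch below targets the Fabric Git-integration REST API for an Azure DevOps repo; the endpoint path and the gitProviderDetails field names are assumptions to check against the official API reference.

```python
import json
import urllib.request

API_BASE = "https://api.fabric.microsoft.com/v1"  # assumed Fabric REST API base URL


def build_git_connect_request(workspace_id, org, project, repo, branch, directory="/"):
    """Assemble the URL and body for binding a workspace to an Azure DevOps repo.

    The endpoint path and gitProviderDetails field names are illustrative;
    confirm them against the official Git-integration API reference.
    """
    url = f"{API_BASE}/workspaces/{workspace_id}/git/connect"
    body = {
        "gitProviderDetails": {
            "gitProviderType": "AzureDevOps",
            "organizationName": org,
            "projectName": project,
            "repositoryName": repo,
            "branchName": branch,
            "directoryName": directory,
        }
    }
    return url, body


def connect_workspace_to_git(token, workspace_id, org, project, repo, branch):
    """POST the connect request using a bearer token (e.g. from azure.identity)."""
    url, body = build_git_connect_request(workspace_id, org, project, repo, branch)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```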

✅ Best Practice

Use Git integration for development (version control, branching, PRs) and deployment pipelines for promotion (Dev → Test → Prod). They complement each other.

Git Integration Setup

CI/CD Patterns

| Pattern | Description | Recommended For |
| --- | --- | --- |
| Pipeline-only | Use Fabric deployment pipelines exclusively | Small teams, simpler workflows |
| Git + Pipelines | Git for source control + pipelines for promotion | Most teams (recommended) |
| Full CI/CD | Azure DevOps / GitHub Actions + Fabric REST APIs | Enterprise teams with strict governance |
| Python CI/CD (fabric-cicd) | Code-first deployments with Python SDK + CI/CD pipelines | Teams wanting programmatic, deterministic deployments |

Python CI/CD with fabric-cicd

The fabric-cicd library is Microsoft's open-source Python SDK for automating Fabric workspace deployments. It provides deterministic, code-first deployments that integrate seamlessly with GitHub Actions and Azure DevOps pipelines.

Installation

Install via pip
pip install fabric-cicd

Key Capabilities

Deployment Example

Python โ€” Deploy Fabric items with fabric-cicd
from fabric_cicd import FabricWorkspace, publish_all_items
from azure.identity import ClientSecretCredential

# Authenticate with service principal
credential = ClientSecretCredential(
    tenant_id="your-tenant-id",
    client_id="your-client-id",
    client_secret="your-client-secret"
)

# Define the target workspace
workspace = FabricWorkspace(
    workspace_id="your-workspace-id",
    environment="PROD",                    # Maps to parameter.yml values
    repository_directory="./fabric-items", # Local repo directory with item definitions
    item_type_in_scope=["Notebook", "DataPipeline", "SemanticModel", "Report"],
    credential=credential
)

# Deploy all items
publish_all_items(workspace)

GitHub Actions Integration

YAML โ€” GitHub Actions workflow for Fabric deployment
name: Deploy to Fabric
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install fabric-cicd
        run: pip install fabric-cicd

      - name: Deploy to Production
        env:
          AZURE_TENANT_ID: ${{ secrets.AZURE_TENANT_ID }}
          AZURE_CLIENT_ID: ${{ secrets.AZURE_CLIENT_ID }}
          AZURE_CLIENT_SECRET: ${{ secrets.AZURE_CLIENT_SECRET }}
        run: python scripts/deploy.py --env PROD
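The workflow calls scripts/deploy.py, which you supply yourself. A minimal sketch of that script, reusing the FabricWorkspace call from the earlier example (the FABRIC_WORKSPACE_ID variable name is a hypothetical choice), might look like:

```python
"""Sketch of scripts/deploy.py referenced by the workflow above (user-supplied)."""
import argparse
import os


def parse_args(argv=None):
    """Parse the --env flag passed in the workflow's deploy step."""
    parser = argparse.ArgumentParser(description="Deploy Fabric items with fabric-cicd")
    parser.add_argument("--env", required=True, choices=["DEV", "TEST", "PROD"])
    return parser.parse_args(argv)


def main():
    args = parse_args()
    # Deferred imports keep argument parsing usable without the SDK installed.
    from azure.identity import ClientSecretCredential
    from fabric_cicd import FabricWorkspace, publish_all_items

    credential = ClientSecretCredential(
        tenant_id=os.environ["AZURE_TENANT_ID"],
        client_id=os.environ["AZURE_CLIENT_ID"],
        client_secret=os.environ["AZURE_CLIENT_SECRET"],
    )
    workspace = FabricWorkspace(
        workspace_id=os.environ["FABRIC_WORKSPACE_ID"],  # hypothetical variable name
        environment=args.env,  # selects the matching parameter.yml values
        repository_directory="./fabric-items",
        item_type_in_scope=["Notebook", "DataPipeline", "SemanticModel", "Report"],
        credential=credential,
    )
    publish_all_items(workspace)


# Only run a real deployment when credentials are configured.
if __name__ == "__main__" and "AZURE_TENANT_ID" in os.environ:
    main()
```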
✅ Best Practice

Use service principals (not user accounts) for CI/CD automation. Create a dedicated app registration in Entra ID, grant it workspace Contributor access, and store credentials as CI/CD secrets. Use parameter.yml to parameterize environment-specific values so the same codebase deploys to Dev, Test, and Prod.

Environment Parameterization

YAML โ€” parameter.yml example
find_replace:
  - find: "your-dev-lakehouse-id"
    replace_with:
      DEV: "dev-lakehouse-id"
      TEST: "test-lakehouse-id"
      PROD: "prod-lakehouse-id"
  - find: "your-dev-connection"
    replace_with:
      DEV: "dev-connection-string"
      TEST: "test-connection-string"
      PROD: "prod-connection-string"
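Conceptually, each rule is a token substitution applied to item definitions at publish time. The snippet below illustrates that mechanism in plain Python; it is an illustration of the behavior, not fabric-cicd's actual implementation:

```python
def apply_parameters(text, find_replace, environment):
    """Replace each `find` token with its per-environment value (mirrors the
    find_replace rules from parameter.yml)."""
    for rule in find_replace:
        text = text.replace(rule["find"], rule["replace_with"][environment])
    return text


rules = [
    {
        "find": "your-dev-lakehouse-id",
        "replace_with": {
            "DEV": "dev-lakehouse-id",
            "TEST": "test-lakehouse-id",
            "PROD": "prod-lakehouse-id",
        },
    },
]

notebook_source = 'lakehouse_id = "your-dev-lakehouse-id"'
print(apply_parameters(notebook_source, rules, "PROD"))
# prints: lakehouse_id = "prod-lakehouse-id"
```

Because substitution is keyed on the environment name, the same committed source deploys unchanged to Dev, Test, and Prod.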

Environment Separation

Migration

Migration Strategies

Moving from existing platforms to Microsoft Fabric with minimal risk.

Migration Decision Framework

Migration is not one-size-fits-all. Use a phased approach to reduce risk and ensure continuity:

🔄 Interactive Migration Path Recommender

Select your current platform to see a tailored, step-by-step migration path to Fabric.

What are you migrating from?
🔷 Azure Synapse Analytics: Dedicated SQL pools, Spark pools, Synapse pipelines
🔶 Azure Data Factory: ADF pipelines, linked services, data flows
📊 Power BI Premium (P-SKU): Existing P1–P5 capacity with reports & datasets
🧱 Databricks: Spark workspaces, Delta tables, MLflow
🏢 On-Premises (SQL Server / SSIS): SQL Server databases, SSIS packages, SSRS reports
Phased Migration Approach

🔍 Assess: Inventory current workloads, dependencies, costs
→
🧪 Pilot: Migrate 1–2 non-critical workloads to validate
→
🔄 Migrate: Iteratively move workloads in priority order
→
📈 Optimize: Tune performance, retire legacy services

Migration Paths

From Azure Synapse Analytics

From Azure Data Factory (ADF)

From Power BI Premium

From On-Premises (SQL Server / SSIS)

๐Ÿ› ๏ธ Official Migration Tools & Resources

Select your source platform to see the official Microsoft tools, step-by-step guidance, and current migration support status.

🔷 Azure Synapse Analytics (dedicated SQL pools, Spark pools, Synapse pipelines): Official Tools Available
🔶 Azure Data Factory (ADF pipelines, linked services, data flows): Migration Assistant Available
📊 Power BI Premium P-SKU (P1–P5 capacity with reports, datasets, dashboards): Guided Migration Path
🏢 On-Premises SQL / SSIS (SQL Server databases, SSIS packages, SSRS reports): Partial Tooling
🧱 Databricks (Spark workspaces, Delta tables, MLflow experiments): Manual Migration
❄️ Snowflake (Snowflake warehouses, stages, stored procedures): Mirroring + Manual
โš ๏ธ Common Migration Pitfalls

1. Doing a "big bang" migration: always migrate incrementally.
2. Ignoring data quality issues in the source: clean data before or during migration.
3. Not testing performance with production-scale data.
4. Forgetting to migrate security policies and access controls.
5. Underestimating change management: train your team!

📊 Looking for capacity sizing?

Capacity planning, SKU selection, cost optimization, and TCO/ROI calculator have moved to the Capacity Planning page.