YAML Pipelines with Power Platform

As low-code adoption grows, so does the need for structured deployment and governance. While Microsoft Power Platform makes building apps and portals simple, it’s crucial to manage them with the same rigor as traditional development.

That’s where YAML pipelines come in.

By combining Azure DevOps with YAML-defined CI/CD pipelines, you can automate the build, test, and deployment of Power Platform components—whether you’re working with Power Apps, Power Automate, Power Pages, or Dataverse solutions.

In this article, we’ll explore how to create, configure, and optimize YAML pipelines for Power Platform, using both Microsoft’s Power Platform Build Tools and PAC CLI (Power Platform CLI).


Why YAML for Power Platform?

Traditionally, DevOps workflows in Azure DevOps were built as classic pipelines in a visual designer. YAML (“YAML Ain’t Markup Language”) pipelines provide several advantages:

  • Code-based and version-controlled
  • Reusable and modular
  • Easy to replicate across environments
  • Better support for automation and governance
  • Integrates naturally with Git and pull requests

YAML is especially useful when combined with Power Platform CLI (PAC CLI) and solution lifecycle best practices.


Prerequisites

To follow along with YAML-based pipelines, ensure the following:

  • Azure DevOps project and repository
  • Power Platform environment(s): Dev, Test, Prod
  • A Service Principal (for authentication)
  • Power Platform Build Tools installed in Azure DevOps
  • Source-controlled solution files (decomposed if possible)
  • PAC CLI installed (for local testing)

Typical ALM Flow with YAML Pipelines

Your pipeline for Power Platform might include the following stages:

  1. Build: Unpack or pack a solution from source
  2. Validate: Run Solution Checker
  3. Deploy to Test: Import to staging/UAT
  4. Deploy to Production: Manual approval or scheduled release

Let’s walk through each with examples.
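
At a high level, those four stages map onto a skeleton like this (a sketch; the stage and job names are illustrative, and the echo placeholders stand in for the tasks covered in the sections below):

stages:
- stage: Build
  jobs:
  - job: ExportAndUnpack
    steps:
    - script: echo "Export and unpack the solution from source"
- stage: Validate
  dependsOn: Build
  jobs:
  - job: RunSolutionChecker
    steps:
    - script: echo "Run Solution Checker against the packed solution"
- stage: DeployTest
  dependsOn: Validate
  jobs:
  - job: ImportToTest
    steps:
    - script: echo "Import the solution to staging/UAT"
- stage: DeployProd
  dependsOn: DeployTest
  jobs:
  - job: ImportToProd
    steps:
    - script: echo "Import the solution to production after approval"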


1. YAML Pipeline Basics

Here’s a minimal YAML pipeline that runs on commits to the main branch:

trigger:
  branches:
    include:
      - main

pool:
  vmImage: 'windows-latest'

steps:
- task: PowerPlatformToolInstaller@0
- task: PowerPlatformExportSolution@0
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyPowerPlatformServiceConnection'
    solutionName: 'ContosoSolution'
    solutionOutputFile: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'

This pipeline installs the Power Platform tools on the agent and then exports a solution from your dev environment using the service connection.


2. Using Service Principal for Authentication

Create a Service Principal in Azure AD (Microsoft Entra ID) and register it in Azure DevOps as a Service Connection:

  • Go to Project Settings → Service Connections
  • Add new → Power Platform → Choose Service Principal
  • Provide client ID, secret, tenant ID, and environment URL

Reference this connection name in the PowerPlatformSPN input of your YAML tasks.
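
If you prefer to script the one-time setup, the Azure CLI can create the service principal (a sketch; the display name is an assumption, and the app must still be granted access to your Dataverse environments):

az ad sp create-for-rbac --name "PowerPlatform-ALM"

Record the appId (client ID), password (client secret), and tenant values from the output; these are exactly what the service connection asks for.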


3. Building Your Pipeline

Export, Unpack, and Source Control the Solution

steps:
- task: PowerPlatformExportSolution@0
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyPowerPlatformServiceConnection'
    solutionName: 'ContosoSolution'
    solutionOutputFile: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'
    managed: false # export as unmanaged for source control

- task: PowerPlatformUnpackSolution@0
  inputs:
    solutionInputFile: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'
    targetFolder: '$(Build.SourcesDirectory)/solutions/ContosoSolution'
    overwriteFiles: true

This unpacks the solution to a folder suitable for Git storage and collaboration.
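
If you want the pipeline to commit the unpacked files back to the repository automatically, a script step can do it (a sketch; it assumes the checkout step ran with persistCredentials: true, and the branch name and commit identity are illustrative):

- script: |
    git config user.email "pipeline@contoso.com"
    git config user.name "Azure Pipeline"
    git checkout -b solution-export/$(Build.BuildId)
    git add solutions/ContosoSolution
    git commit -m "Export ContosoSolution from dev" || echo "No changes to commit"
    git push origin solution-export/$(Build.BuildId)
  displayName: 'Commit unpacked solution to a branch'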


4. Solution Checker Integration

Integrate quality gates using Solution Checker:

- task: PowerPlatformChecker@0
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyPowerPlatformServiceConnection'
    FilesToAnalyze: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'
    ruleSet: '0ad12346-e108-40b8-a956-9a8f95ea18c9' # Default ruleset

You can block deployment if critical issues are found using pipeline conditions.
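
One way to wire up such a gate is a condition keyed off a variable set by an earlier step (a sketch; checkerPassed is a hypothetical variable you would populate by parsing the checker results):

- task: PowerPlatformImportSolution@0
  condition: and(succeeded(), eq(variables['checkerPassed'], 'true')) # skip the import unless the checker signalled success
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyPowerPlatformServiceConnection'
    solutionInputFile: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'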


5. Importing to Target Environments

To deploy to a UAT or production environment:

- task: PowerPlatformImportSolution@0
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyProdServiceConnection'
    solutionInputFile: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'
    overwriteUnmanagedCustomizations: true
    publishWorkflows: true

You can chain deployments using stages:

stages:
- stage: Build
  jobs:
    - job: ExportAndPack
- stage: DeployUAT
  dependsOn: Build
  jobs:
    - job: ImportToUAT
- stage: DeployProd
  dependsOn: DeployUAT
  condition: succeeded('DeployUAT')
  jobs:
    - job: ImportToProd

Use manual approvals in Azure DevOps to control production releases.
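
The natural place to attach those approvals is a deployment job targeting an Azure DevOps environment, since approvals and checks are configured on the environment itself (a sketch; the 'production' environment name is an assumption, and the artifact path assumes the publish step from section 9):

- stage: DeployProd
  dependsOn: DeployUAT
  jobs:
  - deployment: ImportToProd
    environment: 'production' # approvals/checks on this environment pause the stage until granted
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerPlatformToolInstaller@0
          - task: PowerPlatformImportSolution@0
            inputs:
              authenticationType: 'PowerPlatformSPN'
              PowerPlatformSPN: 'MyProdServiceConnection'
              solutionInputFile: '$(Pipeline.Workspace)/PowerPlatformBuild/ContosoSolution.zip'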


6. Using PAC CLI in YAML Pipelines

Sometimes, you need more flexibility than the built-in tasks. PAC CLI gives you more control:

- script: |
    pac auth create --url https://org.crm.dynamics.com --applicationId $(clientId) --clientSecret $(clientSecret) --tenant $(tenantId)
    pac solution import --path $(Build.ArtifactStagingDirectory)/ContosoSolution.zip
  displayName: 'Import Solution with PAC CLI'

Define secrets like clientId, clientSecret, and tenantId as pipeline variables or secret variables in your DevOps library.
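
For example, a variable group from the Library can be linked at the top of the pipeline (a sketch; the group name is an assumption):

variables:
- group: power-platform-secrets # holds clientId, tenantId, and clientSecret (marked secret)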


7. Power Pages and Portal Content in Pipelines

To manage Power Pages (portal) content:

a. Download from Dev Environment

- script: |
    pac auth create --url $(devUrl) --applicationId $(clientId) --clientSecret $(clientSecret) --tenant $(tenantId)
    pac paportal download --path ./MyPortal --webSiteId $(portalId)
  displayName: 'Download Power Pages'

b. Upload to Target Environment

- script: |
    pac auth create --url $(testUrl) --applicationId $(clientId) --clientSecret $(clientSecret) --tenant $(tenantId)
    pac paportal upload --path ./MyPortal
  displayName: 'Deploy Power Pages Content'

Include this alongside your solution deployments for end-to-end portal ALM.


8. Managing Data in Pipelines

You can also move configuration data (lookups, reference tables) between environments:

- script: |
    pac auth create --url $(devUrl) --applicationId $(clientId) --clientSecret $(clientSecret) --tenant $(tenantId)
    pac data export --schemaFile schema.xml --dataFile data.zip
    pac auth create --url $(uatUrl) --applicationId $(clientId) --clientSecret $(clientSecret) --tenant $(tenantId)
    pac data import --dataFile data.zip
  displayName: 'Move Configuration Data'

Use schema files, normally generated with the Configuration Migration tool, to control which tables and columns are included.
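
For orientation, a minimal sketch of the schema file’s shape looks like this (the table and column names are illustrative, and real files generated by the tool carry additional attributes):

<entities>
  <entity name="contoso_category" displayname="Category">
    <fields>
      <field displayname="Name" name="contoso_name" type="string" />
    </fields>
  </entity>
</entities>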


9. Using Artifacts for Pipeline Outputs

Store solution zip files, test results, and logs:

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'PowerPlatformBuild'
    publishLocation: 'Container'

This allows downstream stages or release pipelines to use the same artifacts.
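
In a later stage, the published artifact can be pulled back down before importing (a sketch using the download shortcut; the artifact name matches the publish step above):

- download: current
  artifact: PowerPlatformBuild

- task: PowerPlatformImportSolution@0
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyProdServiceConnection'
    solutionInputFile: '$(Pipeline.Workspace)/PowerPlatformBuild/ContosoSolution.zip'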


10. Add Quality Gates and Policies

You can add policies to prevent code from merging unless:

  • The build passes
  • Solution Checker has no critical issues
  • Approvals are granted (for prod)
  • Branch protection rules are satisfied

This makes low-code deployment as rigorous as any other engineering practice.


11. YAML Pipeline Best Practices

  • Modular pipelines: Break pipelines down into jobs or stages for flexibility
  • Secure secrets: Use DevOps secret variables or Azure Key Vault
  • Use templates: Create reusable YAML templates for different apps (see the sketch after this list)
  • Monitor results: Enable pipeline notifications and dashboards
  • Validate before deploy: Use pac solution check or custom test scripts
  • Track versions: Version your solutions and update metadata
  • Avoid manual steps: Automate everything except the final production approval
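
As an example of the template practice above, a reusable import step file and its usage might look like this (a sketch; the file path and parameter names are assumptions):

# templates/import-solution.yml
parameters:
- name: serviceConnection
  type: string
- name: solutionFile
  type: string

steps:
- task: PowerPlatformToolInstaller@0
- task: PowerPlatformImportSolution@0
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.serviceConnection }}
    solutionInputFile: ${{ parameters.solutionFile }}

A pipeline then consumes it with:

steps:
- template: templates/import-solution.yml
  parameters:
    serviceConnection: 'MyProdServiceConnection'
    solutionFile: '$(Build.ArtifactStagingDirectory)/ContosoSolution.zip'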

12. Real-World Example: Multi-Environment Power Platform Pipeline

Contoso manages a customer service app and portal with the following flow:

  1. Developers commit to the dev branch
  2. CI pipeline runs:
    • Exports solution from dev
    • Runs Solution Checker
    • Uploads portal changes
    • Publishes artifacts
  3. A QA pipeline picks up the artifacts and deploys to UAT
  4. After testing, an approver triggers the Prod pipeline
  5. All logs and versions are stored with Git commits

This automated flow ensures consistency, traceability, and security for every change.

