Automating Dataverse Table Backups using PowerShell

Backing up Dataverse tables is essential for data security, compliance, and disaster recovery. PowerShell automation helps schedule and perform backups efficiently.

This guide covers:

  • Connecting to Dataverse using PowerShell
  • Exporting table schema and data
  • Saving backups to local or cloud storage
  • Automating the backup process


Step 1: Prerequisites

1. Install Required PowerShell Modules

Ensure you have the necessary PowerShell modules installed:

# Install Power Platform module if not installed
Install-Module -Name Microsoft.PowerPlatform.Cds.Client -Scope CurrentUser -Force

# Import the module
Import-Module Microsoft.PowerPlatform.Cds.Client
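
Before continuing, it is worth a quick sanity check that the module actually installed, using the standard Get-Module cmdlet:

# Verify the module is available before proceeding
if (-not (Get-Module -ListAvailable -Name Microsoft.PowerPlatform.Cds.Client)) {
    Write-Error "Module not found. Run Install-Module first."
}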

2. Required Permissions

  • System Administrator or Dataverse Admin role is needed.
  • Ensure API access is enabled for Dataverse.

Step 2: Connecting to Dataverse

Option 1: Interactive Login

For manual execution, use:

# Connect to Dataverse interactively
$connection = Connect-CdsService -ConnectionString "AuthType=OAuth;Url=https://yourorg.crm.dynamics.com;Prompt=Login"
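
A failed login does not always throw immediately, so it helps to test the connection before querying. The check below assumes the returned connection object exposes an IsReady property, as the underlying Dataverse service client does; adjust the property name if your module version differs:

# Verify the connection succeeded (IsReady is an assumed property)
if (-not $connection.IsReady) {
    throw "Failed to connect to Dataverse."
}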

Option 2: Using Service Principal

For unattended automation, use an Azure AD app registration (service principal):

# Define credentials
$clientId = "your-app-client-id"
$clientSecret = "your-app-client-secret"
$tenantId = "your-tenant-id"
$orgUrl = "https://yourorg.crm.dynamics.com"

# Convert secret to secure string
$secureSecret = ConvertTo-SecureString $clientSecret -AsPlainText -Force

# Connect to Dataverse
$connection = Connect-CdsService -Url $orgUrl -ClientId $clientId -ClientSecret $secureSecret -TenantId $tenantId
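
Hard-coding the client secret in the script is risky. One option is to read it from an environment variable set on the machine that runs the scheduled task; the variable name DATAVERSE_CLIENT_SECRET below is just an example:

# Read the secret from an environment variable instead of hard-coding it
$clientSecret = $env:DATAVERSE_CLIENT_SECRET
$secureSecret = ConvertTo-SecureString $clientSecret -AsPlainText -Force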

Step 3: Exporting Dataverse Table Schema

To export the table schema for backup:

# Define table name
$tableName = "account" # Replace with your table name

# Retrieve schema details from the metadata ("entity") table
$schema = Get-CdsRecord -Connection $connection -EntityLogicalName "entity" -FilterAttribute "logicalname" -FilterOperator "eq" -FilterValue $tableName -Fields "logicalname", "displayname"

# Save schema to a JSON file
$schema | ConvertTo-Json | Out-File "C:\Backups\$tableName-Schema.json"

Write-Host "Schema backup completed: C:\Backups\$tableName-Schema.json"

The schema is now saved in JSON format.


Step 4: Exporting Dataverse Table Data

To export all records from a table:

# Define table name
$tableName = "account" # Replace with your table name

# Retrieve all records
$data = Get-CdsRecord -Connection $connection -EntityLogicalName $tableName

# Export data to CSV
$data | Export-Csv -Path "C:\Backups\$tableName-Data.csv" -NoTypeInformation

Write-Host "Data backup completed: C:\Backups\$tableName-Data.csv"

All records are now saved in a CSV file.
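
CSV flattens every value to a string. If you want to preserve the field values more faithfully, you can also export the same records as JSON using built-in cmdlets:

# Alternative: export records as JSON to preserve value structure
$data | ConvertTo-Json -Depth 5 | Out-File "C:\Backups\$tableName-Data.json"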


Step 5: Automating Backups on a Schedule

To automate backups, create a PowerShell script:

# Define parameters
$backupPath = "C:\Backups"
$tableName = "account"

# Ensure backup directory exists
if (!(Test-Path $backupPath)) {
    New-Item -ItemType Directory -Path $backupPath
}

# Retrieve schema
$schema = Get-CdsRecord -Connection $connection -EntityLogicalName "entity" -FilterAttribute "logicalname" -FilterOperator "eq" -FilterValue $tableName -Fields "logicalname", "displayname"
$schema | ConvertTo-Json | Out-File "$backupPath\$tableName-Schema.json"

# Retrieve and export data
$data = Get-CdsRecord -Connection $connection -EntityLogicalName $tableName
$data | Export-Csv -Path "$backupPath\$tableName-Data.csv" -NoTypeInformation

Write-Host "Backup completed for table: $tableName"

Scheduling the Backup Using Task Scheduler

1️⃣ Save the script as BackupDataverse.ps1 in C:\Scripts\.
2️⃣ Open Task Scheduler → Click Create Basic Task.
3️⃣ Set Trigger to run Daily or Weekly at a fixed time.
4️⃣ Set Action to Start a Program → Choose powershell.exe.
5️⃣ In Arguments, add:

-ExecutionPolicy Bypass -File "C:\Scripts\BackupDataverse.ps1"

This ensures automated backups without manual intervention.
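
Alternatively, you can register the same task from PowerShell itself using the built-in ScheduledTasks module (Windows only); the task name and 2:00 AM trigger below are examples:

# Register the backup script as a daily scheduled task
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-ExecutionPolicy Bypass -File "C:\Scripts\BackupDataverse.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "DataverseBackup" -Action $action -Trigger $trigger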


Step 6: Storing Backups in the Cloud (OneDrive/Azure Blob)

Option 1: OneDrive

# Define OneDrive backup path
$oneDrivePath = "C:\Users\YourUser\OneDrive\Backups"

# Copy files to OneDrive
Copy-Item -Path "C:\Backups\*" -Destination $oneDrivePath -Recurse -Force

Write-Host "Backup uploaded to OneDrive successfully!"

Option 2: Azure Blob Storage

To store backups in Azure Blob Storage, first install the Az.Storage module:

# Install Azure Storage module
Install-Module -Name Az.Storage -Scope CurrentUser -Force

Upload backup files to Azure Blob Storage:

# Define Azure Storage parameters
$storageAccountName = "yourstorageaccount"
$containerName = "dataversebackups"
$storageKey = "your-storage-key"
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey

# Upload backup file
Set-AzStorageBlobContent -Context $context -Container $containerName -File "C:\Backups\account-Data.csv" -Blob "account-Data.csv"

Write-Host "Backup uploaded to Azure successfully!"

This securely stores backups in Azure.
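
To upload every file in the backup folder rather than a single named file, loop over the directory contents:

# Upload every backup file in the folder
Get-ChildItem -Path "C:\Backups" -File | ForEach-Object {
    Set-AzStorageBlobContent -Context $context -Container $containerName `
        -File $_.FullName -Blob $_.Name -Force
}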


Step 7: Restoring Dataverse Table Data

If data loss occurs, restore backups using:

# Define table name
$tableName = "account"

# Import data from CSV
$importData = Import-Csv -Path "C:\Backups\$tableName-Data.csv"

# Insert records back into Dataverse
# Note: map every column you exported; "name" and "emailaddress" are examples
foreach ($record in $importData) {
    New-CdsRecord -Connection $connection -EntityLogicalName $tableName -Fields @{
        "name"         = $record.name
        "emailaddress" = $record.emailaddress
    }
}

Write-Host "Data restored successfully!"

This restores data from backups.
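
Bulk restores can fail partway through (throttling, malformed rows), so it is worth wrapping each insert in error handling and keeping a running count. A sketch, reusing the same New-CdsRecord call as above:

# Restore with basic error handling and a running count
$restored = 0
foreach ($record in $importData) {
    try {
        New-CdsRecord -Connection $connection -EntityLogicalName $tableName -Fields @{
            "name"         = $record.name
            "emailaddress" = $record.emailaddress
        }
        $restored++
    }
    catch {
        Write-Warning "Failed to restore record '$($record.name)': $_"
    }
}
Write-Host "Restored $restored of $($importData.Count) records."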
