Posted by Blog bot, 02.12.2022 14:17
Art Of Creation: D365 F&O: Schedule backups using DevOps pipelines and database movement API

Hi all,
We had a requirement to automatically create a backup of the database of a Tier 2 environment. Luckily there is the Database Movement API that allows you to do just that.

In his excellent blog post Automated backups of D365FO databases, Dick explains how to set this up using Release pipelines. We have done the same, but using (build) pipelines, so that’s what is documented below.

There are only minor differences. For example, we use an output variable to pass the token between tasks, but for the rest everything is pretty much the same. So thank you, Dick.

Creating the pipeline

  1. Go to your DevOps project > Pipelines > Pipelines, and click on New pipeline.
  2. Click on the version control system you are using; in my case it's Team Foundation Version Control.
  3. On the Select your repository page, click on Continue.
  4. On the Choose a template page, click on Empty job.
  5. At the top of the page, rename your pipeline to something meaningful, like Backup environments.
A new pipeline is created:

Configure variables

Before you start, you need to create an application registration and have a service account. See Step 1 through Step 3 on this page: Database movement API – Authentication.

After this is done, on your pipeline, click on the Variables tab. Add the following variables:

CLIENTID: The client ID from your application registration.
CLIENTSECRET: The client secret from your application registration.
GOLDEN: The Environment Id of the environment you want to create a backup for. You can find this on the Environment details page in LCS.
LCSPROJID: The project id, which you can find in the URL when you are on the project in LCS. It's an integer like 1234567.
USERNAME: The email address of the service account you will use for this operation.
PASSWORD: The password of the service account.

It should look something like this:

Don’t worry, all data is fake in the screenshot :).
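As a side note: if your project uses a Git repository (YAML pipelines are not available for TFVC), the same variables could be declared in a YAML pipeline definition. This is only a sketch with placeholder values; secrets like CLIENTSECRET and PASSWORD should be stored as secret pipeline variables or in a variable group, never in the YAML file itself:

variables:
  CLIENTID: '00000000-0000-0000-0000-000000000000'   # application (client) id, placeholder
  GOLDEN: '11111111-1111-1111-1111-111111111111'     # environment id from LCS, placeholder
  LCSPROJID: '1234567'                               # LCS project id
  USERNAME: 'svc-backup@contoso.com'                 # placeholder service account
  # CLIENTSECRET and PASSWORD: define as secret pipeline variables, not here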

Adding task Get token

  1. Next, go back to the Tasks tab, and click on the + button next to Agent job 1 to add a new task.
  2. Search for PowerShell and click on Add to add a PowerShell task.
  3. Click on your new task and name it Get token.
  4. Change the type to Inline and paste the following script:
$tokenUrl = ""     # the token endpoint URL was blank in the original post; see the authentication documentation
$tokenBody = @{
    grant_type    = "password"
    client_id     = "$(CLIENTID)"
    client_secret = "$(CLIENTSECRET)"
    resource      = ""     # the resource URL was blank in the original post
    username      = "$(USERNAME)"
    password      = "$(PASSWORD)"
}
$tokenResponse = Invoke-RestMethod -Method 'POST' -Uri $tokenUrl -Body $tokenBody
$token = $tokenResponse.access_token
Write-Host $token
# Expose the token as an output variable so the next task can use it
Write-Host "##vso[task.setvariable variable=TOKENOUT;isOutput=true]$token"

Next, on the Output Variables fast tab, set the reference name to task1.
It should look like this:

Add task Backup database Golden

Next we'll add the task that will perform a backup of our environment. Repeat the same steps as before to add the PowerShell task, but name it Backup database Golden.
Set the inline script to:

# Build a date-stamped backup name using the W. Europe Standard Time zone
$cstzone = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId((Get-Date), 'W. Europe Standard Time')
$filedate = Get-Date $cstzone -f "yyyy-MM-dd"
$BackupName = "Goldenbackup-$filedate"
Write-Output $BackupName
# The API base URL was stripped from the original post; prepend the
# Database Movement API export endpoint to this path
$refreshUrl = "$(LCSPROJID)/environment/$(GOLDEN)/backupName/$BackupName"
$refreshHeader = @{
    Authorization  = "Bearer $(task1.TOKENOUT)"
    "x-ms-version" = "2017-09-15"
    "Content-Type" = "application/json"
}
$refreshResponse = Invoke-RestMethod $refreshUrl -Method 'POST' -Headers $refreshHeader
Write-Output $refreshResponse

Our pipeline will look like this:
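For projects on Git, the classic pipeline above could also be sketched as a YAML pipeline. This is an illustration only: the agent image is an assumption, and the inline scripts are the ones shown earlier:

pool:
  vmImage: 'windows-latest'

steps:
- powershell: |
    # paste the Get token inline script here
  name: task1
  displayName: 'Get token'
- powershell: |
    # paste the Backup database Golden inline script here
  displayName: 'Backup database Golden'

In YAML, the step's name property plays the same role as the Output Variables reference name, so $(task1.TOKENOUT) resolves the same way.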

Setting up the schedule

On the Triggers tab, click on Add in the Scheduled section. Choose your schedule and clear the checkbox Only schedule builds if the source or pipeline has changed.
Click on Save & queue, then click on Save.
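In a YAML pipeline, the equivalent of this trigger is a cron schedule, where always: true corresponds to unchecking the Only schedule builds if the source or pipeline has changed checkbox. A sketch (the cron expression and branch name are assumptions):

schedules:
- cron: '0 2 * * *'        # every day at 02:00 UTC
  displayName: Daily backup of Golden
  branches:
    include:
    - main
  always: true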

All done

That’s it, your pipeline will create a backup for this environment based on your schedule. You should see your backup in the Asset library on LCS when the pipeline has run.


  1. You can only call the pipeline 3 times in a 24-hour period, so keep that in mind. This is documented on the page named Throttling.
  2. This is just a PowerShell script, so you can use the PowerShell ISE to test your scripts. Or if you have a different method of scheduling PowerShell scripts than DevOps pipelines, that will work too.
  3. The DevOps pipeline runs in an unpredictable region, so the Azure AD administrator might have to configure access policies to allow the account to sign in from there.
  4. You can also use the API to Create a database refresh.
