BACPAC import failure fix

Moving databases between a Microsoft-managed environment and a self-service machine is a common task. From Azure SQL we can only take a BACPAC export, which then needs to be imported and converted into Microsoft SQL Server format. With the recent security changes and improvements, you may be facing an error message when trying to move the database backups. Let’s have a look at the BACPAC import and export failure fix to address the following error message:

A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 – The certificate chain was issued by an authority that is not trusted.)

The tool used to process the BACPAC file is called SqlPackage. You should install the latest DAC version first to benefit from the latest fixes and performance improvements.
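
One convenient way to get the current SqlPackage release is to install it as a .NET global tool. This is a minimal sketch, assuming the .NET SDK is already present on the machine:

# Install SqlPackage as a .NET global tool (first-time install)
dotnet tool install -g microsoft.sqlpackage

# Or update an existing installation to the latest release
dotnet tool update -g microsoft.sqlpackage

# Check which version you ended up with
SqlPackage /Version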

Example of a BACPAC export from your MSSQL DB:

.\SqlPackage.exe /Action:Export /ssn:SourceServerName /sdn:AXDB_source /tf:F:\Backup\AXDB.bacpac /p:CommandTimeout=1200 /p:VerifyFullTextDocumentTypesSupported=false /SourceTrustServerCertificate:true

Example of a BACPAC import into your MSSQL DB:

.\SqlPackage.exe /Action:Import /sf:C:\DynamicsTools\Sandbox.bacpac /tsn:TargetServerName /tdn:AXDBname /p:CommandTimeout=0 /TargetTrustServerCertificate:true

As you can see, you now have to explicitly trust the server certificate to get past the error message above.

Archiving SQL database backups using Azure blob storage

It is a good practice to keep multiple copies of our most precious data. When you run on-premises SQL Server databases for AX 2012 or Dynamics 365 Finance and Operations, archiving SQL database backups to offsite locations is a must. I have built automation for archiving SQL database backups using Azure Blob Storage.

Overview of the processes

Maintenance regime

Our maintenance regime looks like the following:

  • 1x Weekly Full backup
  • 6x Daily Differential backups
  • Transaction log backups every 15 minutes

The backups are captured locally on the primary SQL instance, to keep the timestamps of the last successful backups in our AlwaysOn cluster. Then we move the files to a shared network storage location, which is visible to both High Availability sites, in case there is an outage and we need to fail over and restore data.
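
To illustrate the regime above, here is a minimal PowerShell sketch using the SqlServer module's Backup-SqlDatabase cmdlet; the instance name, database name, and backup path are placeholders, and the actual scheduling (SQL Server Agent, maintenance plans, or similar) is outside this sketch:

# Placeholders: SQL01 (instance), AXDB (database), F:\Backup (local backup folder)
Import-Module SqlServer

# Weekly full backup
Backup-SqlDatabase -ServerInstance "SQL01" -Database "AXDB" -BackupFile "F:\Backup\AXDB_full.bak"

# Daily differential backup
Backup-SqlDatabase -ServerInstance "SQL01" -Database "AXDB" -BackupFile "F:\Backup\AXDB_diff.bak" -Incremental

# Transaction log backup, run every 15 minutes
Backup-SqlDatabase -ServerInstance "SQL01" -Database "AXDB" -BackupFile "F:\Backup\AXDB_log.trn" -BackupAction Log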

Because of the close geographical proximity of the sites, we needed an additional layer of safety in case of a natural disaster.

Archiving offsite

Every night we run a PowerShell script that uses the AzCopy utility to upload our backup files to a Microsoft Azure cloud storage account.
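
The upload step could look something like the sketch below, using AzCopy v10 with a SAS token; the storage account name, container name, and local path are placeholders:

# Placeholders: mystorageacct (storage account), sqlbackups (container), <SAS-token>
$source      = "F:\Backup\*"
$destination = "https://mystorageacct.blob.core.windows.net/sqlbackups?<SAS-token>"

# --block-blob-tier=Cool lands the files directly in the cheaper cool tier
.\azcopy.exe copy $source $destination --recursive --block-blob-tier=Cool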

You are paying for the network usage (IO) and for the size occupied on the disks, so it is a good idea to have some sort of housekeeping. Our solution was to use an Azure Runbook to determine what to keep and what to delete. The current setup keeps one full backup per quarter available for a year (4x 400 GB), plus all full / differential / transaction log files for the last month (4x 400 GB + 26x 10 GB).
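
The Runbook logic could be sketched roughly like this with the Az.Storage module; the account and container names, file naming convention, and retention windows are placeholders, and the quarterly thinning of old full backups is left out to keep the sketch short:

# Housekeeping sketch for an Azure Automation Runbook
# Placeholders: mystorageacct (storage account), sqlbackups (container), *_full* naming convention
Connect-AzAccount -Identity          # sign in with the Automation account's managed identity
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount

foreach ($blob in Get-AzStorageBlob -Container "sqlbackups" -Context $ctx) {
    $ageDays = ((Get-Date).ToUniversalTime() - $blob.LastModified.UtcDateTime).TotalDays

    # Differential and log backups: keep only the last month
    if ($blob.Name -notlike "*_full*" -and $ageDays -gt 35) {
        Remove-AzStorageBlob -Blob $blob.Name -Container "sqlbackups" -Context $ctx -Force
    }

    # Full backups: keep for a year
    if ($blob.Name -like "*_full*" -and $ageDays -gt 366) {
        Remove-AzStorageBlob -Blob $blob.Name -Container "sqlbackups" -Context $ctx -Force
    }
}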

This puts the total size occupied at around 4 TB, which costs about 35 GBP a month using cold storage. The price goes up if you also want to keep the latest backup set on a hot tier, which is useful if you want to come back from the cloud on short notice.


Developer VM performance for MSDyn365FO

When you are used to snappy desktops or locally hosted virtual machines and suddenly need to move to the cloud, you would expect similar capabilities for a reasonable price. Unfortunately, that was not my experience when deploying AX 8.1 in the Azure cloud. This post is about setting expectations straight for MSDyn365FO developer VM performance hosted in the cloud vs. locally.

First of all, when you deploy your developer VM in the cloud, you have two options, Standard and Premium, which refer to the kind of storage that will be allocated to your machine. The default size is D13 (8 cores, 56 GB RAM) with regular rotating hard disks, and you need to use the DS-prefixed VM sizes for flash storage with memory-optimized compute. You can read about them in more detail on these pages on the Microsoft Docs site:

Premium disk tiers
Virtual machine sizing
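
If you want to check which memory-optimized, premium-storage-capable sizes are offered in your region before deploying, the Az PowerShell module gives a quick answer; the region below is just an example:

# List the DS13 family sizes available in a region (example region: westeurope)
Import-Module Az.Compute

Get-AzVMSize -Location "westeurope" |
    Where-Object { $_.Name -like "Standard_DS13*" } |
    Select-Object Name, NumberOfCores, MemoryInMB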

When it comes to premium performance, the virtual machine template that Microsoft has built for the Lifecycle Services deployment uses the following structure:

  • Operating System: P10, 500 IOPS, 25 MB/sec (50K reads)
  • Services and Data: P20, 2300 IOPS, 150 MB/sec (64K reads)
  • Temporary Build: P10, 500 IOPS, 25 MB/sec (50K reads)
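
If you want to verify what your own deployment received, you can inspect the managed disks behind the VM with the Az module; the resource group and VM names below are placeholders, and older deployments may still be using unmanaged VHDs instead:

# Placeholders: MyResourceGroup (resource group), MyDevVM (VM name)
Import-Module Az.Compute

Get-AzDisk -ResourceGroupName "MyResourceGroup" |
    Where-Object { $_.ManagedBy -like "*MyDevVM*" } |
    Select-Object Name, DiskSizeGB, @{ Name = "Sku"; Expression = { $_.Sku.Name } }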


Working with Entity Store and DIXF in AX

There is an excellent tool available for advanced reporting and BI capabilities that partners and customers should start exploring. In this post I would like to show how to work with the Entity Store and DIXF in AX, which was published shortly after the Technical Conference held in Seattle last year.

Working with the Entity Store and DIXF in AX takes some time to set up, but once it is running, it acts as an up-to-date data warehouse for you. There you can utilize advanced features such as Clustered Columnstore Indexes and build truly amazing visualizations that will please the eyes of your managers who love KPIs, and make your teams more productive by knowing what to look for.
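
To illustrate the columnstore part, the sketch below adds a clustered columnstore index to an Entity Store table through Invoke-Sqlcmd; the server, database, and table names are made up for the example, so substitute your own:

# Hypothetical example: compress an Entity Store table with a clustered columnstore index
Import-Module SqlServer

$query = @"
CREATE CLUSTERED COLUMNSTORE INDEX CCI_SalesEntity
    ON dbo.SalesEntity;   -- placeholder table name
"@

Invoke-Sqlcmd -ServerInstance "SQL01" -Database "AXDW" -Query $query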

As an example of what can be achieved with visualizations built on data coming out of your AX environment, you can find the public Power BI report built for stock and items at JJ Food Service.


