Dynamics 365

Dynamics Virtual Community Summit

This year the usual AXUG Summit Europe has been postponed due to the coronavirus pandemic, and is instead taking place online this week as the Dynamics Virtual Community Summit. You can join many interesting sessions via the conference website.

Dynamics Virtual Community Summit

I will be participating in 3 sessions:

  1. Pushing the Limits of D365FO – An Enterprise-scale Data Upgrade
    2020-07-01 11:00-12:00 CET

    How do you decide between a green-field implementation and an upgrade? What gold are you leaving behind if you do not get your data into the cloud? JJ Food Service has been on a journey upgrading their AX 2012 R3 environment to Dynamics 365 for Finance and Operations, with 15 years of historical data and a 2 TB database, rolled forward since the beginning of time. Join our technical expert, Vilmos Kintera (Business Applications MVP), in this session to get first-hand experience of the challenges, tasks and logistics involved in becoming cloud-ready.

    This is a slightly updated version of the webinar available on the Dynamics Zero-to-Hero YouTube channel.

  2. What is the Latest in D365FO Application Development?
    2020-07-02 12:15-13:15 CET

    Learn about the latest and greatest changes around development and Application Lifecycle Management for Microsoft Dynamics 365 Finance and Operations / Supply Chain Management. We will discuss recent and upcoming changes.

  3. Ask the Experts: A Roundtable for Finance and Operations Questions and Concerns
    2020-07-03 15:30-16:00 CET

    Join us for this roundtable session with Finance and Operations experts to discuss and answer questions that deal with your most pressing issues. Some topics for conversation may include: local roll-outs in various countries, data migration considerations, and more!

Interpreting compiler results in D365FO using PowerShell

When you build your code, the results are hard to interpret and are capped at 1,000 entries per category in the Visual Studio error pane. The compiler does, however, generate output files with more valuable content within each package. We have written PowerShell code for analyzing and interpreting the compiler results of Microsoft Dynamics 365 for Finance and Operations in a more meaningful way.

The BuildModelResult.log and XML files have the details of your errors, warnings and tasks. Running the following script parses these files and counts the warnings and errors, to give a better idea of the remaining work during your implementation or upgrade:

$displayErrorsOnly = $false # $true # $false
$rootDirectory = "C:\AOSService\PackagesLocalDirectory"

$results = Get-ChildItem -Path $rootDirectory -Filter BuildModelResult.log -Recurse -Depth 1 -ErrorAction SilentlyContinue -Force
$totalErrors = 0
$totalWarnings = 0

foreach ($result in $results)
{
    try
    {
        $errorText = Select-String -LiteralPath $result.FullName -Pattern '^Errors:' | ForEach-Object {$_.Line}
        $errorCount = [int]$errorText.Split()[-1]
        $totalErrors += $errorCount

        $warningText = Select-String -LiteralPath $result.FullName -Pattern '^Warnings:' | ForEach-Object {$_.Line}
        $warningCount = [int]$warningText.Split()[-1]
        $totalWarnings += $warningCount

        if ($displayErrorsOnly -eq $true -and $errorCount -eq 0)
        {
            continue
        }

        Write-Host "$($result.DirectoryName)\$($result.Name) " -NoNewline
        if ($errorCount -gt 0)
        {
            Write-Host " $errorText" -NoNewline -ForegroundColor Red
        }
        if ($warningCount -gt 0)
        {
            Write-Host " $warningText" -ForegroundColor Yellow
        }
        else
        {
            Write-Host
        }
    }
    catch
    {
        Write-Host
        Write-Host "Error during processing $($result.FullName): $($_.Exception.Message)"
    }
}

Write-Host "Total Errors: $totalErrors" -ForegroundColor Red
Write-Host "Total Warnings: $totalWarnings" -ForegroundColor Yellow

The compiler results are displayed in the following format as an overview:

compiler results

If you want to do a detailed analysis, we also have PowerShell scripts prepared for extracting the errors and saving them to a CSV file for further processing. Opening it in Excel allows you to format the results into a table.

We typically copy all error texts to a new sheet, remove the duplicate entries, then count the occurrences of each to find the top-ranking errors and see if there is any low-hanging fruit to fix.

=SUMPRODUCT(('JADOperation-Warning'!$D$2:$D$1721=Sheet1!A2)+0)
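The same deduplicate-and-count analysis can also be done directly in PowerShell. The sketch below is a hypothetical illustration, not one of the actual GitHub scripts: it assumes that detail lines in BuildModelResult.log start with "Error", and the output path is a placeholder.

```powershell
# Hypothetical sketch: rank the most frequent compiler errors without Excel.
# The "^Error" line format and file locations are assumptions, not taken
# from the published GitHub scripts.
$rootDirectory = "C:\AOSService\PackagesLocalDirectory"

Get-ChildItem -Path $rootDirectory -Filter BuildModelResult.log -Recurse -Depth 1 -ErrorAction SilentlyContinue |
    Select-String -Pattern '^Error' |
    ForEach-Object { $_.Line } |
    Group-Object |
    Sort-Object Count -Descending |
    Select-Object -First 20 Count, Name |
    Export-Csv -Path "$env:TEMP\TopCompilerErrors.csv" -NoTypeInformation
```

Grouping on the raw error text plays the same role as the duplicate removal plus SUMPRODUCT count in Excel.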

This makes it easy to decide what to fix first and what next, and tasks are easier to delegate this way.

The source code for all four scripts is available on GitHub.

Archiving SQL database backups using Azure blob storage

It is good practice to keep multiple copies of our most precious data. If you use on-premises SQL Server databases for AX 2012 or Dynamics 365 Finance and Operations, archiving SQL database backups to offsite locations is a must. I have built automation for archiving SQL database backups using Azure Blob Storage.

Overview of the processes

Maintenance regime

Our maintenance regime looks like the following:

  • 1x weekly full backup
  • 6x daily differential backups
  • Transaction log backups every 15 minutes

They are captured locally on the primary SQL instance, to keep the timestamps for the last successful backups in our AlwaysOn cluster. Then we move the files to shared network storage, which is visible to both High Availability sites, in case there is an outage and we need to fail over and restore data.
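As a rough sketch, the regime above could be scripted with the SqlServer PowerShell module's Backup-SqlDatabase cmdlet. The instance, database, and path names below are placeholders, and this is an illustration of the approach rather than our actual maintenance jobs:

```powershell
# Hedged sketch of the backup regime using Backup-SqlDatabase from the
# SqlServer module. Instance, database and share names are placeholders.
Import-Module SqlServer

$instance  = "SQLPRIMARY01"        # placeholder: primary AlwaysOn replica
$database  = "AXDB"                # placeholder: AX / D365FO database
$backupDir = "\\backupshare\SQL"   # placeholder: shared network storage

# Weekly full backup
Backup-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "$backupDir\$database-Full.bak"

# Daily differential backup
Backup-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "$backupDir\$database-Diff.bak" -Incremental

# Transaction log backup, scheduled every 15 minutes
Backup-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "$backupDir\$database-Log.trn" -BackupAction Log
```

In practice each of these would run as a scheduled SQL Agent job or task, not from a single script.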

Because of the close geographical proximity of the sites, we needed an additional layer of safety in case of a natural disaster.

Archiving offsite

Every night we run a PowerShell script that uses the AzCopy utility to upload our backup files to a Microsoft Azure cloud storage account.
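A minimal sketch of such a nightly upload, assuming AzCopy v10 syntax, is shown below. The tool path, storage account, container, and share names are placeholders, and the SAS token is deliberately left as a stand-in:

```powershell
# Hedged sketch of the nightly AzCopy upload. The storage account, container
# and paths are placeholders; <SAS-token> must be generated for your account.
$azCopy      = "C:\Tools\azcopy.exe"    # placeholder path to AzCopy v10
$source      = "\\backupshare\SQL"      # placeholder backup share
$destination = "https://mystorageacct.blob.core.windows.net/sqlbackups?<SAS-token>"

# Upload new and changed backup files recursively
& $azCopy copy $source $destination --recursive=true
```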

You pay for network usage (I/O) and the space occupied on the disks, so some housekeeping is a good idea. Our solution uses an Azure Runbook to determine what to keep and what to delete. The current setup keeps one full backup per quarter available for a year (4x 400 GB), plus all full / differential / transaction log files for the last month (4x 400 GB + 26x 10 GB).
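The Runbook's housekeeping logic could be sketched with the Az.Storage module along the following lines. The container name, the 31-day window, and the "-Full" file-naming convention are all assumptions for illustration, not the actual Runbook:

```powershell
# Hedged sketch of housekeeping: delete blobs older than a month, except
# full backups that are retained as longer-term archives. Container name
# and the "-Full" naming convention are assumptions.
Import-Module Az.Storage

$context = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
$cutoff  = (Get-Date).AddDays(-31)

Get-AzStorageBlob -Container "sqlbackups" -Context $context |
    Where-Object { $_.LastModified -lt $cutoff -and $_.Name -notlike "*-Full*" } |
    ForEach-Object {
        Remove-AzStorageBlob -Blob $_.Name -Container "sqlbackups" -Context $context
    }
```

A real implementation would also need to thin the retained full backups down to one per quarter; that pruning step is omitted here.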

This puts the total size occupied at around 4 TB, which costs about 35 GBP a month using cold storage. The price goes up if you also want to use hot-tier storage for the latest backup set, which is useful if you need to restore from the cloud on short notice.


Dynamics Zero to Hero introduction vlog

As announced last week, the Dynamics Zero to Hero channel is now up and running on YouTube. A brief introduction video is up, with more to follow in the coming days and weeks:

Being nervous does show, and it does not help with my accent :)
I am sure the content will make up for the lack of vlogging experience in the long run; hopefully you will like it.

If you have topics you would like me to touch on, feel free to leave a comment or send an e-mail.

Plans for 2020 and going forward

It has been a quiet couple of months, as the community has probably noticed from the lower number of blog posts and forum answers. Working on our MSDyn365FO upgrade at JJ Food Service has been a demanding task, but progress is steady. Now it is time to make plans for 2020.

Soon I can get back to publishing blog articles on a regular basis. Not just that, but the big announcement is: DAXRunBase is going online with Video Logs!

Plans for 2020 - vlog

Make sure you bookmark and subscribe to the Dynamics Zero to Hero channel.

