#MSDyn365FO

App Checker BaseX error

The Microsoft Dynamics Application Checker is an excellent tool, which utilizes the BaseX XML-parsing software. Since all our objects and source code for Microsoft Dynamics 365 are represented as XML files, this presents an excellent opportunity to do XPath parsing. You need to install the BaseX software first as a prerequisite, before you can include it in your build routine. We have seen an App Checker BaseX error like this:

Error: Could not create the Java Virtual Machine. Error: A fatal exception has occurred. Program will exit. Invalid maximum heap size: -Xmx15000m The specified size exceeds the maximum representable size.

When you deploy BaseX, it is recommended to increase the usable memory size so that the entire Dynamics repository fits. Increasing the memory allocation beyond what a 32-bit Java installation can address results in the above error message. The solution is to install the 64-bit Java Runtime Environment, which makes the App Checker BaseX error disappear.
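As a quick sketch of the fix, assuming your BaseX startup scripts read extra JVM options from the BASEX_JVM environment variable and that the heap size suits your repository, you can verify the Java bitness and hand over the larger heap like this:

# Check whether the Java runtime on the PATH is 64-bit; a 32-bit JVM
# cannot address a 15 GB heap and throws the error shown above.
java -version 2>&1 | Select-String "64-Bit"

# Example only: pass the larger heap to BaseX through the BASEX_JVM
# environment variable; adjust the size to fit your repository.
[Environment]::SetEnvironmentVariable("BASEX_JVM", "-Xmx15000m", "Machine")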

You can learn more about the tool here:

https://github.com/microsoft/Dynamics365FO-AppChecker

I have also blogged before about how to write custom scripts for traversing code in BaseX. Have a look at it here.


Incorrect auto upgrade on custom methods

During our upgrade journey from AX 2012 to Microsoft Dynamics 365 Finance and Operations we noticed some problems. The automated code upgrade tool available on Lifecycle Services did an incorrect auto upgrade on custom methods in our model store.

When we did the first builds, the following error message was showing up for various code pieces:

Compile Fatal Error: Table dynamics://Table/CCMTmpQuestions: [(39,5),(39,6)]: Unexpected token '/' specified outside the scope of any class or model element.

After checking the source code we identified that the tool had somehow inserted extra lines at the end of some table methods, containing a single / character:

[Screenshot: incorrect auto upgrade on custom methods]

The quickest way to locate all those methods is to use our favorite file manager and search tool, Total Commander.

Press <ALT>+<F7> and do a file search for *.xml in your \AOSService\PackagesLocalDirectory\[YourPackageName] folder using the following regular expression value. You must tick the RegEx checkbox:

(^    \/)$

This revealed all the incorrect files, which we could then fix in bulk. You can use a similar approach to quickly find anything in the file-based Dynamics 365 Finance and Operations code repository.

Once you quick-replace these characters with a blank line, you are done fixing the incorrect auto upgrade on custom methods.
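If you prefer scripting the cleanup instead of doing it in Total Commander, a small PowerShell sketch along these lines does the same search and bulk fix (the package path is an example, and the file encoding is assumed to be UTF-8):

# Find XML files containing a line of four spaces and a single "/",
# then replace those lines with a blank line, matching the manual fix above.
$package = "C:\AOSService\PackagesLocalDirectory\YourPackageName"   # example path

Get-ChildItem -Path $package -Filter *.xml -Recurse | ForEach-Object {
    $lines = Get-Content -LiteralPath $_.FullName
    if ($lines -match '^    /$')
    {
        $lines -replace '^    /$', '' | Set-Content -LiteralPath $_.FullName -Encoding UTF8
        Write-Host "Fixed $($_.FullName)"
    }
}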

Interpreting compiler results in D365FO using PowerShell

When you build your code, the results are hard to interpret and are capped at 1,000 entries per category within the Visual Studio error pane. The compiler does, however, generate output files with more valuable content within each package. We have written PowerShell code for analyzing and interpreting the compiler results of Microsoft Dynamics 365 for Finance and Operations in a more meaningful way.

The BuildModelResult.log and .xml files have the details of your errors, warnings and tasks. The following script parses these files and counts the warnings and errors, giving you a better idea of the remaining work during your implementation and upgrade:

$displayErrorsOnly = $false # set to $true to list only packages that have errors
$rootDirectory = "C:\AOSService\PackagesLocalDirectory"

# Find the BuildModelResult.log file in every package folder
$results = Get-ChildItem -Path $rootDirectory -Filter BuildModelResult.log -Recurse -Depth 1 -ErrorAction SilentlyContinue -Force
$totalErrors = 0
$totalWarnings = 0

foreach ($result in $results)
{
    try
    {
        # The log contains summary lines such as "Errors: <n>" and "Warnings: <n>";
        # take the last token of each line as the count.
        $errorText = Select-String -LiteralPath $result.FullName -Pattern "^Errors:" | ForEach-Object {$_.Line}
        $errorCount = [int]$errorText.Split()[-1]
        $totalErrors += $errorCount

        $warningText = Select-String -LiteralPath $result.FullName -Pattern "^Warnings:" | ForEach-Object {$_.Line}
        $warningCount = [int]$warningText.Split()[-1]
        $totalWarnings += $warningCount

        if ($displayErrorsOnly -eq $true -and $errorCount -eq 0)
        {
            continue
        }

        Write-Host "$($result.DirectoryName)\$($result.Name) " -NoNewline
        if ($errorCount -gt 0)
        {
            Write-Host " $errorText" -NoNewline -ForegroundColor Red
        }
        if ($warningCount -gt 0)
        {
            Write-Host " $warningText" -ForegroundColor Yellow
        }
        else
        {
            Write-Host
        }
    }
    catch
    {
        Write-Host
        Write-Host "Error during processing"
    }
}

Write-Host "Total Errors: $totalErrors" -ForegroundColor Red
Write-Host "Total Warnings: $totalWarnings" -ForegroundColor Yellow

The compiler results are displayed in the following format as an overview:

[Screenshot: compiler results overview]

If you want to do a detailed analysis, we also have PowerShell scripts prepared for extracting the errors and saving them to a CSV file for further processing. Opening it with Excel allows you to format the results into a table.
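As a rough illustration of that idea (the exact error line format in the log varies between platform updates, so treat the search pattern as an assumption to adjust), the extraction can look like this:

# Collect lines that look like compiler errors from every BuildModelResult.log
# and export them to a CSV file for further slicing in Excel.
$rootDirectory = "C:\AOSService\PackagesLocalDirectory"
$outputFile = "C:\Temp\CompilerErrors.csv"

Get-ChildItem -Path $rootDirectory -Filter BuildModelResult.log -Recurse -Depth 1 -ErrorAction SilentlyContinue |
    Select-String -Pattern "Compile Fatal Error|Error:" |
    ForEach-Object {
        [PSCustomObject]@{
            Package = Split-Path (Split-Path $_.Path -Parent) -Leaf
            Line    = $_.LineNumber
            Text    = $_.Line.Trim()
        }
    } |
    Export-Csv -Path $outputFile -NoTypeInformation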

We typically copy all error texts to a new sheet, remove the duplicate entries, then count the occurrences of each error to see whether there is any low-hanging fruit to fix first. The following formula counts how many rows in the extracted warning sheet (column D) match the error text in cell A2 of the summary sheet:

=SUMPRODUCT(('JADOperation-Warning'!$D$2:$D$1721=Sheet1!A2)+0)

This makes it much easier to decide what to fix first and what to fix next, and to delegate tasks.

Source code for all four scripts is available on GitHub.

Archiving SQL database backups using Azure blob storage

It is a good practice to keep multiple copies of our most precious data. If you run on-premises SQL Server databases for AX 2012 or Dynamics 365 Finance and Operations, archiving SQL database backups to offsite locations is a must. I have built automation for archiving SQL database backups using Azure Blob Storage.

Overview of the processes

Maintenance regime

Our maintenance regime looks like the following:

  • 1x Weekly Full backup
  • 6x Daily Differential backup
  • 15-minute transaction log backups
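We schedule these jobs on the SQL Server side; purely as an illustrative sketch of the three backup types (the instance and database names are placeholders, and using the SqlServer PowerShell module is an assumption rather than how our jobs are actually defined), they map to the following commands:

Import-Module SqlServer

$instance = "SQLPRIMARY"   # placeholder primary instance
$database = "AxDB"         # placeholder database name

# Weekly full backup
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupFile "F:\Backup\AxDB_full.bak"

# Daily differential backup
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupFile "F:\Backup\AxDB_diff.bak" -Incremental

# Transaction log backup, scheduled every 15 minutes
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupFile "F:\Backup\AxDB_log.trn" -BackupAction Log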

They are captured locally on the primary SQL instance, so the timestamps of the last successful backups are kept in our AlwaysOn cluster. Then we move the files to a shared network storage which is visible to both High Availability sites, in case there is an outage and we need to fail over and restore data.

Because of the close geographical proximity of the two sites, we needed an additional layer of safety in case of a natural disaster.

Archiving offsite

Every night we run a PowerShell script that uses the AzCopy utility to upload our backup files to a Microsoft Azure cloud storage account.
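The upload itself is a single AzCopy call; as a sketch using AzCopy v10 syntax (the share path, storage account, container and SAS token are placeholders), it looks like this:

# Nightly upload of the backup share to a blob container.
$source      = "\\backupshare\SQLBackups"
$destination = "https://mystorageaccount.blob.core.windows.net/sqlbackups?<SAS-token>"

& "C:\Tools\azcopy.exe" copy $source $destination --recursive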

You are paying for the network usage (I/O) and for the size occupied on the disks, so it is a good idea to have some sort of housekeeping. Our solution was to use an Azure Automation runbook to determine what to keep and what to delete. The current setup is to keep one full backup per quarter for a year (4x 400 GB), plus all full / differential / transaction log files for the last month (4x 400 GB + 26x 10 GB).
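The runbook logic can be as simple as deleting every blob older than the retention window while sparing the ones flagged for quarterly retention. A sketch using the Az.Storage cmdlets (the account name, container, Automation variable and blob naming convention are all assumptions):

# Azure Automation runbook sketch: keep the last month of backups and anything
# marked as a quarterly full backup, delete the rest.
$key       = Get-AutomationVariable -Name "StorageAccountKey"   # assumed Automation variable
$context   = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $key
$container = "sqlbackups"
$cutoff    = (Get-Date).AddDays(-35)

Get-AzStorageBlob -Container $container -Context $context |
    Where-Object { $_.LastModified -lt $cutoff -and $_.Name -notlike "*quarterly*" } |
    ForEach-Object { Remove-AzStorageBlob -Blob $_.Name -Container $container -Context $context }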

This puts the total size occupied at around 4 TB, which costs about 35 GBP a month using cold storage. The price goes up if you also want to use a hot tier for the latest backup set, which is useful if you need to pull a backup back from the cloud on short notice.
