Tier-1 (CHE) and Unified Developer Experience (UDE) environments build error: Another build is in progress.

In some cases, you may face an issue with a model or project build in a Tier-1 environment. The error can be "another x++ build is currently running" or "another build is in progress".

Advice on how to work around this problem can easily be found on the Internet. For example, you can restart your virtual machine or kill the build process (xppcAgent) manually and try again.

However, it is worth knowing the actual reason for this issue. In Visual Studio 2022, there is a setting called "Build Modules in Parallel". If it is enabled, you might constantly face the error mentioned above.

So, it makes sense to check and disable this setting in order to improve build stability in Tier-1 or Unified Developer Experience (UDE) environments.



D365 SCM warehouse mobile device development approach. Macros or constants.

As you might know, there are a lot of controls and related macro commands in the WHS classes. It is not really convenient to search for macro commands in code, since they are not supported by the cross-reference feature.

In D365 Finance and Operations, there is an option that can help simplify the development of warehouse mobile flows and the tracking of the existing commands: constants.

In my opinion, the best example of this option is the "ProcessGuideDataTypeNames" class. Inside the class, you can find a lot of constants that are used in the mobile device flows.

As you can see, macros are used at the class declaration level, but with a specific reference to a value in the WHSRF macro. As a result, you can use cross-references to find all the places in the code where the constants are used.
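Here is a minimal sketch of that pattern (the constant name and macro value are hypothetical; see the real "ProcessGuideDataTypeNames" class for the actual members):

public class ProcessGuideDataTypeNamesSketch
{
    #WHSRF

    /// <summary>
    /// The name of an example data type. This documentation comment is picked up
    /// by the language service and shown to the developer.
    /// </summary>
    public const str ExampleDataTypeName = #ExampleDataType; // hypothetical value defined in the WHSRF macro
}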

In general, constants have the following advantages over macros:
  • You can add a documentation comment to a constant but not to the value of a macro. The language service will pick up this comment and provide useful information to the user.
  • A constant is known by IntelliSense.
  • A constant is cross-referenced. Therefore, you can find all references for a specific constant, but not for a macro.
  • A constant is subject to access modifiers. You can use the private, protected, and public modifiers. The accessibility of macros isn't rigorously defined.
  • Constant variables have scope, whereas macros don't.
  • You can see the value of a constant or a read-only variable in the debugger.
  • You have full control of the constant's type.
  • You can declare a constant as a variable. The compiler maintains the invariant that its value can't be modified.
  • Heavy use of macros has a significant negative effect on compiler performance, which constants avoid.
Considering all the above, I would recommend using constants instead of macros in general, not only in WHS mobile device flow development.

Ax 2012 data upgrade in Tier-1 development environments (CHE). A parameter cannot be found that matches parameter name "TrustServerCertificate".

When I ran a data upgrade using the Data Upgrade 10.0.41 package, I faced an issue:

Executing step: 3
GlobalUpdate script for service model: AOSService on machine: localhost
perform data upgrade, sync AX database and deploy SSRS report
A parameter cannot be found that matches parameter name 'TrustServerCertificate'.
The step failed.

On the Internet, I found that there might be a problem with the version of the SQL Server PowerShell module. When I installed the latest 22.x.x version, I was able to resume the process. I performed the following steps:

Within a PowerShell prompt, I ran the following command: 
(Get-Command Invoke-SqlCmd).Module

In my case, I had version 15.0.
In order to install the latest version, I ran the command:
Install-Module -Name SqlServer -AllowClobber

When the process was completed, I ran the following PowerShell command to check the versions again: 
Get-Module -ListAvailable SqlServer, SqlPs

As a result, I saw version 22.x.x, and I was able to continue with the data upgrade using the command:
AXUpdateInstaller.exe execute -runbookid="MajorVersionDataUpgrade-runbook" -rerunstep=3

New dev tools service requirements to upgrade Tier-1 environments (CHE) to version 10.0.42 or higher.

If you are going to upgrade your developer cloud-hosted (OneBox) environments to version 10.0.42 or higher, you should keep in mind that it is impossible without the Visual Studio 2022 components.

If the Visual Studio 2022 components are missing from your environment during the upgrade, you will see an error:

Executing step: 29
Update script for service model: DevToolsService on machine: localhost update DevTools service
The Dynamics 365 F&O Development Tools extension folder for Visual Studio was not found
The step failed.

In my opinion, there are 2 ways to solve the issue:

  • Install Visual Studio 2022 on the environment
  • Deploy a new environment with version 10.0.42 on the LCS, or, if you use local machines, create one from the VHD with version 10.0.39 and upgrade it to 10.0.42.
You can find an official announcement here.

How to run batch tasks using the SysOperation framework

Overview

As you may know, the system has batch tasks functionality. It can be used to create a chain of operations when you want to set the order of task execution according to certain criteria.
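For reference, such a chain can also be assembled from X++ code. Below is a minimal sketch; the controller class names are hypothetical, while BatchHeader, addTask, and addDependency are the standard APIs:

BatchHeader batchHeader = BatchHeader::construct();
batchHeader.parmCaption('Data correction chain');

// Hypothetical SysOperation controllers; replace them with your own classes.
SysOperationServiceController firstTask  = new MyFirstTaskController();
SysOperationServiceController secondTask = new MySecondTaskController();

batchHeader.addTask(firstTask);
batchHeader.addTask(secondTask);

// The second task starts only after the first one has finished successfully.
batchHeader.addDependency(secondTask, firstTask, BatchDependencyStatus::Finished);

batchHeader.save();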

Issue description


When you create a new batch task, you have to specify the "Class name" field. If you know the class name in advance, you can enter it directly without waiting for the lookup list. Theoretically, not all classes are supposed to be used with the mentioned feature. In this case, the following warning message will appear:
"The specified class is not fully designed for use in the Batch job form. Execution might provide unexpected results."

If you ignore this message and try to define the class parameters, you will see the error:
"Access denied: Class name"

If you use the "RunBaseBatch" framework, it is required to override the "canGoBatchJournal" method and set its return value to "true".
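For RunBaseBatch classes, the override is a one-liner; a minimal sketch:

public boolean canGoBatchJournal()
{
    return true; // allow the class to be selected in the Batch job form
}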

If you use the new SysOperation framework, it is required to decorate the controller class with the "[SysOperationJournaledParametersAttribute(true)]" attribute:

[SysOperationJournaledParameters(true)]
class WHSReleaseOutboundShipmentOrderToWarehouseController

If you add this attribute, the warning message "The specified class is not fully designed for use in the Batch job form. Execution might provide unexpected results." will no longer appear.

However, if you try to set the parameters, you will still see the error: "Access denied: Class name".

In order to solve it, it is required to add to the "new" method a reference to the service class and method. You can take a look at the "WHSReleaseOutboundShipmentOrderToWarehouseController" class to get an idea of the pattern.

void new(
    IdentifierName _className  = '',
    IdentifierName _methodName = '',
    SysOperationExecutionMode _executionMode = SysOperationExecutionMode::Synchronous)
{
    // Fall back to the service class and method if the caller did not supply them.
    IdentifierName parmClassName = _className != '' ? _className :
        classStr(WHSReleaseOutboundShipmentOrderToWarehouseService);

    IdentifierName parmMethodName = _methodName != '' ? _methodName :
        methodStr(WHSReleaseOutboundShipmentOrderToWarehouseService,
            autoReleaseOutboundShipmentOrders);

    super(parmClassName, parmMethodName, _executionMode);

    this.parmDialogCaption(
        WHSReleaseOutboundShipmentOrderToWarehouseController::description());
}

When the "new" method is executed, the framework looks at the "run" method and determines the contract based on the input parameters. The call to "startOperation" method causes the system to build the user interface and execute code based on the user’s actions. A dialog is created using data types specified in data contract member methods. This explains why the new method requires class and method names. The third parameter specifies the execution mode, which takes effect if we programmatically execute the controller. Synchronous execution runs in line with the current process unless the user chooses to run it in the background.

So, the reason for the error "Access denied: Class name" was that the service class name and method name were not set by default in the "new" method. As a result, the default UI builder class did not have a reference to the contract class, so parameters could not be set for batch tasks.

If you experience other challenges with the SysOperation batch tasks feature, you can use the "WHSReleaseOutboundShipmentOrderToWarehouse*" classes as a guide.

Data upgrade from AX 2012 in development environments: database collation change.

Overview

As you may know, the collation of the AX 2012 database must be SQL_Latin1_General_CP1_CI_AS when you perform a data upgrade in development environments. If your database has a different collation, you have to change it. Otherwise, you will experience weird errors during the data upgrade process. I would like to share my experience with this process.

Issue description

In order to change the collation of the AX 2012 database, you can follow this guide: Change the database collation for development environments. When I tried to use the export command from the mentioned guide:

SqlPackage.exe /Action:Export /SourceServerName:localhost /SourceDatabaseName:MicrosoftDynamicsAX /TargetFile:"C:\Temp\MicrosoftDynamicsAX.bacpac" /Properties:CommandTimeout=1200 /Properties:VerifyFullTextDocumentTypesSupported=False

The following error appeared:

*** Changes to connection setting default values were incorporated in a recent release.  More information is available at https://aka.ms/dacfx-connection
*** Error exporting database:Could not connect to database server.
A connection was successfully established with the server, but then an error occurred during the login process. 
(provider: SSL Provider, error: 0 - The certificate chain was issued by an authority that is not trusted.)
The certificate chain was issued by an authority that is not trusted
*** The settings for connection encryption or server certificate trust may lead to connection failure if the server is not properly configured.

The cause of the error is that the server may not have encryption enabled, or the configured certificate may not have been issued by a trusted certificate authority (for example, a self-signed certificate).

In order to avoid this error, I used the following guide: SqlPackage Export parameters and properties. As a result, I modified the export command given in the guide:

SqlPackage.exe /Action:Export /SourceServerName:localhost /SourceDatabaseName:MicrosoftDynamicsAX /TargetFile:"C:\Temp\MicrosoftDynamicsAX.bacpac" /Properties:CommandTimeout=4200 /Properties:VerifyFullTextDocumentTypesSupported=False /SourceTrustServerCertificate:True

The idea is to add the "/SourceTrustServerCertificate:True" parameter. It makes SqlPackage use TLS to encrypt the source database connection while bypassing walking the certificate chain to validate trust. In other words, the command now trusts the server certificate instead of failing the validation. In addition, I extended the timeout to 4,200 seconds.

When you change the "Collation" property in the model.xml file, you need to import the *.bacpac file back into the database server to create a new database. Therefore, it makes sense to apply a similar change to the import command. The idea of the changes is the same: trust the server's certificate and extend the timeout.
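For reference, inside the model.xml file extracted from the *.bacpac, the property in question looks roughly like this (a sketch; the surrounding elements are omitted):

<Property Name="Collation" Value="SQL_Latin1_General_CP1_CI_AS" />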

As a result, I modified the import command given in the guide: 

SqlPackage.exe /Action:Import /SourceFile:"C:\Temp\MicrosoftDynamicsAX.bacpac" /TargetServerName:localhost /TargetDatabaseName:MicrosoftDynamicsAX_NewCollation /Properties:CommandTimeout=4200 /ModelFilePath:"C:\Temp\model.xml" /TargetTrustServerCertificate:True

After that, I was able to import the *.bacpac file into a new database with the SQL_Latin1_General_CP1_CI_AS collation.

Ax 2012 data upgrade on a self-service environment hosted in the Europe region (eu.lcs.dynamics.com).

When you run the DataMigrationTool.exe application, a console window opens where you can specify the cloud environment type:

  • Public: [ lcs.dynamics.com ]
  • GCC: [ gov.lcs.microsoftdynamics.us ]
  • UAE: [ uae.lcs.dynamics.com ]


As you can see, there is no option for the EU (European) LCS geography. As a result, if your environment is hosted in the European LCS region, you will not be able to proceed with the data upgrade.

To solve this issue, you need to do the following:

    1. Edit the "DataMigrationTool.exe.config" file. It is located in the folder where you extracted
    the Data Migration Toolkit.

    2. You need to find the lines: 
    <add key="lcsApiEndpoint" value="https://lcsapi.lcs.dynamics.com/" />
    <add key="lcsApiEndpoint_us" value="https://lcsapi.gov.lcs.microsoftdynamics.us/" />
    <add key="lcsApiEndpoint_uae" value="https://lcsapi.uae.lcs.dynamics.com/" />

    3. Modify the line: 
    <add key="lcsApiEndpoint" value="https://lcsapi.lcs.dynamics.com/" />
    It should be like this:
    <add key="lcsApiEndpoint" value="https://lcsapi.eu.lcs.dynamics.com/" />

    4. Run the DataMigrationTool.exe application again and select the default option. 
    It should now point to the EU LCS geography.

As a result, you should be able to proceed with the data upgrade on the self-service environment hosted in the Europe region (https://eu.lcs.dynamics.com/).

Dynamics 365 Finance and Operations. New development features in version 10.0.40 PU64.

In version 10.0.40 (PU64), new features for developers have been introduced.

First of all, it is now possible to add modified objects to the current project automatically, which means no more worries about adding objects to the right project. In order to enable this behavior, the "Add files to solution on save" option should be enabled.


The second feature is related to the build process: it is now possible to select the label files to be built.



By default, all label files are compiled. If you would like to build only labels or speed up the build process, you can use this option.

D365 Finance and Operations. Data modifications in Production and Sandbox environments.

Introduction

If you migrate from one of the previous versions (Ax 2009/Ax 2012), you might have a pack of jobs for correcting data in typical cases. From my perspective, it makes sense to adjust, test, and move those jobs to D365 Finance and Operations so that you can use them on demand as before.

In addition, you can create and use a set of classes for data corrections that can be applied on demand while the IT team is solving the root cause of the data inconsistencies.

Anyway, the question remains: how can we fix urgent data issues in Production or Sandbox environments in D365 Finance and Operations?

Important: Please keep in mind that any data adjustment in the Production environment must be tested in a Prod copy (Sandbox UAT) environment first. Do not apply data modification operations in the Production environment if they have not been tested previously. In the Production environment, you have no second attempt if the data modification goes wrong.

Overview

In Ax 2012, we had access to the AOT and could develop a job, or open a table via the table browser and adjust data in close to real time. In D365 Finance and Operations Production or Sandbox environments, there is no AOT anymore, and the table browser can be opened only in view mode.

Luckily, there are some ways to correct data in D365 Finance and Operations as well.

•  First of all, it is still possible to develop something resembling jobs. In D365 Finance and
    Operations it is called "X++ scripts with zero downtime". I described this standard feature
    in one of my previous posts. With this feature, you can call the standard system business
    logic and use all available X++ commands. In my opinion, it is the best option if you can’t
    deploy (for some reason) a new deployable package with the new jobs into the production
    environment.

•  Another option is using the Microsoft Dynamics Excel add-in and data entities. If you
    have data entities for the required tables/forms, and there are not too many records in the
    table/form, that can be a convenient option. From my perspective, this option mostly applies
    to master data. In addition, the standard entities have edit restrictions similar to their
    source tables. If you would like to create new entities for data modifications, you may need
    to copy standard entities and change their field properties. If you choose this way, you need
    to keep this approach in mind and develop the required entities, or add the required fields
    to the standard entities, in advance. However, this is also a way of cheating, as you may
    bypass some system restrictions, so you need to be sure that you understand all the
    consequences of the changes.

•  Another available option is using the Open Data Protocol (OData). In this case, you need to
    use the Postman application and connect it to the D365 Finance and Operations instance.
    Please keep in mind that the data entities you are going to use should have the "Public"
    property set to "Yes", and the "Public Collection Name" property should be populated.
    As I mentioned above, you can use the standard entities or create new ones for the data
    modification purpose.

    On the Internet, you can find a lot of posts on how to use Postman with D365 Finance and
    Operations OData entities. Just to give you an idea, there are a couple of examples:
    link1, link2. A request sketch is shown right below.
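    A hypothetical example of such a request (the entity, key values, and field name are
    illustrative; the URL and bearer token depend on your environment and app registration):

    PATCH https://<your-environment>.operations.dynamics.com/data/CustomersV3(dataAreaId='usmf',CustomerAccount='US-001')
    Authorization: Bearer <access token>
    Content-Type: application/json

    {
        "PaymentTerms": "Net30"
    }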

The next options are not the best ones, if you ask me. They can be applied only if you know exactly what you are doing and they are your "last hope". With these options, you make direct changes in the database and bypass validation or other code that would normally run.

•  The next option can be using the Just-In-Time (JIT) database access.
    From a data manipulation perspective, it is the same opportunity you had in Ax 2012 when you
    connected to the Ax 2012 SQL database via SQL Server Management Studio. In this way,
    you can create, edit, and delete records in Production and Sandbox database tables directly.
    However, this is not really recommended, as you may bypass validation or other code
    that would normally run if you were using the system forms and table methods for changing
    the data. Before introducing any changes in the tables via SQL, you have to understand all
    the consequences of the changes (see the cautionary sketch after this list).

•  The final option is to use a custom tool for data correction. As far as I know, there are some
    available tools and extensions, for example, this one; you can find others. You should be
    aware that Microsoft may decline support cases if they find this kind of tool in your
    Prod environment. I have seen such a notification in one of the posts on the Microsoft
    Community forum. Unfortunately, I could not find it to share the link.

Conclusion

As you may see, there are different options for data correction. Most likely, you will be able to solve any data issue with one of them. My personal advice would be to collect jobs/classes and Postman queries. As a result, you will have a list of tested classes and Postman requests for data corrections that you can use on demand.

I would recommend using Just-In-Time (JIT) database access really carefully. Before introducing any changes with this option, you have to understand all the consequences of the changes.

Troubleshooting the package installation to a Tier-1 environment by using LCS

Introduction

When you install a package in a Tier-1 environment by using the LCS portal, the process completes without any issue in 99% of cases. Unfortunately, there is still the 1% where you may experience various issues.

Issue types


Disk space

Before you install a binary package in a Tier-1 environment via the LCS portal (Maintain – Apply updates), please make sure that there is enough space on the disk where the *\AosService\PackageLocalDirectory folder is hosted. Usually, it is the "K" drive.

Otherwise, the preparation process can fail and the LCS environment will be in the "Preparation failed" state. If you download the log, it will be empty and the reason for the failure can be unclear.

Visual Studio

If the process fails at the preparation stage or later, the downloaded log contains data about the preparation steps only, and the "PreServicing-xxxxxxxxxxxxxxxxxxxx.error" file contains only one line:

The specified module 'C:\Program Files\Microsoft Security Client\MpProvider' was not loaded because no valid module file was found in any module directory.

It may mean that a Visual Studio instance was open in the environment when you started the process. In this case, you need to log in to the environment via an RDP connection. Then you should close the Visual Studio instances in all user sessions, close your RDP connection, and resume the process.

If you see no open Visual Studio instances in the environment, you need to open Visual Studio in your own session. Most probably, it will install some updates or components in the background. You need to keep it open for 5-10 minutes. Then you should close the Visual Studio instance, close your RDP connection, and resume the process.

Maintenance mode

Occasionally, the environment can be in maintenance mode. In this case, you can face an issue at the step "GlobalUpdate script for service model: AOSService on machine: Machine Name", and the environment will be in the "Failed" state.

In this case, you need to log in to the environment via an RDP connection and check the services (Control Panel\System and Security\Administrative Tools\Services). If you see that the "Microsoft Dynamics 365 Unified Operations: Batch Management Service" is stopped and cannot be started, that is the root cause of the issue. You need to be able to start this service; otherwise, you can't resume the installation process.

In order to find the reason, you need to take a look at the Windows event logs.

If you see the following three errors in a row, it may mean that the environment is in maintenance mode.

  The description for Event ID 110 from source Microsoft Dynamics 365 for Finance and 
    Operations cannot be found. Either the component that raises this event is not installed on 
    your local computer or the installation is corrupted. 
    You can install or repair the component on the local computer.

    If the event originated on another computer, the display information had to be saved with the 
    event. The following information was included in the event:

    AX is shutting down due to an error.
    Microsoft.Dynamics.AX.Batch.BatchService+MaintenanceModeBatchStartupException: 
    Exception of type
    'Microsoft.Dynamics.AX.Batch.BatchService+MaintenanceModeBatchStartupException' 
    was thrown.
    at Microsoft.Dynamics.AX.Batch.BatchService.OnStart(String[] args)
    Exception details:
    Microsoft.Dynamics.AX.Batch.BatchService+MaintenanceModeBatchStartupException: 
    Exception of type
    'Microsoft.Dynamics.AX.Batch.BatchService+MaintenanceModeBatchStartupException' 
    was thrown.
    at Microsoft.Dynamics.AX.Batch.BatchService.OnStart(String[] args)

  Faulting application name: Batch.exe, version: 7.0.7198.49, time stamp: 0xb4453638
    Faulting module name: unknown, version: 0.0.0.0, time stamp: 0x00000000
    Exception code: 0x80131623
    Fault offset: 0x00007ffb9cb54093
    Faulting process id: 0x1508
    Faulting application start time: 0x01da8cdf84d8bc61
    Faulting application path: K:\AosService\WebRoot\bin\Batch.exe
    Faulting module path: unknown
    Report Id: 6054eadb-7058-44e4-966c-fcd600a10af7
    Faulting package full name: 
    Faulting package-relative application ID:

 Application: Batch.exe
   Framework Version: v4.0.30319
   Description: The application requested process termination through System.Environment.FailFast(string message).
   Message: Tearing the process down due to an unhandled exception.
   Stack:
   at System.Environment.FailFast(System.String, System.Exception)
   at <Module>.FailFast(UInt32, System.String, System.Exception)
   at <Module>.OnAppDomainUnhandledException(System.Object,
   System.UnhandledExceptionEventArgs)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.ServiceProcess.ServiceBase.Run(System.ServiceProcess.ServiceBase[])
   at Microsoft.Dynamics.AX.Batch.Entrypoint.Main(System.String[])

To be on the safe side, you can follow the standard documentation: query data from the "SQLSYSTEMVARIABLES" table, and if you see that 'CONFIGURATIONMODE' is 1, the system is in maintenance mode.
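A minimal sketch of the check (the commented-out UPDATE is a workaround seen in community posts; treat it with care and prefer the documented way of turning maintenance mode off):

SELECT PARM, VALUE FROM SQLSYSTEMVARIABLES WHERE PARM = 'CONFIGURATIONMODE';
-- VALUE = 1 means the environment is in maintenance mode.
-- Community workaround (use with care, then restart the services):
-- UPDATE SQLSYSTEMVARIABLES SET VALUE = 0 WHERE PARM = 'CONFIGURATIONMODE';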

If it is true, you need to revert the environment to the normal state. Then you need to start the "Microsoft Dynamics 365 Unified Operations: Batch Management Service" manually. If it starts, you have fixed the issue. After that, you can close your RDP connection and resume the process.
