Data Migration Toolkit for Dynamics 365 Experience Overview

Introduction

Previously, I wrote a post about a data upgrade in self-service environments. This time I would like to share my experience with the "Data Migration Toolkit for Dynamics 365". 

Parameters

Based on the standard documentation, to optimize the replication latency/performance, you can update the following distributor parameters in the App.config file:

  • MaxBcpThreads – By default, this parameter is set to 6. If the machine has fewer than six cores, update the value to the number of cores (see the query sketch after this list). The maximum value that you can set is 8.

  • NumberOfPublishers – By default, this parameter is set to 2. The common recommendation is to use this value. However, there can be situations where you may want to increase the number of publishers to distribute a smaller number of tables to each publisher. This, in conjunction with the manual snapshot start process, allows you to run smaller initial snapshots, which can be useful if you have limited maintenance windows and must split the startup of the replication across several of them.

  • SnapshotPostPublication – This option adds a 5-minute delay between the starts of automatic snapshot processes, which can help reduce the load on the source server. The toolkit also allows manual snapshot starts; if you choose that option, you don't need to set this parameter.
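
Since MaxBcpThreads should not exceed the number of cores on smaller machines, it can be useful to check how many logical CPUs SQL Server actually sees on the source machine. A minimal sketch:

    -- Number of logical CPUs visible to SQL Server (to size MaxBcpThreads)
    SELECT cpu_count
    FROM sys.dm_os_sys_info;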

Let me share my experience with the parameters:

  • MaxBcpThreads – I adjusted the parameter according to the recommendations. 

  • NumberOfPublishers – In my data migration projects, if the AX 2012 database was not that large (30-50 GB), I used the default NumberOfPublishers value. If the database is 80 GB or bigger, it is better to select “Yes” for the “Manual startup” option on Step 5 when setting up the replication, then run each publication one at a time and wait for it to push the snapshot before you start the next one (a query sketch for checking snapshot status follows this list). Also, in the toolkit config file “DataMigrationTool.exe.config”, for larger databases it is better to set the number of publishers to 4 or 6. The advantage is that if one of the publishers has an issue, you then have to reinitialize only 25% or less of the data.

  • SnapshotPostPublication – I have not changed this parameter. Each time the process worked fine with the default value.
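
If you go with the manual snapshot startup, you need to know when a publication's initial snapshot has finished before starting the next one. The replication monitor shows this, and you can also query the snapshot agents directly; a sketch, assuming the default distribution database name on the AX 2012 SQL Server:

    -- Latest run status of each snapshot agent
    -- (runstatus: 1=Start, 2=Succeed, 3=In progress, 4=Idle, 5=Retry, 6=Fail)
    USE distribution;

    SELECT a.publication,
           h.runstatus,
           h.start_time,
           h.comments
    FROM dbo.MSsnapshot_agents a
    JOIN dbo.MSsnapshot_history h ON h.agent_id = a.id
    WHERE h.time = (SELECT MAX(time)
                    FROM dbo.MSsnapshot_history
                    WHERE agent_id = a.id);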

One more piece of advice:

Each time I need to perform the data migration in a Tier-2 or higher environment, I deploy a new environment in LCS from scratch. I once tried to perform the data migration in the same LCS self-service environment, but the process failed and I could not resume it. Since then, I prefer to deploy a new LCS Tier-2 or higher environment for the data migration rather than waste time on unnecessary troubleshooting.

And don't forget to clean up extra data in AX 2012 with the standard cleanup operations. This can reduce the time required for the data upgrade, and you may avoid the issues mentioned below.
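
To decide where cleanup will pay off, it helps to know which tables take up the most space in the AX 2012 database. A simple sketch based on reserved pages (sizes are approximate):

    -- Top 20 tables by reserved size in the current (AX 2012) database
    SELECT TOP (20)
           t.name AS TableName,
           SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB,
           SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS TableRows
    FROM sys.dm_db_partition_stats ps
    JOIN sys.tables t ON t.object_id = ps.object_id
    GROUP BY t.name
    ORDER BY ReservedMB DESC;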

Experienced issues

There is a brilliant troubleshooting guide from Microsoft. It is really useful and contains reasonable advice; if you have any issues with the data upgrade, look at it first. It is constantly being updated by Microsoft; below, however, are the issues I experienced that had not yet been described in the guide.

Batch step failed with a time-out

The message was: "The postsync-wait for batch step is in In-progress state for more than 240 minutes."

I asked the Microsoft support team for advice, and the response was:

In some scenarios it is expected behaviour for the sandbox environment to take a longer time to finish; hence we request you to wait for the step to be completed using the DS command, then resume the operation.

If you can get JIT access, you can run the following SQL query to check the queries currently running in the database; you may need to run it a few times:

    -- Currently executing requests, with blocking info, elapsed time, and the SQL text
    SELECT
        SPID          = er.session_id,
        Status        = ses.status,
        [Login]       = ses.login_name,
        Host          = ses.host_name,
        BlkBy         = er.blocking_session_id,
        DBName        = DB_NAME(er.database_id),
        CommandType   = er.command,
        ObjectName    = OBJECT_NAME(st.objectid),
        CPUTime       = er.cpu_time,
        StartTime     = er.start_time,
        TimeElapsed   = CAST(GETDATE() - er.start_time AS TIME),
        SQLStatement  = st.text
    FROM sys.dm_exec_requests er
        OUTER APPLY sys.dm_exec_sql_text(er.sql_handle) st
        LEFT JOIN sys.dm_exec_sessions ses
            ON ses.session_id = er.session_id
    WHERE st.text IS NOT NULL

Once the final step is "Completed", you can then do a Resume from the toolkit. This will quickly cycle through the servicing again in LCS to get the environment status to "Completed". That might take around 10-20 minutes, sometimes less.

In my particular case, this solution worked.

DBSync step failed with a time-out

This happens because of the size of the tables and the precision change on numeric fields between AX 2012 and D365. SQL Server is exceptionally slow when doing an ALTER TABLE on these data types. If a table is larger than 5 GB in AX 2012 and this type change was applied to its fields in D365, you will probably face a synchronization issue.
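
To anticipate which tables may hit this, you can look in the AX 2012 database for large tables that contain NUMERIC/DECIMAL columns. A rough sketch; the 5 GB threshold is only a rule of thumb from my experience:

    -- Tables larger than ~5 GB that have NUMERIC/DECIMAL columns,
    -- i.e. the likely candidates for a slow ALTER TABLE during DBSync
    ;WITH TableSize AS (
        SELECT ps.object_id,
               SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB
        FROM sys.dm_db_partition_stats ps
        GROUP BY ps.object_id
    )
    SELECT t.name AS TableName, s.ReservedMB
    FROM sys.tables t
    JOIN TableSize s ON s.object_id = t.object_id
    WHERE s.ReservedMB > 5 * 1024
      AND EXISTS (SELECT 1
                  FROM sys.columns c
                  JOIN sys.types ty ON ty.user_type_id = c.user_type_id
                  WHERE c.object_id = t.object_id
                    AND ty.name IN ('numeric', 'decimal'))
    ORDER BY s.ReservedMB DESC;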

To get past the timeout, run DS from the toolkit and then resume; the step will get through, but it could time out on another table.

If there are multiple huge tables in AX 2012, I would recommend asking Microsoft for support. They have a solution for this called the Shadow Copy sync.

INVENTSUMLOGTTS table size

In some cases, the INVENTSUMLOGTTS table can be huge. In AX 2012 the INVENTSUMLOGTTS table is used for two purposes:

  • As part of the Inventory Multi Transaction System (IMTS), in order to roll back unsuccessful transactions.

  • It allows Master scheduling to do an update based on changes to the on-hand inventory. After the update, the records are deleted by Master scheduling.

When a customer is registered for Master planning but does not use Master scheduling, this table will continue to grow and will never get cleared out. Running Master scheduling will automatically clear the contents of this table.

If you are registered for Master planning but are not running Master scheduling, you can manually purge the table through SQL with the following statement:

    DELETE FROM INVENTSUMLOGTTS WHERE ISCOMMITTED = 1
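
If the table is very large, a single DELETE can bloat the transaction log and run for a long time. As an alternative, here is a minimal batched-delete sketch; the 100,000-row batch size is an assumption you can tune:

    -- Delete committed INVENTSUMLOGTTS records in batches to keep the log manageable
    DECLARE @rows INT = 1;
    WHILE @rows > 0
    BEGIN
        DELETE TOP (100000) FROM INVENTSUMLOGTTS WHERE ISCOMMITTED = 1;
        SET @rows = @@ROWCOUNT;
    END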
