Live sync


For live sync to work correctly, and to start at all, you'll need to meet specific prerequisites. All changes that need to happen for a particular data source should be in the same transaction. Entities shouldn't have set-based updates or inserts set up or currently running on them, because these operations won't trigger business events and, subsequently, dual-write. If the skipBusinessEvents property on the entity is set to true, live sync won't be triggered, and the entity will be skipped by live sync. Occasionally, the data sources of an entity are connected by using an outer join. If these data sources are marked as read-only outer joins, they won't be eligible for Track Changes, which live sync requires.

A sync is only triggered if modifications occur on mapped fields in finance and operations apps; in Dataverse, all field modifications will trigger a dual-write sync. Filter evaluations need to return a valid result; otherwise, live sync won't be triggered. Check the field maps on the dual-write project; if a field isn't mapped, it won't be tracked. Additionally, if an entity has multiple data sources and some of their fields aren't mapped, such as when one data source is only used for a relation, those fields won't trigger a live sync and won't be tracked.

If the mapping records are present in the DualWriteProjectConfiguration, DualWriteProjectFieldConfiguration, and BusinessEventsDefinition tables, they'll trigger live sync and changes will be tracked. If you find that maps are active in dual-write but missing from the listed tables, you can use the Excel add-in to change the records, add the extra fields and mapped data to them, and then ensure that they'll work with live sync.

Troubleshoot live sync

Troubleshooting live sync can be difficult. In this situation, you can use telemetry-based data from Dynamics Lifecycle Services to gather information: list table changes that lead up to business events, view the list of synced records, and review activity logs. The following commands describe how to use each option for troubleshooting needs.

| Command | Result |
| --- | --- |
| ProviderName == "Microsoft-Dynamics-AX-BusinessEventsRuntime" TaskName == "DualWriteRecordsCached" & "TableToEntityQueryProcessed" | List table changes leading up to the business events, with the number of records |
| ProviderName == "Microsoft-Dynamics-AX-DualWriteSync" & activityName == "DualWriteOutbound.WriteToCDS" | List the records synced from finance and operations apps to Dataverse |
| ProviderName == "Microsoft-Dynamics-AX-DualWriteSync" & activityName == "DualWriteInbound.WriteToEntity" | List the records synced from Dataverse to finance and operations apps |
| All Events for Activity = Activity ID | All logs for a particular activity ID |

Development environments also have Event Viewer logs for dual-write that might be helpful. You can find them in Event Viewer under Applications and Services Logs > Microsoft > Dynamics > AX-DualWriteSync. For more information about troubleshooting dual-write integration between finance and operations apps and Dataverse, see General troubleshooting.

Additionally, a debug mode is available for the finance and operations apps sync to Customer Engagement apps that helps you get detailed error logs. Even if no error has occurred, you can still use the activity ID to check the logs from Lifecycle Services or to find other useful information from this experience. In the Excel add-in, open the DualWriteProjectConfigurationEntity table and set the IsDebugMode field to True for one or more table mappings. After you publish these changes to the DualWriteProjectConfigurationEntity table for the maps that are having trouble, or when you want more information, you can access the logs in the DualWriteErrorLog table.

To look up the data in a table browser, go to the following link: https://999aos.cloudax.dynamics.com/?mi=SysTableBrowser&tableName=DualWriteErrorLog, replacing 999 as needed. Alternatively, you can query this table in SQL Server Management Studio on a development or Tier 2 environment. The DetailedErrorMessage column contains two parts:

  • VerboseError – Captures the long text of the error from Dataverse or the Customer Engagement apps environment.

  • DebugLogTrace – Captures the actual batch request that's sent to the Customer Engagement apps environment, along with the payload.
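
The SQL Server Management Studio lookup described above can be sketched as a short T-SQL query. The DualWriteErrorLog table and the DetailedErrorMessage column come from the text; the TOP (50) row cap is only an illustrative choice, and any other column names in your environment should be confirmed from the table's actual schema before filtering on them.

```sql
-- Minimal sketch for a development or Tier 2 environment:
-- start broad to discover the table's schema.
SELECT TOP (50) *
FROM DualWriteErrorLog;

-- Then narrow to the column described above, which holds
-- the VerboseError and DebugLogTrace details.
SELECT TOP (50) DetailedErrorMessage
FROM DualWriteErrorLog;
```

Running the broad query first avoids guessing column names; once you can see the schema, you can filter on the column that holds the activity identifier to correlate rows with Lifecycle Services logs.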

Again, you can find the ActivityId for the errors, which you can use to check all logs for the activity from Lifecycle Services environment monitoring.

If errors occur from Dataverse to finance and operations apps, you can troubleshoot by using the plug-in trace log in Customer Engagement apps. In the System Settings menu (Dynamics 365 environment > Settings > System > Administration > System Settings), on the Customization tab, you can turn on trace logging for the plug-in and custom workflow activity. Select All to enable the trace log for all activities; otherwise, you can specify to log only when exceptions occur. To view the trace log, open the Settings page in your Dataverse environment and, under Customization, select Plug-in trace log. Find the logs where the Type Name column is set to Microsoft.Dynamics.Integrator.DualWriteRuntime.Plugins.PreCommitPlugin. Double-click an item to view the full log, and then, on the Execution FastTab, review the message block text.

Performance troubleshooting and best practices

Performance needs ongoing attention to ensure that data moves from system to system as expected, without issues that slow down the system. To avoid unnecessary issues, consider the following information and best practices to make sure that the integration works as intended.

If you're performing a data migration, we recommend that you stop the dual-write maps. This action prevents business event triggers from syncing the data over live sync during the migration. After the data has been migrated, you can run the maps again so that the data is synced between systems.

To optimize performance and avoid overloading the apps, initial sync is limited to 500,000 rows for each implementation for each project. If more than 500,000 rows are needed in a run, we recommend that you migrate the data into finance and operations apps and Dataverse separately and then skip the initial synchronization. Additionally, if you're running an initial sync from Dataverse to finance and operations apps, the import result must be returned from finance and operations apps within 24 hours; otherwise, a time-out will occur.

A limit of 40 legal entities is enforced while the environments are being linked. If you try to enable maps where more than 40 legal entities are linked between environments, you’ll receive an error during initial synchronization. However, live synchronization supports up to 250 legal entities for each transaction. For more information, see Dual-write limits for live synchronization.

You can use computed columns and virtual columns on finance and operations apps entities. Make sure that you monitor these columns for performance overhead from the extra logic that's needed for reads and writes of these fields. If you're using virtual columns to transform or calculate extra values through X++ and expect to return the result to Dataverse in the same transaction, know that this action isn't allowed and will fail. Plan to change how this process works to ensure that dual-write and the linked environments function properly.