
Tuesday, April 15, 2025

How to integrate with Dynamics 365 Finance and Operations Part 2

3️⃣ DMF (Data Management Framework)

DMF allows you to import and export large volumes of data in D365FO using various source/target data files and services. This is the recommended way of data integration in D365FO when you need to transfer large amounts of data from other systems using the supported file formats and services. It also supports incremental data load/export with change tracking, plus a data staging feature for post-import validation and processing.



Import and export projects are processed using “data entities”, which are normalized views of D365FO data tables and can be added programmatically if necessary. There is a dedicated Data management workspace where you can set up data import and export projects, import actual data, browse the available data entities, and check integration mappings in graphical schemas. However useful it is, I cannot tell you that the DMF UI is a user-friendly one. Some parts of the “advanced” and “standard” views confuse users, even developers, and the field mapping and Unicode setup screens are equally confusing and do not always work as expected.

Every legacy file format that was supported by Dynamics AX before is still supported, and some new formats and SQL Server DB connection options are available, but there is still no support for modern data transfer file formats like JSON, Parquet, ARFF, Avro, etc. Also, for packages, you can only use ZIP compression; there is no support for GZIP, which is widely used for data transfer today. For the data file inside a package, CSV is the recommended option. As you can see in the analysis below from luminousmen.com, CSV is quite an outdated file format and falls far behind in every aspect compared to modern data transfer formats.

https://luminousmen.com/post/big-data-file-formats

DMF has REST APIs and Azure actions to import and export data from outside D365FO. You can find the Azure actions to be used in Logic Apps under the “DataManagementDefinitionGroups” section of “Execute action”:



You can find example code (and Logic Apps) for data import and export using DMF in the Dynamics-AX-Integration GitHub repository. However, if you expect a straightforward way of using these API commands to export or import your data, you will be disappointed. For example, to import a CSV file you need to create a ZIP package containing the file you want to import together with fixed Manifest and PackageHeader XML files, place it in a temporary container DMF provides for you, start the import with another action, poll in a loop until the import completes, and, if it fails, query various DMF actions to dig out the actual error so you can inform the user about the problem that occurred.
There is currently no other direct file import or export method available; you need to zip the file with those fixed Manifest and PackageHeader files (which you also need to store in a fixed location) and set up workflows like the ones in the example code to import them. It would be very nice to have one straightforward option in the future, though: one that receives the data file as input and returns consolidated error and log messages as output.
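Put together, the flow can be sketched in a few helper functions. This is a minimal sketch, not a definitive implementation: it assumes the documented data package REST actions (GetAzureWriteUrl, ImportFromPackage, GetExecutionSummaryStatus), a valid OAuth bearer token obtained elsewhere, and that you copy the Manifest.xml and PackageHeader.xml contents from a package exported from your own data project.

```python
import io
import json
import time
import urllib.request
import zipfile

def build_package(data_file_name: str, data: bytes,
                  manifest_xml: bytes, header_xml: bytes) -> bytes:
    """Assemble a DMF import package: the data file plus the fixed
    Manifest.xml and PackageHeader.xml files DMF expects."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr(data_file_name, data)
        z.writestr("Manifest.xml", manifest_xml)
        z.writestr("PackageHeader.xml", header_xml)
    return buf.getvalue()

def _post_action(base_url: str, token: str, action: str, body: dict) -> dict:
    # All package APIs are POST actions under DataManagementDefinitionGroups.
    url = (f"{base_url}/data/DataManagementDefinitionGroups/"
           f"Microsoft.Dynamics.DataEntities.{action}")
    req = urllib.request.Request(
        url, data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def import_package(base_url: str, token: str, package: bytes,
                   definition_group: str, legal_entity: str) -> str:
    # 1. Ask DMF for a temporary writable blob URL (the response "value"
    #    is a serialized JSON string holding BlobId/BlobUrl).
    blob = _post_action(base_url, token, "GetAzureWriteUrl",
                        {"uniqueFileName": "import.zip"})
    blob_url = json.loads(blob["value"])["BlobUrl"]
    # 2. Upload the ZIP package to that blob.
    put = urllib.request.Request(
        blob_url, data=package, method="PUT",
        headers={"x-ms-blob-type": "BlockBlob"})
    urllib.request.urlopen(put)
    # 3. Trigger the import; an execution id comes back.
    exec_id = _post_action(base_url, token, "ImportFromPackage",
                           {"packageUrl": blob_url,
                            "definitionGroupId": definition_group,
                            "executionId": "", "execute": True,
                            "overwrite": True,
                            "legalEntityId": legal_entity})["value"]
    # 4. Poll until the job leaves the in-progress states.
    while True:
        status = _post_action(base_url, token, "GetExecutionSummaryStatus",
                              {"executionId": exec_id})["value"]
        if status not in ("NotRun", "Executing"):
            return status  # e.g. Succeeded, PartiallySucceeded, Failed
        time.sleep(10)
```

If the final status is not Succeeded, you still have to call further DMF actions (for example, the execution errors endpoints) to find out what actually went wrong; there is no single consolidated error output.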
Another thing to mention is running import jobs in parallel: if you call two APIs in parallel that import into the same data entity, you get errors, even though it is possible to execute such an action using async DMF API methods, async processes in Logic Apps, and so on. Also, the table you export should not have any locks on it, or you run into some nasty problems.


✅ Ideal for integrations with legacy systems, ERP migrations, and bulk syncs.

4️⃣ Recurring Integrations

Recurring Integrations is a D365FO data integration platform based on DMF and data entities, providing automated data exchange with third-party data providers. Once set up, the Recurring Integrations REST APIs can be used by third-party integrators to import data into and export data from D365FO.
The Recurring Integrations feature can be enabled with a single button click in a DMF project and later managed and monitored from the D365FO recurring integration admin forms. The REST APIs are then used to push and pull the data handled by the recurring data job, in an ordered and scheduled way. These APIs are fixed, and you cannot extend them with custom functionality.
  • How: Upload/download files from Azure Blob via a recurring data job.
  • Automation: Managed using batch jobs or Azure logic apps.
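As a sketch, the push side of this API looks roughly like the following. The /api/connector/enqueue and /api/connector/dequeue URL shapes are the documented recurring integrations endpoints; the activity ID is the one shown on the recurring data job form in D365FO, and token acquisition is assumed to happen elsewhere.

```python
import urllib.request

def enqueue_url(base_url: str, activity_id: str, entity: str) -> str:
    # activity_id identifies the recurring data job; entity names the
    # target data entity for plain-file (non-package) imports.
    return f"{base_url}/api/connector/enqueue/{activity_id}?entity={entity}"

def dequeue_url(base_url: str, activity_id: str) -> str:
    # Export side: dequeue returns a message with a download link for the
    # generated file, which you then fetch and acknowledge.
    return f"{base_url}/api/connector/dequeue/{activity_id}"

def push_file(base_url: str, token: str, activity_id: str,
              entity: str, payload: bytes) -> int:
    """Push one CSV file into a recurring import job; returns HTTP status."""
    req = urllib.request.Request(
        enqueue_url(base_url, activity_id, entity), data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "text/csv"})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The recurring data job itself then picks the queued message up on its own schedule; the HTTP call only enqueues the file, it does not process it synchronously.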



✅ Best for nightly/periodic syncs between systems.


5️⃣ Business Events

The D365FO business events feature allows you to send notifications of FO business events to Azure event handlers and trigger-based workflow providers. It comes with many out-of-the-box FO business events, plus the ability to create new ones with X++ programming. You can send notifications from these events to Azure endpoints like Service Bus, Event Hubs, Event Grid and Logic Apps (or Power Automate), and also to custom HTTPS endpoints using standard HTTP methods. It is also possible to subscribe to a business event directly from Logic Apps or Power Automate using the “When a business event occurs” trigger, as shown below:


It is also possible to attach small amounts of data to the notifications using the message payload. However, I would advise you to be really careful with that: attaching large amounts of data to your payload will not only create all sorts of problems, it will also ruin the lightweight operation expected from a notification service.
If you need more information, I have a complete blog post about business events, which also describes how to create a new event using X++:

  • Examples: PO confirmed, sales invoice posted.
  • Destination: Azure Service Bus, Event Grid, or custom HTTPS endpoint.
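On the receiving side, a custom HTTPS endpoint simply gets a JSON document per notification. A minimal routing sketch, assuming the standard envelope fields (BusinessEventId, EventId, EventTime) and using the sales invoice posted event with an illustrative InvoiceId payload field (the exact fields depend on the concrete event contract):

```python
import json

def handle_business_event(raw: bytes) -> str:
    """Route an incoming business event notification by its BusinessEventId.

    BusinessEventId, EventId and EventTime are part of the common event
    envelope; InvoiceId here is illustrative, taken from the event-specific
    part of the contract."""
    event = json.loads(raw)
    event_type = event["BusinessEventId"]
    if event_type == "SalesInvoicePostedBusinessEvent":
        # e.g. notify the WMS that an invoice was posted
        return (f"invoice {event.get('InvoiceId', '?')} "
                f"posted at {event['EventTime']}")
    return f"unhandled event {event_type}"

# A hand-made sample notification (all field values invented for the demo):
sample = json.dumps({
    "BusinessEventId": "SalesInvoicePostedBusinessEvent",
    "EventId": "00000000-0000-0000-0000-000000000000",
    "EventTime": "/Date(1712000000000)/",
    "InvoiceId": "INV-00042",
}).encode()
```

Keeping the handler this thin is deliberate: the notification should only tell you *that* something happened, and any heavier data should be fetched afterwards through OData or DMF rather than stuffed into the payload.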

✅ Ideal for real-time triggers in distributed systems (e.g., notify WMS when invoice is posted).

More info: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/business-events/home-page

6️⃣ Entity Store

Many people are surprised when they hear that D365FO is not really SQL Server-based software and that its database logic and metadata are not directly accessible via SQL Server. D365FO (in other words, Axapta or AX) keeps its metadata, record ID mechanism, data type information and data relations in its own kernel and uses SQL Server only as bulk storage, which is therefore not directly readable by third-party applications and analytics software. Dinosaurs like me would explain the reason: the ancient Damgaard Axapta was designed to be a “multi-database” application and supported Oracle DB as well as SQL Server as a database option. So, at that time, it was a decision to keep this metadata in the kernel instead, making it easier to switch databases if needed.
So, to be able to publish normalized, third-party-readable SQL Server data from D365FO, we need to use “data entities” and “aggregate measurements” for data normalization and export their content to another SQL Server database using the “Entity store” and BYOD functionalities. The Entity store option publishes aggregate measurements (star schemas of data entities; cubes, sort of) of D365FO into a separate SQL Server database (AXDW), which can later be read by Azure analytical software (Power BI) to run analytical reports.
Entity store data can be updated manually or using batch jobs. If necessary, there is also an option to develop custom aggregate measurements using D365FO development tools. As you can guess, this integration is read-only and, in fact, currently only available to Microsoft Power BI. The reason is that in D365FO production environments these AXDW databases are not accessible to end users, and you cannot get authorization for any other third-party app that needs to use the same database for its own purposes. BYOD, on the other hand, which we will mention in the next topic, makes this possible by exposing data to a user database instead.
  • Use case: Power BI reports on operational data.

  • Tools: D365FO pushes data into the Entity store (an Azure SQL database), which Power BI can query directly.

✅ Great for analytical/reporting integrations.