
Tuesday, April 15, 2025

How to integrate with Dynamics 365 Finance and Operations, Part 2

3️⃣ DMF (Data Management Framework)

DMF allows you to import and export large volumes of data in D365FO using various source/target data files and services. This is the recommended data integration method in D365FO when you need to transfer large amounts of data from or to other systems using the supported file formats and services. It also supports incremental data load/export via change tracking, and a data staging feature for post-import validation and processing.



Import and export projects are processed using "data entities", normalized views of D365FO data tables, which can also be added programmatically if necessary. There is a dedicated Data management workspace where you can set up data import and export projects, import actual data, browse the available data entities, and check integration mappings in graphical schemas. Useful as it is, I cannot call the DMF UI user-friendly. Parts of the "advanced" and "standard" views confuse users, even developers, and the field mapping and Unicode setup screens are equally confusing and do not always work as expected.

Every old file format that Dynamics AX supported before is still supported, and some new formats and SQL Server DB connection options have been added, but there is still no support for modern data transfer formats like JSON, Parquet, ARFF, Avro, etc. Also, for packages you can only use ZIP compression; there is no support for GZIP, which is widely used for data transfer today. For the data inside a package, CSV files are the recommended option. As you can see in the analysis below from luminousmen.com, CSV is a rather outdated file format and falls far behind modern data transfer formats in every aspect.

https://luminousmen.com/post/big-data-file-formats

DMF exposes REST APIs and Azure actions to import and export data from outside D365FO. You can find the Azure actions used in Logic Apps under the "DataManagementDefinitionGroups" section of "Execute action":



You can find example code (and Logic Apps) for data import and export using DMF on the Dynamics-AX-Integration GitHub page. However, if you expect a straightforward way of using these API commands to export or import your data, you will be disappointed. For example, to import a CSV file you need to create a ZIP package containing the file you want to import plus fixed Manifest and Header XML files, place it in a temporary container DMF provides for you, import it using another action, poll in a loop until the import completes, and, if it fails, dig through various DMF actions to find the actual error so you can inform the user about the problem that occurred.
There is currently no other direct file import or export method available; you need to zip the file together with those fixed Manifest and Header files (which you also need to store in a fixed location) and set up workflows like the ones in the example code to import them. It would be very nice to have one straightforward option in the future that takes the data file as input and returns consolidated error and log messages as output.
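
To make that flow concrete, here is a minimal Python sketch of a package import through the DMF package REST API, following the flow used in the Dynamics-AX-Integration samples. Everything named below (environment URL, project name, file names, legal entity) is a placeholder, and the Manifest/PackageHeader XML files are the fixed ones mentioned above, taken from a package exported by the same DMF project:

import io, json, time, zipfile, requests

BASE = "https://myenv.cloudax.dynamics.com"           # placeholder environment URL
headers = {"Authorization": "Bearer <access_token>"}  # AAD token, see Part 1

# The DMF package API actions live under the DataManagementDefinitionGroups entity
ACTIONS = BASE + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"

# 1. Build the package: the CSV plus the fixed Manifest/PackageHeader XML files
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    for name in ("customers.csv", "Manifest.xml", "PackageHeader.xml"):
        z.write(name)                        # files expected in the working folder

# 2. Ask DMF for a temporary writable blob URL and upload the package there
resp = requests.post(ACTIONS + ".GetAzureWriteUrl",
                     json={"uniqueFileName": "customers.zip"}, headers=headers)
blob_url = json.loads(resp.json()["value"])["BlobUrl"]   # value is serialized JSON
requests.put(blob_url, data=buf.getvalue(),
             headers={"x-ms-blob-type": "BlockBlob"})

# 3. Start the import into an existing DMF import project and poll the status
exec_id = requests.post(ACTIONS + ".ImportFromPackage", headers=headers, json={
    "packageUrl": blob_url,
    "definitionGroupId": "MyImportProject",   # placeholder DMF project name
    "executionId": "", "execute": True, "overwrite": True,
    "legalEntityId": "USMF",
}).json()["value"]

status = "Executing"
while status in ("NotRun", "Executing"):
    time.sleep(5)
    status = requests.post(ACTIONS + ".GetExecutionSummaryStatus",
                           json={"executionId": exec_id}, headers=headers).json()["value"]
print("Import finished with status:", status)   # e.g. Succeeded / Failed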
Another thing to mention is running import jobs in parallel: if you call two APIs in parallel that import into the same data entity, you will get errors, although such scenarios are possible using the async DMF API methods or async processes in Logic Apps. Also, the table you are exporting should not have any locks on it, or you will run into some nasty problems.


✅ Ideal for integrations with legacy systems, ERP migrations, and bulk syncs.

4️⃣ Recurring Integrations

Recurring Integrations is a D365FO data integration platform built on DMF and data entities, providing automated data exchange with third-party data providers. Once set up, the Recurring Integrations REST APIs can be used by third-party integrators to import data into and export data from D365FO.
The feature is enabled with a single button click in a DMF project and can later be managed and monitored from the D365FO Recurring Integrations admin forms. The REST APIs are then used to push and pull the data generated by the recurring job, picked up in an ordered, scheduled way. These APIs are fixed, and you cannot extend them with custom functionality; a minimal enqueue sketch follows the list below.
  • How: Upload/download files from Azure Blob via a recurring data job.
  • Automation: Managed using batch jobs or Azure Logic Apps.
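
Pushing a file into a recurring data job then boils down to a single authenticated POST against the connector API. A minimal Python sketch, with the environment URL, activity ID, and entity name as placeholders (the activity ID is shown in D365FO when you enable recurring integrations on the DMF project):

import requests

BASE = "https://myenv.cloudax.dynamics.com"           # placeholder environment URL
headers = {"Authorization": "Bearer <access_token>"}  # AAD token, see Part 1

activity_id = "<recurring-job-activity-id>"           # from the DMF project

# Enqueue a file for import; the entity name must match the DMF project setup
with open("customers.csv", "rb") as f:
    r = requests.post(f"{BASE}/api/connector/enqueue/{activity_id}",
                      params={"entity": "Customers"},
                      data=f, headers=headers)
print(r.status_code, r.text)   # a message ID you can track in the RI admin forms

# Exports go the other way, by polling the dequeue endpoint:
#   GET {BASE}/api/connector/dequeue/<activity-id>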



✅ Best for nightly/periodic syncs between systems.


5️⃣ Business Events

The D365FO business events feature allows you to send notifications of FO business events to Azure event handlers and trigger-based workflow providers. It ships with many out-of-the-box FO business events and also lets you create new ones in X++. You can send notifications from these events to Azure endpoints such as Service Bus, Event Hubs, Event Grid, and Logic Apps (or Power Automate), as well as to custom HTTPS endpoints using standard HTTP methods. It is also possible to subscribe to a business event directly from Logic Apps or Power Automate using the "When a business event occurs" trigger, as shown below:


It is also possible to attach small amounts of data to the notifications using the message payload; however, I would advise you to be really careful with that, since attaching large amounts of data to your payload will not only cause all sorts of problems, it will also ruin the lightweight operation expected of a notification service. A minimal receiver sketch follows the list below.
If you need more information, I have a complete blog post about business events that also describes how to create a new event using X++:

  • Examples: PO confirmed, sales invoice posted.
  • Destination: Azure Service Bus, Event Grid, or custom HTTPS endpoint.
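
If you target a custom HTTPS endpoint, the receiver only needs to accept a JSON POST. Below is a minimal, illustrative Python receiver; the envelope field names (BusinessEventId, EventTime) follow the standard business event contract, but verify them against a real payload from your own environment:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class BusinessEventReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        # The business event arrives as a JSON document in the request body
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))

        # Standard envelope fields; event-specific fields sit alongside them
        print(event.get("BusinessEventId"), event.get("EventTime"))

        # Acknowledge quickly and do the real work asynchronously
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), BusinessEventReceiver).serve_forever()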

✅ Ideal for real-time triggers in distributed systems (e.g., notify WMS when invoice is posted).

More info: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/business-events/home-page

6️⃣ Entity Store

Many people are surprised when they hear that D365FO is not really SQL Server-based software and that its database logic and metadata are not directly accessible via SQL Server. D365FO, in other words Axapta or AX, stores its metadata, record ID mechanism, data type information, and data relations in its own kernel and uses SQL Server only as bulk storage, which therefore is not directly readable by third-party applications and analytics software. Dinosaurs like me would explain the reason: the ancient Damgaard Axapta was designed to be a "multi-database" application, supporting Oracle DB as well as SQL Server as the database option. So, at that time, the decision was to keep this metadata in the kernel and make it easier to switch databases if needed.
So, to publish normalized, third-party-readable SQL Server data from D365FO, we need to use "data entities" and "aggregate measurements" for data normalization and export their content to another SQL Server database using the Entity store and BYOD functionalities. The Entity store option publishes aggregate measurements (star schemas of data entities; cubes, of a sort) of D365FO into a separate SQL Server database (AXDW), which can later be read by Azure analytical software (Power BI) to run analytical reports.
Entity store data can be refreshed manually or by batch jobs. If necessary, there is also an option to develop custom aggregate measurements using the D365FO development tools. As you can guess, this integration is read-only and, in fact, currently only available to Microsoft Power BI. The reason is that in D365FO production environments these AXDW databases are not accessible to end users, and you cannot get authorization for any other third-party app to use the same database for your own purposes. BYOD, on the other hand, which we will cover in the next topic, makes this possible by exposing the data to a user database instead.
  • Use case: Power BI reports on operational data.

  • Tools: D365 FO pushes data into Azure SQL (Entity Store), which can be queried directly.

✅ Great for analytical/reporting integrations.

Monday, April 14, 2025

How to integrate with Dynamics 365 Finance and Operations, Part 1

Integrating Microsoft Dynamics 365 Finance and Operations (D365 FO) with external systems, applications, and services is essential for building a connected and efficient enterprise ecosystem. This post explores the major integration methods commonly used in real-world projects.

1️⃣ OData (Open Data Protocol)

OData is an open protocol for serving and consuming interoperable data using common query operations over RESTful APIs. D365FO exposes all of its public data entities as OData endpoints, which can be accessed using the following URI format:

https://<env>.cloudax.dynamics.com/data/<EntityName>

OData provides a quick, codeless data integration method with many data query options and CRUD operations. You can use its open-standard query string language to query your data and perform data manipulations using standard OData CRUD commands, all with simple, open-standard REST calls. If you would like to call a custom method on your data entity, that is also supported by exposing custom actions on your OData entities, so the commands are extensible to a point.
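
As a quick sketch of both directions, here is how a query and a create look in Python. The environment URL is a placeholder, the token comes from the authentication flow described below, and the CustomersV3 fields are illustrative, so check them against the entity's actual schema:

import requests

BASE = "https://myenv.cloudax.dynamics.com"           # placeholder environment URL
headers = {"Authorization": "Bearer <access_token>",
           "Content-Type": "application/json"}

# Read: standard OData query options work on any public data entity
r = requests.get(f"{BASE}/data/CustomersV3",
                 params={"$select": "CustomerAccount,OrganizationName",
                         "$filter": "dataAreaId eq 'usmf'",
                         "$top": "10"},
                 headers=headers)
r.raise_for_status()
for customer in r.json()["value"]:
    print(customer)

# Create: POST a JSON payload containing the entity's mandatory fields
payload = {"dataAreaId": "usmf", "CustomerAccount": "US-100",
           "OrganizationName": "Contoso", "CustomerGroupId": "10"}
r = requests.post(f"{BASE}/data/CustomersV3", json=payload, headers=headers)
r.raise_for_status()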

You can definitely use OData for your own integration projects, but there is also plenty of OData-ready software available today that can connect directly to D365FO OData endpoints. Microsoft Power BI also supports OData connections, and you can hook Power BI up via OData if you do not want to bother setting up the faster data integration options, like the Entity store and Data Lake integrations.

Although it looks like the optimal way to integrate data with D365FO, there are some drawbacks involved. OData queries and data operations execute slowly, and reading a large entity may take ages. OData is mainly designed for simple CRUD operations and simpler queries; if you need complex joins and lookups, for example, you may start to hit its limits. Although you can extend the available OData commands with your own custom actions, complex operations and business logic unfortunately do not go very well with it, and you may need to place such logic in the consumer application if you decide to integrate via OData.

There is also a relatively new D365FO feature that throttles calls to OData endpoints based on assigned priorities, to avoid the system lockdowns that frequent OData calls might cause. You can read more about it at the link below:

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/priority-based-throttling

Remember, you can also use OData endpoints in Azure API Management, just like custom services.



✅ Best for lightweight, real-time CRUD operations.

🔑 Authentication

Like all external D365 FO integrations:

✅ Register an Azure AD app.

✅ Use OAuth 2.0 to get a bearer token.

✅ Send the token in the Authorization header: Authorization: Bearer <access_token>
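
In practice, getting the token is one POST to the AAD token endpoint using the client credentials grant. A minimal Python sketch with placeholder tenant, app ID, and secret:

import requests

TENANT = "contoso.onmicrosoft.com"              # placeholder AAD tenant
ENV    = "https://myenv.cloudax.dynamics.com"   # placeholder D365FO environment URL

r = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/token",
    data={"grant_type": "client_credentials",
          "client_id": "<app_id>",              # the registered Azure AD app
          "client_secret": "<secret>",
          "resource": ENV})                     # the FO environment is the audience
r.raise_for_status()
token = r.json()["access_token"]                # goes into the Authorization header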

2️⃣ Custom Services (X++)

This is by far the most flexible and customizable way to integrate with D365FO, and the one I use most in my integration projects. Custom services are written in standard D365FO X++ code and can be used for both data-based and operation-based (posting stuff, etc.) integrations, the only limit being your imagination.
The AIF services used in Dynamics AX have been upgraded to custom services in FO, and they are automatically published as SOAP and JSON REST endpoints when deployed. However, the management and monitoring tools of the old WCF-based AIF services are gone, leaving just bare, non-configurable service endpoints.
When you deploy a custom service, the following endpoints are created and you can call them remotely using standard AAD OAuth authorization and SOAP/REST HTTP calls:

REST (JSON): https://<baseUrl>/api/services/<ServiceGroup>/<ServiceName>/ 
SOAP : https://<baseUrl>/soap/services/<ServiceGroup>/<ServiceName> 

Data integration in custom services is done via data contracts, which are serialized to XML or JSON depending on the endpoint. Performance is quite good compared to the other data integration options, since everything is handled with low-level .NET commands. In theory you can transfer large amounts of data this way; however, I do not recommend it, as you may hit unforeseen limitations and timeout problems along the way. It is still possible to program some kind of paging mechanism into your service class and send larger data in small chunks to work around those limits.
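
Calling such a service from outside is a plain authenticated POST. In the Python sketch below, the service group, service, operation, and contract fields are all hypothetical; the JSON property names must match the parameter names of your X++ operation:

import requests

BASE = "https://myenv.cloudax.dynamics.com"           # placeholder environment URL
headers = {"Authorization": "Bearer <access_token>"}  # AAD token, see below

# "_request" must match the X++ parameter name; its body is deserialized
# into the service's data contract class
r = requests.post(f"{BASE}/api/services/MyServiceGroup/MyService/postSalesOrder",
                  json={"_request": {"CustAccount": "US-001",
                                     "ItemId": "A0001",
                                     "Qty": 5}},
                  headers=headers)
r.raise_for_status()
print(r.json())   # the operation's return contract, serialized to JSON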

✅ Use when OData can't support complex validations or custom workflows.

🔑 Authentication

Like all external D365 FO integrations:

✅ Register an Azure AD app.

✅ Use OAuth 2.0 to get a bearer token.

✅ Send the token in the Authorization header: Authorization: Bearer <access_token>

Wednesday, April 2, 2025

Multi-select lookup parameter using a UI Builder class in Dynamics 365 Finance and Operations using X++

Today we will see how to add a multi-select lookup to a report dialog. The same approach also works with the SysOperation framework. The selected records are then fetched in the DP class or service class. For this, I wrote the logic in the classes below.

Contract Class :

[
    DataContractAttribute,
    SysOperationContractProcessingAttribute(classStr(UIBuilderClass))
]
class ContractClass
{
    // Holds the vendor accounts picked in the multi-select lookup
    List vendAccountList;

    [
        DataMemberAttribute("Vendor"),
        AifCollectionTypeAttribute("Vendor", Types::String),
        SysOperationLabelAttribute(literalStr("Vend account")),
        SysOperationHelpTextAttribute(literalStr("Vend account.")),
        SysOperationDisplayOrderAttribute('1')
    ]
    public List parmVendorList(List _vendAccountList = vendAccountList)
    {
        vendAccountList = _vendAccountList;
        return vendAccountList;
    }
}
_________________________________________________________________________________

UI Builder Class:

class UIBuilderClass extends SrsReportDataContractUIBuilder
{
    ContractClass   myContractClass;
    DialogField     dialogField;
    container       con;

    public void build()
    {
        myContractClass = this.dataContractObject() as ContractClass;
        dialogField     = this.addDialogField(methodStr(ContractClass, parmVendorList), myContractClass);
    }

    public void postBuild() // postRun() can be used as well
    {
        super();

        myContractClass = this.dataContractObject() as ContractClass;
        dialogField     = this.bindInfo().getDialogField(myContractClass,
                                           methodStr(ContractClass, parmVendorList));

        // Replace the standard lookup with our multi-select lookup
        dialogField.registerOverrideMethod(methodStr(FormStringControl, lookup),
                                           methodStr(UIBuilderClass, vendorLookup), this);
        // A modified() override can be registered the same way if needed:
        // dialogField.registerOverrideMethod(methodStr(FormStringControl, modified),
        //                                    methodStr(UIBuilderClass, vendorModified), this);
    }

    public void vendorLookup(FormStringControl _control)
    {
        Query                   query = new Query();
        QueryBuildDataSource    vendQBD;

        vendQBD = query.addDataSource(tableNum(VendTable));
        vendQBD.addSelectionField(fieldNum(VendTable, AccountNum));

        // Standard multi-select grid lookup; con keeps the selected records
        SysLookupMultiSelectGrid::lookup(query, _control, _control, _control, con);
    }
}
__________________________________________________________________________________

DP Class:

[
    SRSReportQueryAttribute(queryStr(MyQuery)),
    SRSReportParameterAttribute(classStr(ContractClass))
]
class DPClass extends SRSReportDataProviderBase // or SrsReportDataProviderPreProcess
{
    MyTable     myTable;
    List        vendorList;
    VendTable   vendTable;

    [SRSReportDataSetAttribute(tableStr(MyTable))]
    public MyTable getTempMyTable()
    {
        select myTable;
        return myTable;
    }

    private Query buildQuery(Query _query, List _vendorList)
    {
        ListIterator listIterator = new ListIterator(_vendorList);

        // Add one range per selected vendor account; multiple ranges on the
        // same field act as OR conditions
        while (listIterator.more())
        {
            _query.dataSourceTable(tableNum(VendTable))
                  .addRange(fieldNum(VendTable, AccountNum))
                  .value(queryValue(listIterator.value()));
            listIterator.next();
        }

        return _query;
    }

    public void processReport()
    {
        QueryRun        queryRun;
        ContractClass   contract = this.parmDataContract() as ContractClass;
        Query           query    = this.parmQuery();

        vendorList = contract.parmVendorList();

        // If nothing was selected, run the unfiltered query
        queryRun = new QueryRun(vendorList.empty() ? query : this.buildQuery(query, vendorList));

        while (queryRun.next())
        {
            vendTable = queryRun.get(tableNum(VendTable));
            this.insertIntoTempTable();
        }
    }

    private void insertIntoTempTable()
    {
        myTable.AccountNum = vendTable.AccountNum;
        myTable.VendGroup  = vendTable.VendGroup;
        myTable.insert();
    }
}