
Tuesday, October 28, 2025

How to integrate with Dynamics 365 Finance and Operations Part 5

14) Azure Synapse Link for Dynamics 365 Finance & Operations

Azure Synapse Link for Dataverse is Microsoft’s next-generation data export and analytics integration tool, replacing the legacy Export to Data Lake feature.

It allows you to export data from Dynamics 365 Finance & Operations (F&O) to Azure Data Lake Storage Gen2 and/or Azure Synapse Analytics — providing low-cost, high-performance storage for analytical workloads.


The setup is straightforward: from the Azure Synapse Link tab in the Power Apps maker portal, you simply select the F&O entities and tables to be linked for export.




⚙️ How Synapse Link Works

The Synapse Link data export process operates similarly to the old Export to Data Lake feature — but with significant improvements.

You now benefit from:

  • Efficient Parquet and Delta file formats for faster data processing

  • Continuous replication, ensuring near real-time synchronization

  • Seamless collaboration with other Microsoft analytical tools inside Synapse Analytics

Once data is replicated from F&O, you can perform advanced analytics and machine learning projects within Synapse Analytics — combining data from multiple sources.
These results can then be consumed in Power BI dashboards, or integrated further using Synapse pipelines and Fabric experiences.

Custom Tables and Limitations

Not all F&O tables are available in the Synapse Link configuration by default.
If you don’t see your custom tables, enable the “Row version change tracking” property on your table — this usually makes it appear in the available list.

The data export process uses modern Parquet and Delta formats, ensuring high-speed performance and efficient storage in the data lake.
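
Once the export is running, each table lands in the lake as a folder of Parquet/Delta files that you can query directly from a Synapse serverless SQL pool. Below is a minimal sketch; the storage account, container, and table folder names are assumptions — replace them with the paths your own Synapse Link profile creates.

-- Minimal sketch: query the Delta output of Synapse Link from a Synapse serverless SQL pool.
-- The storage account, container, and folder names below are assumptions.
SELECT TOP 100
    accountnum,
    custgroup,
    dataareaid
FROM OPENROWSET(
        BULK 'https://youraccount.dfs.core.windows.net/dataverse-yourenv/deltalake/custtable/',
        FORMAT = 'DELTA'
    ) AS custtable
WHERE dataareaid = 'usmf';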

✅ Advantages

  • High-performance, efficient storage for analytics using Delta Lake

  • Continuous data synchronization with Parquet/Delta formats

  • Common Data Model (CDM) support

  • Full access to Synapse Analytics tools for data exploration and transformation

❌ Disadvantages

  • One-way integration — designed for analytical projects, not transactional data sync


15) Link to Microsoft Fabric (Preview)

Microsoft Fabric is the newest evolution in Microsoft’s analytics and integration ecosystem — a next-generation platform that builds upon Synapse Link for Dataverse.

It enhances data analytics, business intelligence, and integration capabilities by linking Dynamics 365 Finance & Operations (F&O) data directly to Fabric.

Unlike Synapse Link, the Link to Microsoft Fabric option doesn’t require copying data into an external data lake. Instead, it creates shortcuts to the data source and caches the data in efficient, high-performance formats only when needed.

Once connected, you can leverage the latest Fabric capabilities — including Fabric Data Factory, Power BI DirectLake, and OneLake — to create powerful, unified data experiences.

⚙️ How It Works

The linking process is currently done via the same Azure Synapse Link tab in the Power Apps maker portal, but instead of selecting a Synapse workspace, you choose the Microsoft OneLake option.

By default, it automatically links all Dataverse tables, including those synced from F&O.

Rather than creating additional Azure Data Lake storage, Fabric reuses Dataverse storage and OneLake caching for analytics and integrations.

You can even connect existing Azure Synapse Links with Fabric to access its new tools, such as:

  • Fabric Data Pipelines

  • Power BI DirectLake mode

  • Real-time data analytics within the unified Fabric workspace
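
Once linked, the F&O tables surface in a Fabric lakehouse and can be queried through its SQL analytics endpoint with plain T-SQL (or consumed by Power BI in DirectLake mode). A minimal sketch, assuming the link created a lakehouse named dataverse_yourenv containing a custtable table synced from F&O:

-- Minimal sketch: T-SQL against the SQL analytics endpoint of the linked Fabric lakehouse.
-- The lakehouse and table names are assumptions — use the names your link actually creates.
SELECT TOP 100
    accountnum,
    custgroup,
    dataareaid
FROM dataverse_yourenv.dbo.custtable
WHERE dataareaid = 'usmf';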

Pricing

Microsoft Fabric operates on a capacity-based pricing model, available either as pay-as-you-go or reserved capacity.

  • The lowest available SKU is F2, but it offers limited performance.

  • For most small to medium ERP analytics or integration workloads, F16 or F32 are the recommended tiers — though they come with higher costs.

At the time of writing, Link to Fabric remains in preview, and Microsoft continues to evolve its capabilities.

✅ Advantages

  • Future-proof platform for BI, analytics, and data integration

  • Real-time data access and processing

  • ☁️ No additional data storage required

  • Use F&O data seamlessly with Fabric tools such as Data Factory pipelines and Power BI DirectLake

❌ Disadvantages

  • Can be overkill for small to medium integration scenarios


16) Dynamics 365 Finance and Operations MCP Server

MCP (Model Context Protocol) servers are standardized interfaces for AI platforms, enabling AI agents such as Microsoft Copilot to interact with external business systems.

The Finance & Operations MCP Server allows Copilot — and other MCP-compatible AI agents — to access a limited set of F&O business logic tools directly from ERP data and processes.

Currently, the server provides access to 13 built-in tools, but they are not extensible and do not allow direct F&O data access yet.

⚙️ Setup and Usage

Adding the F&O MCP Server to Copilot is simple:
Just select “Dynamics 365 ERP MCP” from the list of available tools when configuring your agent.

At the moment, Microsoft hasn’t released official documentation for connecting it to third-party AI platforms — though that’s expected in future updates.

If you need an F&O MCP server with more tools and broader capabilities, you will either have to wait for Microsoft to extend this one or use one of the third-party F&O MCP servers available online.

✅ Advantages

  • Enables Copilot integration with F&O business logic

  • Easy configuration inside Microsoft Copilot Studio

❌ Disadvantages

  • No direct access to F&O data (yet)

  • ⚙️ Limited number of tools currently available

  • Tools and functionality are not extensible

Friday, June 27, 2025

How to Split Financial Dimensions into Columns Using SQL in D365 F&O

In Microsoft Dynamics 365 Finance and Operations, financial dimensions are stored as surrogate-key references in fields like LedgerDimension and DefaultDimension rather than as readable values. While this is great for system efficiency, it’s not very human-readable—especially when you're running queries or building reports directly from the database.

In this article, I'll show you how to split LedgerDimension and DefaultDimension into individual columns using SQL, so that each dimension (like CostCenter, ItemGroup, etc.) appears in its own field. This is very useful when building reports, integrating with Power BI, or analyzing data directly from SQL Server.

1. Split Ledger Dimension into Individual Columns

Use this query when you want to extract Main Account + Dimensions (i.e., full ledger dimension combination used in transactions):

SELECT
    Dimension,
    MainAccount,
    Claims,
    Functions,
    ItemGroup,
    CostCenter,
    Market_IC,
    MTO,
    Projects,
    Vendor,
    Agreement
FROM
(
    SELECT
        DAVC.RecId AS Dimension,
        DA.Name,
        DALVAV.DisplayValue
    FROM
        DimensionAttributeValueCombination AS DAVC
        JOIN DimensionAttributeLevelValueAllView AS DALVAV
            ON DALVAV.VALUECOMBINATIONRECID = DAVC.RecId
        JOIN DimensionAttribute AS DA
            ON DA.RecId = DALVAV.DIMENSIONATTRIBUTE
    WHERE
        DAVC.MainAccount != 0
) AS SourceData
PIVOT
(
    MAX(DisplayValue)
    FOR Name IN (
        MainAccount, Claims, Functions, ItemGroup, CostCenter,
        Market_IC, MTO, Projects, Vendor, Agreement
    )
) AS PivotTable;

Note: Replace the dimension names in the IN (...) clause with the actual dimension names configured in your environment.
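
Once the pivot works, it is handy to wrap it in a view (or a CTE) and join it to transactions on the LedgerDimension field. A minimal sketch, assuming the query above has been saved as a view named LedgerDimensionSplit (a hypothetical name):

-- Minimal sketch: join the pivoted ledger dimensions to general ledger transactions.
-- LedgerDimensionSplit is an assumed view wrapping the PIVOT query shown above.
SELECT
    GJE.JournalNumber,
    GJAE.LedgerDimension,
    LDS.MainAccount,
    LDS.CostCenter,
    GJAE.AccountingCurrencyAmount
FROM GeneralJournalAccountEntry AS GJAE
    JOIN GeneralJournalEntry AS GJE
        ON GJE.RecId = GJAE.GeneralJournalEntry
    JOIN LedgerDimensionSplit AS LDS
        ON LDS.Dimension = GJAE.LedgerDimension;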


2. Split Default Dimension into Individual Columns

Use this query when you're dealing with DefaultDimension (used in master records like customers, vendors, items, etc.).

SELECT
    DefaultDimension,
    Claims,
    Functions,
    ItemGroup,
    CostCenter,
    Market_IC,
    MTO,
    Projects,
    Vendor,
    Agreement
FROM
(
    SELECT
        -- The DefaultDimension field on master records stores the RecId of DimensionAttributeValueSet
        DAVS.RecId AS DefaultDimension,
        DA.Name,
        DAVSI.DisplayValue
    FROM
        DimensionAttributeValueSet AS DAVS
        JOIN DimensionAttributeValueSetItem AS DAVSI
            ON DAVSI.DIMENSIONATTRIBUTEVALUESET = DAVS.RecId
        JOIN DimensionAttributeValue AS DAV
            ON DAV.RecId = DAVSI.DIMENSIONATTRIBUTEVALUE
        JOIN DimensionAttribute AS DA
            ON DA.RecId = DAV.DIMENSIONATTRIBUTE
) AS SourceData
PIVOT
(
    MAX(DisplayValue)
    FOR Name IN (
        Claims, Functions, ItemGroup, CostCenter,
        Market_IC, MTO, Projects, Vendor, Agreement
    )
) AS PivotTable;
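
As with the ledger query, the result can be joined to master data on the DefaultDimension field. A minimal sketch, assuming the query above has been saved as a view named DefaultDimensionSplit (again a hypothetical name):

-- Minimal sketch: show each customer's default financial dimensions in separate columns.
-- DefaultDimensionSplit is an assumed view wrapping the PIVOT query shown above.
SELECT
    CT.AccountNum,
    CT.DataAreaId,
    DDS.CostCenter,
    DDS.ItemGroup
FROM CustTable AS CT
    LEFT JOIN DefaultDimensionSplit AS DDS
        ON DDS.DefaultDimension = CT.DefaultDimension;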
 

Use Cases

  • Exporting transaction data for reporting

  • Creating simplified views for Power BI

  • Debugging the ledger and default dimensions

  • Building a staging table for the data warehouse

✅ Tips

  • These queries are meant for read-only reporting purposes.

  • Use them in your Data Lake, BYOD, or on-prem reporting environments.

  • Ensure dimension names match your environment’s configuration.

Conclusion

Splitting LedgerDimension and DefaultDimension into readable columns using SQL is a powerful technique to simplify financial data analysis. This is especially helpful for technical consultants, BI developers, and finance teams working on reporting, integrations, or validations.

Let me know in the comments if you'd like a downloadable view or Power BI model version of this!

Happy coding!


Stay tuned for more D365FO technical posts

Bookmark: https://d365fohunt.blogspot.com

☁️ Execute D365 F&O SSRS Report with Parameters and Upload Report Output to Azure Blob Storage Using X++

In many real-world scenarios, businesses need to automatically generate SSRS reports from Dynamics 365 Finance & Operations (D365FO) and store them externally for auditing, sharing, or integration purposes. This blog walks you through how to execute an SSRS report with parameters, convert it into a PDF byte stream, and upload the output to Azure Blob Storage using X++ and the latest Azure SDK.

In this article, we'll walk through how to:

  • Execute a parameterised SSRS report using X++

  • Render the report as a PDF byte stream

  • Upload the output to Azure Blob Storage using the latest Azure SDK (Azure.Storage.Blobs)

✅ Why Use This?

Uploading reports to Azure Blob Storage allows you to:

  • Automate report delivery

  • Store large reports securely

  • Integrate with Power BI, Logic Apps, or third-party APIs

  • Replace manual downloads or email-based report distribution

✅ Steps to Achieve This

  1. Create a controller class to run the SSRS report with parameters.

  2. Create an Azure Storage Account in your Azure subscription.

  3. Create Blob containers and optional folders to organise your report files.

  4. Get the connection string from Azure to authenticate from X++.

  5. Render the SSRS report to a byte stream in PDF format.

  6. Upload the stream to Azure Blob Storage using Azure.Storage.Blobs.

Prerequisites

  1. Add references to these .NET assemblies:
    • Azure.Storage.Blobs
    • System.IO
  2. Ensure your storage account connection string and container are correctly configured.

How to Set Up Azure Storage for Report Uploads

Follow these steps in the Azure Portal to create the required storage setup:

Step 1: Log in

Visit https://portal.azure.com and sign in with your Azure credentials.

Step 2: Create Storage Account

  • Navigate to Storage accounts

  • Click + Create

  • Fill in details: Subscription, Resource Group, Name, Region

  • Choose Standard performance and Hot access tier (default)

Step 3: Create Blob Container

  • Once the storage account is created, go to it

  • Select Storage browser from the left menu

  • Click on Blob containers

  • Create a new container (e.g., reportupload)

    • Set access level to Private (recommended for security)

Step 4: (Optional) Create Folders

  • Open your container

  • Click + Add directory to create folders like ReportUpload

Step 5: Get the Connection String

  • Go to the Access keys section under Security + networking

  • Copy the Connection string of key1

    • Example format:

DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey;EndpointSuffix=core.windows.net

What's Changed?

The legacy WindowsAzure.Storage package is deprecated. The new library:

  • Uses Azure.Storage.Blobs.

  • Has updated method names and authentication mechanisms.

  • Is available via NuGet as Azure.Storage.Blobs.

X++ Code with Latest SDK

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using System.IO;
 
public class ExampleUploadReportToBlobStorage
{
    public static void main(Args _args)
    {
        ExampleUploadReportToBlobStorage uploadObj = new ExampleUploadReportToBlobStorage();
        BlobContainerClient containerClient = uploadObj.connectToAzureBlob();
        uploadObj.uploadFileToBlob(containerClient);
    }
 
    public BlobContainerClient connectToAzureBlob()
    {
        str connectionString =
            "DefaultEndpointsProtocol=https;AccountName=youraccountname;AccountKey=yourkey;EndpointSuffix=core.windows.net";
 
        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("reportupload");
 
        info(strFmt("Connected to Azure Blob Container: %1", containerClient.Name));
        return containerClient;
    }
 
    public void uploadFileToBlob(BlobContainerClient containerClient)
    {
        System.Byte[] reportBytes = ExampleUploadReportToBlobStorage::getSSRSBytes("ReportOutput");
 
        if (reportBytes)
        {
            str blobPath = "ReportUpload/UploadFile.pdf"; // folder + filename
            BlobClient blobClient = containerClient.GetBlobClient(blobPath);
 
            System.IO.MemoryStream stream = new System.IO.MemoryStream(reportBytes);
            blobClient.Upload(stream, true); // Overwrite if file exists
 
            info("File uploaded successfully to Azure Blob Storage.");
        }
    }
 
    public static System.Byte[] getSSRSBytes(str _fileName)
    {
        SrsReportRunController controller = new SrsReportRunController();
        SRSPrintDestinationSettings settings;
        System.Byte[] reportBytes;
        SRSProxy srsProxy;
        SRSReportRunService srsService = new SRSReportRunService();
        Map paramMap;
        SRSReportExecutionInfo execInfo = new SRSReportExecutionInfo();
 
        controller.parmReportName(ssrsReportStr(TransactionsReport, Report)); // Replace with your own report name and design
        controller.parmShowDialog(false);
        controller.parmLoadFromSysLastValue(false);
 
        // Set report parameters (ReportParametersDataContract and its parm methods are specific to your report's data contract)
        ReportParametersDataContract contract = controller.parmReportContract().parmRdpContract() as ReportParametersDataContract;
        contract.parmFromDate(today() - 5);
        contract.parmToDate(today());
 
        // Configure output
        settings = controller.parmReportContract().parmPrintSettings();
        settings.printMediumType(SRSPrintMediumType::File);
        settings.fileName(_fileName + ".pdf");
        settings.fileFormat(SRSReportFileFormat::PDF);
 
        controller.parmReportContract().parmReportServerConfig(SRSConfiguration::getDefaultServerConfiguration());
        controller.parmReportContract().parmReportExecutionInfo(execInfo);
 
        srsService.getReportDataContract(controller.parmReportContract().parmReportName());
        srsService.preRunReport(controller.parmReportContract());
 
        paramMap = srsService.createParamMapFromContract(controller.parmReportContract());
        Microsoft.Dynamics.AX.Framework.Reporting.Shared.ReportingService.ParameterValue[] paramArray =
            SrsReportRunUtil::getParameterValueArray(paramMap);
 
        srsProxy = SRSProxy::constructWithConfiguration(controller.parmReportContract().parmReportServerConfig());
 
        reportBytes = srsProxy.renderReportToByteArray(
            controller.parmReportContract().parmReportPath(),
            paramArray,
            settings.fileFormat(),
            settings.deviceinfo());
 
        return reportBytes;
    }
}

Security Tips

  • Avoid hardcoding the connection string in production. Use Key Vault or Azure App Configuration.

  • Always restrict access to the container using RBAC or SAS tokens instead of full account keys.

  • Set your Blob container access level to Private unless you explicitly need public access.

  • Use Shared Access Signatures (SAS) or Managed Identities in production scenarios.

Business Use Cases

This solution is ideal for:

  • Automatically exporting and archiving daily/weekly/monthly reports

  • Sharing output with external systems or teams via Azure Storage

  • Generating custom reports on demand and storing them securely in the cloud

✅ Summary

In this post, we’ve seen how to:

✔️ Execute an SSRS report with parameters using X++
✔️ Render it as a PDF and convert it into a byte stream
✔️ Upload the output to Azure Blob Storage securely using the latest Azure SDK

    This approach enables automated, scalable, and secure report storage in the cloud, perfect for compliance, integrations, and downstream analytics.

Wednesday, June 11, 2025

Converting a System.IO.MemoryStream object to an Image in Dynamics 365 Finance and Operations using X++

Introduction

In Dynamics 365 Finance and Operations (D365FO), it's common to deal with external systems that return image streams, such as a barcode service, a document API, or a file server. In such cases, you may receive the image as a binary stream (System.IO.Stream), and you'll need to convert it into a container to display it in the UI or store it in the database.

This blog post explains how to convert a MemoryStream into a displayable image in D365FO using X++.

Sample Scenario

Let’s say you're calling an external URL using System.Net.HttpWebRequest and want to load the image from the response into a form control.

✅ X++ Code to Convert MemoryStream to Image

System.Net.HttpWebRequest request = System.Net.WebRequest::Create("https://example.com/sample-image.png") as System.Net.HttpWebRequest;

System.Net.HttpWebResponse response;
System.IO.Stream stream;
System.IO.MemoryStream fileStream = new System.IO.MemoryStream();

request.Method = "GET";
request.ContentType = "image/png";

response = request.GetResponse() as System.Net.HttpWebResponse;

using (response)
{
    stream = response.GetResponseStream();

    // Convert the stream into a memory stream
    stream.CopyTo(fileStream);

    // Convert memory stream to container
    BinData binData = new BinData();
    container baseContainer = Binary::constructFromMemoryStream(fileStream).getContainer();
    Image image = new Image();

     // Set container data to BinData
    binData.setData(baseContainer);

    // Set image data
    image.setData(binData.getData());

     // Display in image control (e.g., FormImageControl1)
    FormImageControl1.image(image);
}


Explanation

  • HttpWebRequest is used to call the external image URL.

  • ✅ The response stream is copied into a MemoryStream.

  • ✅ We use Binary::constructFromMemoryStream() to convert the stream into a container.

  • ✅ This container is then used to create an Image object.

  • ✅ Finally, we bind it to the form’s image control like FormImageControl1.

Use Case Ideas

  • Show customer logos dynamically.

  • Load barcode images from external services.

  • Display product thumbnails received via APIs.

Conclusion

Working with external image streams in D365FO using X++ becomes seamless when you properly convert the stream into a container and use BinData + Image objects. This approach is ideal for real-time image rendering from REST APIs or streaming sources.