Friday, June 27, 2025

📊 How to Split Financial Dimensions into Columns Using SQL in D365 F&O

In Microsoft Dynamics 365 Finance and Operations, financial dimensions are stored in a compressed form using fields like LedgerDimension and DefaultDimension. While this is great for system efficiency, it's not very human-readable, especially when you're running queries or building reports directly from the database.

In this article, I'll show you how to split LedgerDimension and DefaultDimension into individual columns using SQL, so that each dimension (like CostCenter, ItemGroup, etc.) appears in its own field. This is very useful when building reports, integrating with Power BI, or analyzing data directly from SQL Server.

🧩 1. Split Ledger Dimension into Individual Columns

Use this query when you want to extract Main Account + Dimensions (i.e., the full ledger dimension combination used in transactions):

SELECT
    Dimension,
    MainAccount,
    Claims,
    Functions,
    ItemGroup,
    CostCenter,
    Market_IC,
    MTO,
    Projects,
    Vendor,
    Agreement
FROM
(
    SELECT
        DAVC.RecId AS Dimension,
        DA.Name,
        DALVAV.DisplayValue
    FROM
        DimensionAttributeValueCombination AS DAVC
        JOIN DimensionAttributeLevelValueAllView AS DALVAV
            ON DALVAV.VALUECOMBINATIONRECID = DAVC.RecId
        JOIN DimensionAttribute AS DA
            ON DA.RecId = DALVAV.DIMENSIONATTRIBUTE
    WHERE
        DAVC.MainAccount != 0
) AS SourceData
PIVOT
(
    MAX(DisplayValue)
    FOR Name IN (
        MainAccount, Claims, Functions, ItemGroup, CostCenter,
        Market_IC, MTO, Projects, Vendor, Agreement
    )
) AS PivotTable;

🔍 Note: Replace the dimension names in the IN (...) clause with the actual dimension names configured in your environment.
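Conceptually, the PIVOT above just turns long (RecId, DimensionName, DisplayValue) rows into one wide row per combination. A minimal Python sketch of the same reshaping (illustrative only, with made-up RecIds and values; the real work stays in SQL):

```python
# Reshape long (rec_id, dimension_name, display_value) rows into one
# wide row per rec_id, mirroring what the SQL PIVOT produces.
from collections import defaultdict

def pivot_dimensions(rows, dimension_names):
    wide = defaultdict(dict)
    for rec_id, name, value in rows:
        if name in dimension_names:
            wide[rec_id][name] = value
    # Missing dimensions become None, like PIVOT returning NULL
    return {
        rec_id: {name: values.get(name) for name in dimension_names}
        for rec_id, values in wide.items()
    }

rows = [
    (5637144576, "MainAccount", "110110"),
    (5637144576, "CostCenter", "CC-001"),
    (5637144576, "ItemGroup", "Audio"),
]
result = pivot_dimensions(rows, ["MainAccount", "CostCenter", "ItemGroup", "Projects"])
```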


📁 2. Split Default Dimension into Individual Columns

Use this query when you're dealing with DefaultDimension (used in master records like customers, vendors, items, etc.).

SELECT
    DefaultDimension,
    Claims,
    Functions,
    ItemGroup,
    CostCenter,
    Market_IC,
    MTO,
    Projects,
    Vendor,
    Agreement
FROM
(
    SELECT
        DAVS.RecId AS DefaultDimension,
        DA.Name,
        DAVSI.DisplayValue
    FROM
        DimensionAttributeValueSet AS DAVS
        JOIN DimensionAttributeValueSetItem AS DAVSI
            ON DAVSI.DIMENSIONATTRIBUTEVALUESET = DAVS.RecId
        JOIN DimensionAttributeValue AS DAV
            ON DAV.RecId = DAVSI.DIMENSIONATTRIBUTEVALUE
        JOIN DimensionAttribute AS DA
            ON DA.RecId = DAV.DIMENSIONATTRIBUTE
) AS SourceData
PIVOT
(
    MAX(DisplayValue)
    FOR Name IN (
        Claims, Functions, ItemGroup, CostCenter,
        Market_IC, MTO, Projects, Vendor, Agreement
    )
) AS PivotTable;
 

💡 Use Cases

  • Exporting transaction data for reporting

  • Creating simplified views for Power BI

  • Debugging the ledger and default dimensions

  • Building a staging table for the data warehouse

✅ Tips

  • These queries are meant for read-only reporting purposes.

  • Use them in your Data Lake, BYOD, or on-prem reporting environments.

  • Ensure dimension names match your environment’s configuration.

🔗 Conclusion

Splitting LedgerDimension and DefaultDimension into readable columns using SQL is a powerful technique to simplify financial data analysis. This is especially helpful for technical consultants, BI developers, and finance teams working on reporting, integrations, or validations.

Let me know in the comments if you'd like a downloadable view or Power BI model version of this!

Happy coding! 💻


🔍 Stay tuned for more D365FO technical posts

📌 Bookmark: https://d365fohunt.blogspot.com

☁️ Execute D365 F&O SSRS Report with Parameters and Upload Report Output to Azure Blob Storage Using X++

In many real-world scenarios, businesses need to automatically generate SSRS reports from Dynamics 365 Finance & Operations (D365FO) and store them externally for auditing, sharing, or integration purposes.

In this article, we'll walk through how to:

  • Execute a parameterised SSRS report using X++

  • Render the report as a PDF byte stream

  • Upload the output to Azure Blob Storage using the latest Azure SDK (Azure.Storage.Blobs)

✅ Why Use This?

Uploading reports to Azure Blob Storage allows you to:

  • Automate report delivery

  • Store large reports securely

  • Integrate with Power BI, Logic Apps, or third-party APIs

  • Replace manual downloads or email-based report distribution

✅ Steps to Achieve This

  1. Create a controller class to run the SSRS report with parameters.

  2. Create an Azure Storage Account in your Azure subscription.

  3. Create Blob containers and optional folders to organise your report files.

  4. Get the connection string from Azure to authenticate from X++.

  5. Render the SSRS report to a byte stream in PDF format.

  6. Upload the stream to Azure Blob Storage using Azure.Storage.Blobs.

🛠 Prerequisites

  1. Add references to these .NET assemblies:
    • Azure.Storage.Blobs
    • System.IO
  2. Ensure your storage account connection string and container are correctly configured.

🌐 How to Set Up Azure Storage for Report Uploads

Follow these steps in the Azure Portal to create the required storage setup:

🔹 Step 1: Log in

Visit https://portal.azure.com and sign in with your Azure credentials.

🔹 Step 2: Create Storage Account

  • Navigate to Storage accounts

  • Click + Create

  • Fill in details: Subscription, Resource Group, Name, Region

  • Choose Standard performance and Hot access tier (default)

🔹 Step 3: Create Blob Container

  • Once the storage account is created, go to it

  • Select Storage browser from the left menu

  • Click on Blob containers

  • Create a new container (e.g., reportupload)

    • Set access level to Private (recommended for security)

🔹 Step 4: (Optional) Create Folders

  • Open your container

  • Click + Add directory to create folders like ReportUpload

🔹 Step 5: Get the Connection String

  • Go to the Access keys section under Security + networking

  • Copy the Connection string of key1

    • Example format:

DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey;EndpointSuffix=core.windows.net
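A connection string is just semicolon-separated key=value pairs, so it's easy to sanity-check before wiring it into code. A small illustrative Python sketch (the helper name and sample values are hypothetical):

```python
def parse_connection_string(conn_str):
    """Split an Azure storage connection string into a dict of its parts."""
    parts = {}
    for segment in conn_str.split(";"):
        if segment:
            # Split on the FIRST '=' only: base64 AccountKey values may
            # themselves end in '=' padding characters.
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts

parsed = parse_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=youraccount;"
    "AccountKey=yourkey;EndpointSuffix=core.windows.net"
)
```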

🔄 What's Changed?

The legacy WindowsAzure.Storage package is deprecated. The replacement library:

  • Ships as the Azure.Storage.Blobs NuGet package.

  • Introduces new client types such as BlobServiceClient, BlobContainerClient, and BlobClient.

  • Has updated method names and authentication mechanisms.

💻 X++ Code with Latest SDK

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using System.IO;
 
public class ExampleUploadReportToBlobStorage
{
    public static void main(Args _args)
    {
        ExampleUploadReportToBlobStorage uploadObj = new ExampleUploadReportToBlobStorage();
        BlobContainerClient containerClient = uploadObj.connectToAzureBlob();
        uploadObj.uploadFileToBlob(containerClient);
    }
 
    public BlobContainerClient connectToAzureBlob()
    {
        str connectionString =
            "DefaultEndpointsProtocol=https;AccountName=youraccountname;AccountKey=yourkey;EndpointSuffix=core.windows.net";
 
        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("reportupload");
 
        info(strFmt("Connected to Azure Blob Container: %1", containerClient.Name));
        return containerClient;
    }
 
    public void uploadFileToBlob(BlobContainerClient containerClient)
    {
        System.Byte[] reportBytes = ExampleUploadReportToBlobStorage::getSSRSBytes("ReportOutput");
 
        if (reportBytes)
        {
            str blobPath = "ReportUpload/UploadFile.pdf"; // folder + filename
            BlobClient blobClient = containerClient.GetBlobClient(blobPath);
 
            System.IO.MemoryStream stream = new System.IO.MemoryStream(reportBytes);
            blobClient.Upload(stream, true); // Overwrite if file exists
 
            info("File uploaded successfully to Azure Blob Storage.");
        }
    }
 
    public static System.Byte[] getSSRSBytes(str _fileName)
    {
        SrsReportRunController controller = new SrsReportRunController();
        SRSPrintDestinationSettings settings;
        System.Byte[] reportBytes;
        SRSProxy srsProxy;
        SRSReportRunService srsService = new SRSReportRunService();
        Map paramMap;
        SRSReportExecutionInfo execInfo = new SRSReportExecutionInfo();
 
        controller.parmReportName(ssrsReportStr(TransactionsReport, Report));
        controller.parmShowDialog(false);
        controller.parmLoadFromSysLastValue(false);
 
        // Set report parameters
        ReportParametersDataContract contract = controller.parmReportContract().parmRdpContract() as ReportParametersDataContract;
        contract.parmFromDate(today() - 5);
        contract.parmToDate(today());
 
        // Configure output
        settings = controller.parmReportContract().parmPrintSettings();
        settings.printMediumType(SRSPrintMediumType::File);
        settings.fileName(_fileName + ".pdf");
        settings.fileFormat(SRSReportFileFormat::PDF);
 
        controller.parmReportContract().parmReportServerConfig(SRSConfiguration::getDefaultServerConfiguration());
        controller.parmReportContract().parmReportExecutionInfo(execInfo);
 
        srsService.getReportDataContract(controller.parmReportContract().parmReportName());
        srsService.preRunReport(controller.parmReportContract());
 
        paramMap = srsService.createParamMapFromContract(controller.parmReportContract());
        Microsoft.Dynamics.AX.Framework.Reporting.Shared.ReportingService.ParameterValue[] paramArray =
            SrsReportRunUtil::getParameterValueArray(paramMap);
 
        srsProxy = SRSProxy::constructWithConfiguration(controller.parmReportContract().parmReportServerConfig());
 
        reportBytes = srsProxy.renderReportToByteArray(
            controller.parmReportContract().parmReportPath(),
            paramArray,
            settings.fileFormat(),
            settings.deviceinfo());
 
        return reportBytes;
    }
}

🔐 Security Tips

  • Avoid hardcoding the connection string in production. Use Key Vault or Azure App Configuration.

  • Always restrict access to the container using RBAC or SAS tokens instead of full account keys.

  • Set your Blob container access level to Private unless you explicitly need public access.

  • Use Shared Access Signatures (SAS) or Managed Identities in production scenarios.

📌 Business Use Cases

This solution is ideal for:

  • Automatically exporting and archiving daily/weekly/monthly reports

  • Sharing output with external systems or teams via Azure Storage

  • Generating custom reports on demand and storing them securely in the cloud
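For the archiving use case, a dated folder structure inside the container keeps daily output organised. A hedged Python sketch of one possible naming convention (the folder layout is an assumption, not something the SDK prescribes):

```python
from datetime import date

def build_blob_path(folder, report_name, run_date=None, ext="pdf"):
    """Build a dated blob path such as ReportUpload/2025/06/27/Report.pdf,
    useful when archiving daily report output in one container."""
    run_date = run_date or date.today()
    return f"{folder}/{run_date:%Y/%m/%d}/{report_name}.{ext}"

path = build_blob_path("ReportUpload", "TransactionsReport", date(2025, 6, 27))
```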

✅ Summary

In this post, we’ve seen how to:

✔️ Execute an SSRS report with parameters using X++
✔️ Render it as a PDF and convert it into a byte stream
✔️ Upload the output to Azure Blob Storage securely using the latest Azure SDK

    This approach enables automated, scalable, and secure report storage in the cloud, perfect for compliance, integrations, and downstream analytics.

Wednesday, June 11, 2025

Converting a System.IO.MemoryStream Object to an Image in Dynamics 365 Finance and Operations Using X++

📘 Introduction

In Dynamics 365 Finance and Operations (D365FO), it's common to deal with external systems returning image streams, such as a barcode service, a document API, or a file server. In such cases, you may receive the image as a binary stream (System.IO.Stream) and need to convert it into a container before you can display it in the UI or store it in the database.

This blog post explains how to convert a MemoryStream into a displayable image in D365FO using X++.

🧪 Sample Scenario

Let’s say you're calling an external URL using System.Net.HttpWebRequest and want to load the image from the response into a form control.

✅ X++ Code to Convert MemoryStream to Image

System.Net.HttpWebRequest request = System.Net.WebRequest::Create("https://example.com/sample-image.png") as System.Net.HttpWebRequest;

System.Net.HttpWebResponse response;
System.IO.Stream stream;
System.IO.MemoryStream fileStream = new System.IO.MemoryStream();

request.Method = "GET";
request.ContentType = "image/png";

response = request.GetResponse() as System.Net.HttpWebResponse;

using (response)
{
    stream = response.GetResponseStream();

    // Convert the stream into a memory stream
    stream.CopyTo(fileStream);

    // Convert memory stream to container
    BinData binData = new BinData();
    container baseContainer = Binary::constructFromMemoryStream(fileStream).getContainer();
    Image image = new Image();

     // Set container data to BinData
    binData.setData(baseContainer);

    // Set image data
    image.setData(binData.getData());

     // Display in image control (e.g., FormImageControl1)
    FormImageControl1.image(image);
}


💬 Explanation

  • HttpWebRequest is used to call the external image URL.

  • The response stream is copied into a MemoryStream.

  • Binary::constructFromMemoryStream() converts the stream into a container.

  • This container is then used to create an Image object.

  • Finally, the image is bound to the form's image control (e.g., FormImageControl1).
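The copy-to-memory step is the same pattern in any stack: read the response stream into an in-memory buffer, then hand the raw bytes to whatever comes next. A Python sketch for comparison (illustrative only, using a fake in-memory stream; D365FO itself requires the X++ shown above):

```python
import io

def stream_to_bytes(stream, chunk_size=8192):
    """Copy a readable stream into memory and return its raw bytes,
    mirroring stream.CopyTo(memoryStream) in .NET."""
    buffer = io.BytesIO()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buffer.write(chunk)
    return buffer.getvalue()

# Simulate a response stream: an 8-byte PNG signature plus 16 filler bytes
fake_response = io.BytesIO(b"\x89PNG\r\n\x1a\n" + b"\x00" * 16)
data = stream_to_bytes(fake_response)
```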

💡 Use Case Ideas

  • Show customer logos dynamically.

  • Load barcode images from external services.

  • Display product thumbnails received via APIs.

🧠 Conclusion

Working with external image streams in D365FO using X++ becomes seamless when you properly convert the stream into a container and use BinData + Image objects. This approach is ideal for real-time image rendering from REST APIs or streaming sources.


🧩 What is Microsoft Power Platform and Can It Be Used in D365 Finance and Operations?

🔍 Introduction
In the world of enterprise solutions, agility and automation are key. Microsoft Power Platform is a powerful suite of tools that complements Dynamics 365 Finance and Operations (D365 F&O) to help streamline business processes, build apps quickly, and gain actionable insights — all with minimal coding.

⚙️ What is Microsoft Power Platform?

Microsoft Power Platform is a collection of low-code/no-code tools that help users analyse data, automate workflows, and build custom apps or chatbots. 

 It consists of the following four core components:

  1. Power BI:

    A powerful business analytics tool that allows users to visualise and analyse data. With Power BI, you can create interactive reports and dashboards to make data-driven decisions. It connects to multiple data sources, including cloud-based and on-premises databases.

  2. Power Apps:

    A platform for creating custom applications with minimal coding. Power Apps enables users to build applications tailored to their specific needs, whether for mobile, desktop, or web. It’s useful for automating tasks, managing data, and interacting with business systems like Dynamics 365.

  3. Power Automate:

    A tool to automate repetitive tasks and workflows. With Power Automate, you can create workflows that connect various applications and services. It allows businesses to automate processes like approvals, notifications, data collection, and task management, saving time and reducing human error.

  4. Power Virtual Agents:

    A platform for building intelligent chatbots without writing code. These chatbots can assist users in answering questions, guiding them through workflows, and automating customer support tasks. Power Virtual Agents integrates easily with other parts of the Power Platform and business systems.

    Key Benefits of Power Platform:

    • Low-Code/No-Code: Allows both technical and non-technical users to build solutions.
    • Automation: Helps automate repetitive tasks, improving efficiency and productivity.
    • Integration: Seamlessly integrates with Microsoft products like Dynamics 365, SharePoint, and Azure, as well as third-party services.
    • Custom Solutions: Allows businesses to create tailored applications and workflows without needing a developer.

💡 Power Platform Can Be Used with D365 Finance and Operations

Microsoft Power Platform can be integrated directly with Dynamics 365 Finance and Operations (D365 F&O) to enhance functionality, streamline workflows, and provide greater business intelligence. Below is a detailed breakdown of how each Power Platform tool can be integrated with D365 F&O:

1. Power BI and D365 F&O Integration

Power BI can be used with D365 F&O to deliver powerful business analytics and visualisations. The integration helps you analyse real-time data, create interactive reports, and build dashboards that are easy to interpret.

How Power BI Integrates:

  • Embedded Reports and Dashboards: You can embed Power BI reports and dashboards directly into D365 F&O interfaces. This allows you to display key metrics, KPIs, and data insights within the same platform, offering users a seamless experience.
  • Data Connections: Power BI can connect directly to D365 F&O’s data entities via OData (Open Data Protocol). This enables pulling data from tables like sales orders, inventory, financial transactions, etc., and analysing it in real-time.
  • Pre-built Content: Microsoft provides pre-built Power BI content packs for D365 F&O, which are ready-made reports that can be used for common analytics like financial performance, inventory levels, or order fulfilment.
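To make the OData connection point concrete, an entity query URL is just the environment URL, the /data path, and standard query options. A hedged Python sketch of composing one (the environment URL is a placeholder, and the entity/field names are examples, not a guaranteed schema):

```python
from urllib.parse import urlencode

def build_odata_url(base_url, entity, select=None, filter_expr=None, top=None):
    """Compose a D365 F&O OData query URL with optional $select/$filter/$top."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top:
        params["$top"] = str(top)
    query = urlencode(params)  # percent-encodes '$', spaces, quotes, etc.
    return f"{base_url}/data/{entity}" + (f"?{query}" if query else "")

url = build_odata_url(
    "https://contoso.operations.dynamics.com",  # placeholder environment URL
    "SalesOrderHeadersV2",
    select=["SalesOrderNumber", "OrderTotalAmount"],
    filter_expr="dataAreaId eq 'usmf'",
    top=10,
)
```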

Benefits:

  • Real-time reporting and data visualisation.
  • Decision-makers can make data-driven choices based on the latest information.
  • Visualise large amounts of data in an accessible way.

2. Power Apps and D365 F&O Integration

Power Apps allows you to build custom applications that interact with D365 F&O data. These applications can range from simple mobile apps to complex enterprise-level solutions, depending on the requirements.

How Power Apps Integrates:

  • Custom Apps: You can create custom Power Apps that pull and push data from D365 F&O. For example, you can build apps for sales teams to track customer orders or for employees to submit expense reports.
  • Data Entities: Power Apps can interact with D365 F&O via Data Entities (predefined structures that represent D365 F&O data). Data entities expose information from D365 F&O, making it easier to use that data in Power Apps.
  • Dynamics 365 Connectors: Microsoft provides out-of-the-box connectors that allow Power Apps to connect directly to Dynamics 365 applications, including F&O. This simplifies the process of interacting with D365 F&O data.
  • Common Data Service (Dataverse): If you're using Microsoft Dataverse (formerly Common Data Service), it can act as a unified data store that bridges Power Apps and D365 F&O. Data from D365 F&O can be stored in Dataverse, and apps can pull data from there.

Benefits:

  • Custom applications for different business processes (e.g., mobile apps for field service or inventory tracking).
  • Real-time data integration with D365 F&O for more efficient business operations.
  • No extensive coding required, making it accessible to non-developers.

3. Power Automate and D365 F&O Integration

Power Automate is a tool for automating workflows and tasks. It can integrate directly with D365 F&O to automate repetitive tasks, streamline processes, and integrate with other applications and services.

How Power Automate Integrates:

  • Automated Workflows: Power Automate can trigger workflows based on events in D365 F&O, such as when a sales order is placed, when inventory levels drop below a certain threshold, or when a financial transaction is recorded. For example, an automated workflow could send a notification to the purchasing team when an inventory item is low.
  • Connectors: Power Automate has pre-built connectors for D365 F&O, making it easy to integrate and automate processes without requiring custom development. These connectors allow actions like creating new records, updating data, or triggering alerts.
  • Cross-Application Workflows: Power Automate can also automate workflows that span multiple systems. For example, you can trigger a workflow to update customer information in both D365 F&O and Microsoft Dynamics 365 Customer Service, or automate approval processes between D365 F&O and Office 365.

Benefits:

  • Streamlined operations by automating manual tasks.
  • Integration with external systems and apps beyond D365 F&O, such as email, SharePoint, and third-party services.
  • Time-saving and reduced human error by automating routine tasks like approval workflows, notifications, and data updates.

4. Power Virtual Agents and D365 F&O Integration

Power Virtual Agents is a tool for building intelligent chatbots that can assist users with various tasks. These bots can be integrated with D365 F&O to automate customer service, help employees navigate business processes, or provide data insights.

How Power Virtual Agents Integrates:

  • Chatbots for Customer Support: A chatbot can be created to answer frequently asked questions about D365 F&O processes, such as checking inventory, retrieving order statuses, or explaining financial reports. The bot can query data directly from D365 F&O using APIs or Power Automate connectors.
  • Internal Assistance: Chatbots can also assist internal users, such as employees looking for information about company processes, financial data, or production statuses. A chatbot can help guide users through workflows, such as submitting purchase requests or tracking shipment statuses.
  • Integration with Power Automate: Power Virtual Agents can trigger workflows created in Power Automate. For example, when a user asks the chatbot to check the inventory level of a product, the bot can use Power Automate to fetch the data from D365 F&O and return the result to the user.
  • Integration with D365 Data: Through APIs or Dataverse, chatbots can pull real-time data from D365 F&O to provide users with the most up-to-date information.

Benefits:

  • Enhanced customer experience with 24/7 chatbot support.
  • Improved employee productivity with a bot that can guide users through internal processes.
  • Reduced manual intervention for routine queries and tasks.

Summary of Key Integrations:

  • Power BI: Pulls data from D365 F&O for reporting and analysis.
  • Power Apps: Allows users to create custom apps that interact with D365 F&O data for specific tasks like order tracking or expense management.
  • Power Automate: Automates workflows and tasks within D365 F&O and connects to other systems to streamline processes.
  • Power Virtual Agents: Creates intelligent chatbots that interact with D365 F&O data to assist users with queries or tasks.

By integrating Power Platform tools with D365 F&O, businesses can increase operational efficiency, automate workflows, make data-driven decisions, and enhance user experiences both internally and externally.

Key Benefits of Using Power Platform with D365 F&O:

  • Customisation: Tailor D365 F&O functionality to specific business needs without requiring extensive development expertise.
  • Automation: Streamline business processes and reduce manual work through automated workflows.
  • Data-Driven Insights: Leverage Power BI for advanced data visualisation and real-time business analytics.
  • Enhanced User Experience: Use Power Apps to create user-friendly applications, improving workflow efficiency.
  • Seamless Integration: Power Platform integrates seamlessly with D365 F&O, enhancing the overall Microsoft ecosystem.

In summary, Power Platform significantly enhances the capabilities of D365 F&O, providing users with tools to automate processes, build custom applications, generate powerful insights, and create intelligent agents.

🛠️ Technical Considerations

  • Authentication: Use Azure AD app registrations.
  • Performance: Be mindful when connecting to F&O via OData – it’s not optimised for heavy data loads.
  • Licensing: Power Platform licenses may be needed beyond D365 licenses.