Tuesday, August 29, 2023

Steps to Set Up a Cloud-Hosted Development Environment for D365 Finance and Operations Using LCS

This blog post walks through the process of creating a cloud-hosted environment using LCS and the Azure portal, step by step.

Use this guide to create a personalized environment for learning or presale purposes.

Step 1: Log in to LCS.

Log in to LCS using your credentials.

Step 2: Create a new project

In the recent projects section, click on the ‘+’ button to create a new project.

Step 3: Navigate to the 'Finance and Operations' section.


Step 4: Select the project purpose.

For learning or presale purposes, choose the first option when creating a project.

Step 5: Enter Project Details.

Provide the project name, description, and industry on the following page.


Once you click the ‘Create’ button, the project will be created.

Step 6: Log in to the Azure portal.

Access portal.azure.com using your LCS login credentials.

Step 7: Click on 'Subscriptions'.


Step 8: Select your active subscription.

Step 9: Open Access control (IAM) and add a new role assignment.


Step 10: Select 'Add role assignment'.


Step 11: Select 'Contributor' as the role.


Step 12: Select the 'Members' tab.


Step 13: Click on 'Select members'.


Step 14: Select Dynamic Deployment Service.

After clicking the 'Select members' button, choose the 'Dynamic Deployment Service' option to proceed.


Step 15: Review and Assign Role

Click on ‘Next’ and review the role assignment. Once done, click on ‘Assign’. Your role assignment is now created.

Return to LCS for the next step.

Step 16: Click on the burger icon.



Step 17: Select the 'Project settings' option.



Step 18: Click on 'Azure connectors' and then the 'Add' button.


Step 19: Enter Azure Connector Details

Specify a name for the Azure Connector, provide your Azure subscription ID, and enable the ARM toggle.

Note: Retrieve the Azure subscription ID by copying it from the Azure portal.

Log in to the Azure portal to copy your subscription name and ID.

The Azure role assigned to you can be viewed on the following page.





Step 20: Download the management certificate from LCS.

Step 21: To activate your Azure subscription, upload the certificate to the portal.


Step 22: Return to LCS and continue with the remaining steps. Select the Azure region and connect.

The Azure connector is now created.


Step 23: Click on the burger icon and select 'Cloud-hosted environments'.

Step 24: Click on the 'Add' button.


Select the latest Application version.


Step 25: Select the environment topology.

For development purposes, choose DEVTEST; for demos, choose DEMO.

Step 26: Select 'Develop' for development purposes.

Step 27: Specify the environment name and choose an appropriate VM size according to your needs.


Give your Tier 1 machine a name and select a size. You can click on the Azure price list to check the price.

Click on ‘Deploy’.

Step 28: Once you click the ‘Deploy’ button, the deployment will begin and move to the Queued status.

Step 29: The environment is now in the Deploying status.



The cloud-hosted environment will move to the Deploying state. Deployment takes approximately 8-9 hours to complete.

Important note: Remember to set auto-shutdown for the VM after deployment. You can access the dev machine from LCS or the Azure portal.

I hope this guide helps you build your own cloud-hosted environment using LCS and the Azure portal.

How to debug a non-development and non-production environment in D365 F&O

This post outlines the steps to debug an issue that occurs in a non-development, non-production environment, e.g. a UAT environment.

Enable access for your IP address

Go to LCS -> Sandbox environment page. To connect to the test environment via RDP, you need to create a whitelist rule for your IP address.



RDP to the environment where you want to debug the issue; in this example I am connecting to UAT (Sandbox).

Connect to SQL server

a.     Get the server name from the LCS database accounts page

b.    Use <server name>.database.windows.net to connect to the database

Create a new SQL sign-in that the developer can use. This step lets the system administrator maintain the security of the sandbox environment. The developer will have access to one database for only a limited time. Use the following code to create the new SQL sign-in.

 

CREATE USER devtempuser WITH PASSWORD = 'pass@word1'
EXEC sp_addrolemember 'db_owner', 'devtempuser'


Stop the following services on the development box (a minimal command sketch follows this list):

a.     IIS or the World Wide Web Publishing Service

b.    Microsoft Dynamics 365 Unified Operations: Batch Management service
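
If you prefer the command line over the Services console, something like the following should work. This is a minimal sketch assuming the standard service names on a D365FO development box (W3SVC for the World Wide Web Publishing Service and DynamicsAxBatch for the batch service); verify the names in services.msc first.

net stop W3SVC
net stop DynamicsAxBatch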

Go to J:\AosService\WebRoot\web.config.

Save a copy of the original web.config file so that you can switch back later. I made a copy of the file in the same directory, renamed the original web.config to webDev.config, and kept the copy as web.config.

Edit the database connection section in the web.config file.

You can either comment out the original configuration or delete it, and add the new configuration of the sandbox environment.
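
As a rough sketch, the keys pointing at the sandbox database look something like the following. The DataAccess.* key names come from a standard development web.config, but double-check them against your own file; all values are placeholders to replace with your sandbox details.

<add key="DataAccess.DbServer" value="<sandbox server>.database.windows.net" />
<add key="DataAccess.Database" value="<sandbox database name>" />
<add key="DataAccess.SqlUser" value="devtempuser" />
<add key="DataAccess.SqlPwd" value="pass@word1" />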


Start IIS or the World Wide Web Publishing Service.

Do not start the batch service; otherwise the batch jobs of the development box will start writing into the UAT environment. Be careful with this step!

Now, connect to your development box from the browser. Oops! It does not connect and throws an error. Don't worry; let's jump into the event log and see what's in there.


Okay, so it is an access issue, and here is the resolution.

Whitelist your IP address

Execute the following command against the master database on the UAT database server; you get the IP address from the event viewer. Open a new query against the master DB:

exec sp_set_firewall_rule N'DEVNAME', 'IP', 'IP'


Note: Restart IIS and make sure the application pool is started.

Connect to the development URL again from the browser, and this time it works. You can now access the UAT database (all legal entities, etc.).

Restart or start Visual Studio on the development box and attach to the w3wp.exe process to troubleshoot the issue(s).

 

Once you are done with your debugging and have the resolution:

 

Now it is time to remove devtempuser from the UAT database; this prevents permanent access to the sandbox database. Right-click on the devtempuser user under Sandbox database | Security | Users and delete it.
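
Alternatively, a single T-SQL statement run against the sandbox database does the same thing (a minimal sketch; devtempuser is the sign-in created earlier):

DROP USER devtempuser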

 

Stop IIS.

Revert the changes in the web.config file, or simply delete the copied one and rename webDev.config back to web.config.


Start IIS.


Start the batch service.


Connect to the development environment URL and make sure it is connected to the development database.

Reference: 

https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/dev-tools/debugx-issue-against-copy-of-production






Monday, August 21, 2023

D365 FO: Method calling sequence of Data Entity

My notes:

  1. The postLoad method is also called during import! Since postLoad is the recommended place to set values for virtual fields, this can slow down the import process unnecessarily.
  2. Be careful when using the postTargetProcess method! This method is called at the end of EACH thread/task bundle if you are using the “Import threshold record count” option.
  3. Previously, you could add postTargetProcess only to a newly created entity, but now you can extend any entity using Chain of Command (CoC).
  4. The extension is simple:
[ExtensionOf(tableStr(DataEntity))]
final public class DataEntity_Extension
{
    public static void postTargetProcess(DMFDefinitionGroupExecution _dmfDefinitionGroupExecution)
    {
        // Do not call next
        // logic here
    }
}

Please note that this works only in DMF (Data Management Framework) scenarios, not via OData, because OData inserts and updates records row by row and there is no post event/method to use. You might wish to use an OData action to perform post actions on records.
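
As a sketch of that last point, an OData action is a method on the data entity's X++ class decorated with SysODataActionAttribute. The entity method name below (markAsProcessed) is made up for illustration:

[SysODataActionAttribute("MarkAsProcessed", true)]
public void markAsProcessed()
{
    // Bound to the entity instance (second argument = true), so a caller
    // can invoke this per record after inserting or updating it via OData.
    // post-processing logic here
}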


SLOW OR FAST? PERFORMANCE CONSIDERATIONS FOR THE WHILE SELECT STATEMENT

During my career, I have had to deal many times with performance issues. There are many different causes for this. It can be related to hardware, software, configuration and more. In this post, I will explain two scenarios where a while select statement caused performance issues and how to mitigate them.

While select statement

With the while select statement in x++, you can interact with data in the SQL database. Data is retrieved into a table buffer. The broader the select statement, the larger the table buffer will be, and the more load and transfer time it will need. Using the supported clauses, you can limit the result set: both the number of records and the number of columns. When you use the where clause, it limits the number of records, but it can actually increase the loading time; this depends on the complexity of the select statement and on whether your search is supported by correct indexes. Typically, when doing performance investigations, we are used to having a look at missing indexes. However, this is not always the root cause of performance issues.
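
As a quick illustration of limiting the number of columns, a field list in the select restricts what is fetched into the table buffer. A minimal sketch using the standard CustTable (the customer group value is made up):

CustTable custTable;

// Only AccountNum and CustGroup are fetched into the buffer,
// instead of every column of CustTable.
while select AccountNum, CustGroup from custTable
    where custTable.CustGroup == '10'
{
    info(strFmt("%1 - %2", custTable.AccountNum, custTable.CustGroup));
}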

The word ‘while’ in while select indicates a loop. It loops over all records in the table buffer. If the x++ statements inside the loop are time-consuming, this needs a closer look. In some cases, a developer might have made another mistake that could lead to performance issues, or is not aware of other statements which might be faster in execution time. I will now continue with my examples.

Selecting all records unintended

When data needs to be read or updated for e.g. a single customer, we usually write a statement like this:

 

while select forUpdate myTable
    where myTable.CustAccount == _custAccount
{
    // do something...
}

 

This select statement looks innocent, but it isn’t. Assume that the indexing is defined properly. The underscore in ‘_custAccount’ indicates it is a parameter of the current method. If the value is e.g. ‘US-101’, it will loop the records correctly. The number of records will determine how much time is consumed by the ‘do something’ part.

In several cases, I had a statement like this which caused huge performance issues, found with SQL header block information, Trace Parser analysis and/or debugging. Actually, it was caused by missing data: the where clause variable was empty. In the example above, the _custAccount variable was empty. In that case, the statement returns all records.

Mitigation

It is recommended to first evaluate whether the variable has a value before actually using it in the while select statement.

Compare it with a division by zero, which should be prevented. The example above can easily be adjusted to prevent the performance issue like this:

 

if (_custAccount)
{
    while select forUpdate myTable
        where myTable.CustAccount == _custAccount
    {
        // do something...
    }
}

Or when this is the only statement in your method, you can also use:

 

if (_custAccount == '')
{
    return;
}

while select forUpdate myTable
    where myTable.CustAccount == _custAccount
{
    // do something...
}

From my experience, in two cases there were multiple nested while select loops where certain fields of one or more records did not have a value. The outcome of a field from the first while select iteration was used in the where clause of the next one. This caused looping through all records of four different tables. When the access was read-only, only the current user was affected. In update scenarios, like the example code above, it actually causes a database lock which can last several minutes. Imagine having this as part of a sales invoice update during the month-end process, when multiple legal entities need to complete invoicing: it becomes a complete business standstill due to database locks.

Prevent nested loops

During the build, the best practice checks will warn you about nested loops. You can bypass the warnings by putting the second-level while select in a separate method, but you can also consider another approach to improve performance.

Suppose we have multiple tables with data that needs to be copied, for example as a new version, or from worksheet tables to transaction tables. It concerns a Header, a Line, and a Detail table. Let’s draw an easy example:

Header

ID    Description       RecId
AAA   Description AAA   101
BBB   Description BBB   102
CCC   Description CCC   103

Line

HeaderRefRecId   Group           RecId
101              Accountant      201
101              Sales clerk     202
101              Sales manager   203
102              Controller      204

Detail

LineRefRecId   User     RecId
201            User 1   301
201            User 2   302
202            User 3   303
202            User 1   304
203            User 4   305
203            User 5   306
204            User 4   307

Suppose we need to copy the Lines with Details from Header AAA to a new Header: CCC. In the first table, the record for CCC has already been created. It is quite natural to think that we need to loop over the old data and copy the Details per Line. The code could look like this:

while select lineSource
    where lineSource.HeaderRefRecId == _headerRecIdSource
{
    line.initValue();
    line.HeaderRefRecId = _headerRecIdNew;
    line.Group          = lineSource.Group;
    line.insert();

    while select detailSource
        where detailSource.LineRefRecId == lineSource.RecId
    {
        detail.initValue();
        detail.LineRefRecId = line.RecId;
        detail.User         = detailSource.User;
        detail.insert();
    }
}

This code example will work and is a viable solution if not too many records are expected in the Line and Detail tables. The more Lines and Details, the less performant this option will be. When you have a large number of records, your aim should be to limit the execution time of the copy process. The statements insert_recordset, update_recordset, and delete_from handle multiple records at the same time and are much faster, as only one SQL call is involved. Even though execution times are quicker using these statements, they are not always the best choice. I would recommend reading the blog from Denis Trunin about blocking in D365FO for more understanding.
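
For completeness, delete_from follows the same set-based pattern; a minimal sketch (the _headerRecIdObsolete variable is made up for illustration) that removes all Line records of one header in a single SQL call:

delete_from line
    where line.HeaderRefRecId == _headerRecIdObsolete;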

Now how to use this for the example above? The trick is to add a dummy field for the record ID of the Line table. This will initially have no value (0 for integer64).

Line

HeaderRefRecId   Group           TemporaryRecId   RecId
101              Accountant      0                201
101              Sales clerk     0                202
101              Sales manager   0                203
102              Controller      0                204

With the help of this field, we can now copy the Line and Detail records with the following x++ code.

insert_recordset line (HeaderRefRecId, Group, TemporaryRecId)
    select _headerRecIdNew, Group, RecId from lineSource
        where lineSource.HeaderRefRecId == _headerRecIdSource;

insert_recordset detail (User, LineRefRecId)
    select User from detailSource
    join RecId from line
        where line.TemporaryRecId == detailSource.LineRefRecId
           && line.HeaderRefRecId == _headerRecIdNew;

This code produces the following new records in the tables:

Line

HeaderRefRecId   Group           TemporaryRecId   RecId
103              Accountant      201              205
103              Sales clerk     202              206
103              Sales manager   203              207

After the first insert_recordset command, the new Line records have all the information needed to copy the Details in one go. The TemporaryRecId values are used to link with the source Detail records. In addition, the newly created RecId can be used as the Detail reference record ID.

Detail

LineRefRecId   User     RecId
205            User 1   308
205            User 2   309
206            User 3   310
206            User 1   311
207            User 4   312
207            User 5   313

As we don’t need the TemporaryRecId values after the copy process, we can clear them, also with a single SQL call.

 

update_recordset line
    setting TemporaryRecId = 0
    where line.HeaderRefRecId == _headerRecIdNew;

In a certain process, the first approach took almost a minute to copy larger record sets. After replacing this with the insert_recordset approach, the execution time was brought back to just a few seconds.