Tuesday, July 5, 2022

D365 How to configure the 10.0.24 VM - Admin Provisioning Tool Error: The value's length for key 'password' exceeds its limit of '128'

 

Admin Provisioning Tool Error: The value's length for key 'password' exceeds its limit of '128'



I usually run the Dynamics 365 Finance and Operations VM on my personal laptop for R&D purposes.

But after downloading the latest 10.0.24 VM from LCS, I was unable to run the Admin User Provisioning tool. I got the following error:

The value's length for key 'password' exceeds its limit of '128'.

Later, on LinkedIn, Microsoft employee Volker Deuss shared a link which helped me resolve this issue. Here are my notes.

Microsoft now requires you to create a web application registration in the Azure portal.

And before you provision your user as admin, you have to perform one more step.

Run Generate Self-Signed Certificates, where we use this application ID; it configures the required certificates.

Interestingly, an Azure portal account without your own tenant does not work. You have to create the web application registration in your tenant's Azure subscription. A personal account will not work: even though I purchased an Azure subscription on my personal ID xxxxx@live.com, a web application registered there did not work.

So here are the steps to create the app registration.

Open your Azure subscription and type "app registration" in the search box at the top.

Click on new registration.

Enter the name.

Enter the local Dynamics URL as the redirect URI. This is the URL we use inside the VM.

Also select Web in the first drop-down, since we need a web redirect.

Then click on Redirect URIs and add one more redirect.

Save it.

Now copy the Application ID that was created and go to the Dynamics VM.

On the VM, right-click on Generate Self Signed Certificates and run it.

In the next step, press N and let it process; it will restart the services and the Batch job. When this job is completed, you can register the same domain ID as admin in the Provisioning tool.

And my ID successfully became the admin.

And I can successfully run AX 10.0.24 on my local laptop.



Reference 

Sunday, July 3, 2022

D365FO Data lake with Azure Data Factory

If you are planning to implement the Azure Data Lake feature in D365FO and do not want to use Azure Synapse for any reason, then this post will give you a quick start in implementing the solution.

The solution has the capability to perform full and incremental loads with multi-threading, using a modular approach.

In the future, you can use the same ADF solution in Azure Synapse.

In the following solution, we load the data from the Azure Data Lake into Azure SQL (physically) using Azure Data Factory. We made a customization to generate the tables' constraints to make sure we get the same performance in Azure SQL.





The following are the steps that help you incorporate the provided solution in your implementation.

The solution has three parts, including its source code:

  • D365FO 
  • SQL Server 
  • Azure Data Factory 

D365FO


As a first step, you need to configure the Azure Data Lake with Dynamics 365 Finance & Operations (link).

Once the data lake is configured successfully, please import our customization package. You can download the file using this link, and an additional link.


After a successful import, please create a menu item for the [AVIBlobStorageAccount] form and place it in the navigation wherever is convenient for you.

As a prerequisite, please navigate to that form and provide the following information:
  1. Data Lake storage account name.
  2. Account key for access.
  3. Click on the Data Lake checkbox to enable it.

Reference screenshot



All the above information will be utilized by our customization to generate and upload the file to the Azure Data Lake container.
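As a rough illustration only (not the actual code from the package), an X++ upload using those values via the WindowsAzure.Storage .NET SDK could look like the sketch below; the class and parameter names are hypothetical:

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    // Hypothetical sketch - the shipped customization may use a different SDK or structure.
    internal final class AVIDataLakeUploadSketch
    {
        public static void uploadText(str _accountName, str _accountKey,
            str _containerName, str _blobName, str _content)
        {
            // Build a connection string from the values captured on the setup form.
            str connectionString = strFmt(
                'DefaultEndpointsProtocol=https;AccountName=%1;AccountKey=%2;EndpointSuffix=core.windows.net',
                _accountName, _accountKey);

            CloudStorageAccount storageAccount = CloudStorageAccount::Parse(connectionString);
            CloudBlobClient     blobClient     = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer  container      = blobClient.GetContainerReference(_containerName);
            CloudBlockBlob      blob           = container.GetBlockBlobReference(_blobName);

            // Upload the generated file content (for example a SQL script or the EntityMapping CSV).
            blob.UploadText(_content);
        }
    }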

Once all the prerequisites are completed, activate your required tables.

Reference screenshot. 

  







Once all the tables are in the Running state, select all of them and click on the highlighted menu item.

This is the customization I made to fulfill my requirement of having the complete table schema, including primary keys and indexes.
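The actual logic ships with the downloadable package; purely to illustrate the idea, a simplified X++ sketch using the metadata (Dict*) classes to read a table's fields and indexes could look like this (the SQL type mapping and names are placeholders):

    // Simplified, illustrative sketch - not the code from the package.
    internal final class AVISqlScriptSketch
    {
        public static str buildTableScript(TableId _tableId)
        {
            DictTable dictTable = new DictTable(_tableId);
            DictField dictField;
            DictIndex dictIndex;
            str       script;
            int       i;

            script = strFmt('CREATE TABLE [DL].[%1] (', dictTable.name());

            // Columns: real code would map every AX type to the proper SQL type.
            for (i = 1; i <= dictTable.fieldCnt(); i++)
            {
                dictField = new DictField(_tableId, dictTable.fieldCnt2Id(i));
                script += strFmt('[%1] NVARCHAR(MAX),', dictField.name());
            }
            script += ')';

            // Indexes (including the primary key) are available the same way and
            // can be emitted as CREATE INDEX / ALTER TABLE statements.
            for (i = 1; i <= dictTable.indexCnt(); i++)
            {
                dictIndex = new DictIndex(_tableId, dictTable.indexCnt2Id(i));
                // dictIndex.name(), dictIndex.numberOfFields(), dictIndex.field(n) ...
            }

            return script;
        }
    }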

The second step of the activity generates the CSV file that will be used in Azure Data Factory; it contains the complete folder hierarchy path for every table.

Reference screenshot





Before executing our customization, you will see the following OOTB folders in your data lake. Please execute the customization in batch mode.

Reference screenshot
After executing our customization, you will see two additional folders:

  • SQLScript
  • EntityMapping
Reference screenshot.

SQLScript

As mentioned earlier, this folder contains the SQL table schemas that will be used by Azure Data Factory to create the tables at runtime if they don't exist.

File preview




EntityMapping


The entity mapping file will also be used by Azure Data Factory for reading the CSV files using the Common Data Model connector.

File preview




Now the Dynamics 365 Finance & Operations side activities are completed.

Let's jump to the next step.

SQL Server 

On the SQL Server side, we have only the following three steps:
  • Create an Azure SQL database.
  • Create a new schema with the name [DL], as it is used throughout the solution.
  • Create the following stored procedures in your database. You can download them using this link.
Important
  • As we are using a data flow with the Common Data Model connector, which doesn't support the self-hosted integration runtime, please use the Azure integration runtime.
  • Some dimension-related tables have an index on the hash key, and Microsoft doesn't export the hash information, so you need to disable those indexes manually; otherwise you may face index violation issues.

I would recommend having a look at the stored procedures for technical understanding; you can modify them according to your requirements.

Azure Data Factory 

I would recommend having a look at the prerequisites and completing them before importing our Azure Data Factory source code as a template using this link.

Prerequisites


Linked Services.

Create two linked services in your Azure Data Factory: one for Azure SQL communication, and the second for Azure Data Lake communication.

Now you can import the Azure Data Factory template into your data factory.

Process Flow Diagram

Below is the Azure Data Factory flow diagram that is used for both types of data load (full & incremental).


Steps

In the first step, the pipeline sets the current execution time into a variable.
In the second step, a separate activity "SequencePipeLine" is called within the main activity.

SequencePipeLine

In the sequence pipeline activity, Azure Data Factory loads the file of sequences from the data lake (from the SQLScript folder) that are used by the schema tables and generates them in the SQL database.



In the third step, another separate activity "SQLDatabaseSchema" is called within the main activity.

SQLDatabaseSchema

In the database schema pipeline activity, Azure Data Factory loads the table schema files from the data lake (from the SQLScript folder) and creates the tables, keys, and indexes in the database if they don't exist.

The [EntitiesDictionary] part calls a stored procedure that creates two additional tables in your database and loads one of them with all the table names and a default last sync date of 01-01-1900.

The last part of this pipeline loads the manifest parameter file that we generated during the D365FO activities from the Azure Data Lake into your database.




In the next step, the main activity executes two parallel pipelines: [FullLoadPipeLine] & [IncrementalPipeLine].

FullLoadPipeLine & IncrementalPipeLine

In both pipelines, three parallel activities are called; all of them call the stored procedure with different parameter values and process different sets of tables.




A data flow activity is called inside each ForEach loop activity. The data flow has a few parameters, shown below, that need to be provided; all of this information is available as an object in each ForEach loop iteration.

After the data flow, the next step is to mark the last execution time in our dictionary table.





Let's discuss the data flow here.

We have two separate data flows: one for the full load and one for the incremental load.

In both data flows, we have two additional transformation steps (a derived column and a select of records), but in the incremental data flow there are three additional steps to manage change tracking and avoid duplication of records.

I would suggest having a look at both data flows for a better understanding.

Reference screenshot of full load data flow



Reference screenshot of incremental load data flow





Important

In this complete Azure Data Factory solution, the following features are not included:

  • Retry logic 
    • The retry logic is important because you can have a connectivity issue for any technical reason.
  • Email notification 

Downloadable links



D365FO & AX 2012 - Data Source Join Types

For instance, we created two tables:



  • Student
  • Student Attendance


Then create a relation as per your requirement. I am using a Normal relation.

Student

Student Attendance





Now create a new form and add two grids: [StudentGrid] & [StudentAttendanceGrid].




Let's begin


Passive Join Type




The Passive form data source link type won't update the child data source automatically. For example, if we select a record in the parent table (order), the child data source (order details) won't be updated. If we need to update the child data source, we have to call the child data source's executeQuery method from code.
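A minimal sketch, assuming the form data sources are named Student and StudentAttendance as in the sample form above, is to override the active method on the parent data source and refresh the child explicitly:

    // Override on the Student (parent) form data source.
    // With a Passive link type the child is only refreshed when we ask for it.
    public int active()
    {
        int ret;

        ret = super();

        // Explicitly re-run the child data source query for the selected parent record.
        StudentAttendance_ds.executeQuery();

        return ret;
    }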


Active Join Type




The Active link type updates the child data source without any delay whenever you select a parent table record. When you deal with a large number of records, it can affect application performance.

Delay Join Type



The Delayed form data source link type works much like Active; the difference is that it won't update the child data source immediately when you select the parent record. AX uses a slight pause before updating the child data source, which avoids unnecessary refreshes while the user scrolls through parent records.


Inner join Type



The Inner join form data source link type displays only the rows where the parent table and the child table match.
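In X++ query terms, the behavior is comparable to an inner join select, roughly like the sketch below (field names such as StudentId and AttendanceDate are illustrative, taken from the sample tables above):

    Student           student;
    StudentAttendance studentAttendance;

    // Only students that have at least one attendance record are returned,
    // one row per matching attendance record.
    while select student
        join studentAttendance
            where studentAttendance.StudentId == student.StudentId
    {
        info(strFmt("%1 - %2", student.StudentId, studentAttendance.AttendanceDate));
    }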


Outer join Type


The Outer join form data source link type returns all parent records together with any matching child records. All rows in the parent table are returned, even those without a match.
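The comparable X++ select uses an outer join (again with illustrative field names):

    Student           student;
    StudentAttendance studentAttendance;

    // Every student is returned; the studentAttendance buffer stays empty
    // when there is no matching attendance row.
    while select student
        outer join studentAttendance
            where studentAttendance.StudentId == student.StudentId
    {
        info(strFmt("%1 - %2", student.StudentId, studentAttendance.StudentId));
    }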

Exists Join Type



The Exists join form data source link type returns the matched rows of the parent table only. It behaves like an inner join, but the difference is that once a parent row is matched with a child record, AX stops searching and shows the row in the grid; it doesn't matter how many records the child table has for that parent row.
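A comparable X++ select uses an exists join; note that fields from the child table cannot be read in this case:

    Student           student;
    StudentAttendance studentAttendance;

    // Each student with at least one attendance record appears exactly once;
    // no fields from studentAttendance are fetched.
    while select student
        exists join studentAttendance
            where studentAttendance.StudentId == student.StudentId
    {
        info(strFmt("%1", student.StudentId));
    }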

Not Exists Join Type



The Not exists join form data source link type is the exact opposite of the Exists join: it returns only the parent records that have no matching child records.
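The comparable X++ select uses a notexists join:

    Student           student;
    StudentAttendance studentAttendance;

    // Only students with no attendance records at all are returned.
    while select student
        notexists join studentAttendance
            where studentAttendance.StudentId == student.StudentId
    {
        info(strFmt("%1", student.StudentId));
    }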