DMF Package REST API in D365 Finance and Operations – Import

I know many people have covered the DMF API in detail, but here I am going to discuss some additional REST APIs that are not covered in the standard Microsoft documentation.

I will start with the import APIs in the first two blog posts.

A good example to look at is the data migration tool (https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/migration-upgrade/data-migration-tool), which uses most of the import REST APIs.

Microsoft provides a total of five import APIs to help customers integrate third-party solutions or perform data migration.

1. DataManagementDefinitionGroups-ImportFromPackage

2. DataManagementDefinitionGroups-ImportFromPackageAsync

3. DataManagementDefinitionGroups-ImportFromDMTPackage

4. DataManagementTemplates-ImportTemplateFromPackage

5. DataManagementTemplates-ImportTemplateFromPackageIgnoreMissingEntities

So let’s divide these five APIs into two categories.

  1. Import data (used to import your transaction and master data)
  2. Import template (helps create or update templates for sequential data import during migration)

Let’s start with the basic flow to import data using the package API.

1. DataManagementDefinitionGroups-ImportFromPackage

To start with the import, you need to create an import data project in D365.

Once you are done creating the data project, download its package, grab the manifest and package header files, and combine them with your data file to create a new zip file.
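
For reference, the repackaged zip typically has this shape. Manifest.xml and PackageHeader.xml are the standard DMF file names; the data file name below is just an example, and it must match what the manifest expects:

    MyImportPackage.zip
    ├── Manifest.xml        (entity, sequence, and mapping details from the data project)
    ├── PackageHeader.xml   (package metadata)
    └── Customers.csv       (your data file)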

Upload the same package to the input location that your flow is going to listen to.

The next step is to get an Azure writable URL to upload the package zip file.

To do so, you can use the “DataManagementDefinitionGroups-GetAzureWriteUrl” API.

This API returns a blob URL and a GUID (BlobId), so you can use the URL to upload the package file.
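
As a minimal sketch, the call looks like this (the environment URL and the uniqueFileName value are placeholders; the action path is the standard OData action):

POST https://<your-environment>.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl
Authorization: Bearer <access token>
Content-Type: application/json

{
    "uniqueFileName": "ImportPackage-<unique GUID>"
}

The response wraps the blob details in a JSON string inside the value property:

{
    "@odata.context": "https://<your-environment>.dynamics.com/data/$metadata#Edm.String",
    "value": "{\"BlobId\":\"<GUID>\",\"BlobUrl\":\"https://<storage-account>.blob.core.windows.net/...\"}"
}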

Schema for the Parse JSON step on the GetAzureWriteUrl output (this describes the inner object contained in the response’s value property):

{
    "type": "object",
    "properties": {
        "BlobId": {
            "type": "string"
        },
        "BlobUrl": {
            "type": "string"
        }
    }
}
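
Because the blob details arrive as a JSON string inside the response’s value property, convert that string to JSON before feeding it to Parse JSON. In Power Automate that is an expression along these lines (the action name Get_Azure_Write_Url is hypothetical; use the name of your own HTTP action):

json(body('Get_Azure_Write_Url')?['value'])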

The next step is to upload the package file to the blob URL using an HTTP PUT request.
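
A minimal sketch of that request, assuming the BlobUrl parsed in the previous step (the x-ms-blob-type header is required by Azure Blob storage when putting a new blob):

PUT <BlobUrl from the Parse JSON step>
x-ms-blob-type: BlockBlob

<binary content of the package zip file>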

Once the upload is done, you are ready to call the first import API.

The ImportFromPackage API needs six arguments (a sample request follows after this list).

  1. PackageUrl : The blob URL we used to upload the zip file.
  2. DefinitionGroupId : The name of the import data project in D365.
  3. Execute : Set this to Yes (use No if you just want to upload the package and hold execution). This again depends on your requirement.
  4. Overwrite : Whether you want to overwrite the previously imported file.
  5. LegalEntityId : The legal entity into which you want to import the data.
  6. ExecutionId : This is optional, not mandatory. Use it only if you want to keep separate execution IDs for integration and manual processes.

The output of this API is the execution ID, which can be used in the next API call.
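
As a minimal sketch with placeholder values (environment URL, project name, and legal entity are examples):

POST https://<your-environment>.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage
Authorization: Bearer <access token>
Content-Type: application/json

{
    "packageUrl": "<BlobUrl used for the upload>",
    "definitionGroupId": "MyImportProject",
    "executionId": "",
    "execute": true,
    "overwrite": true,
    "legalEntityId": "USMF"
}

The response’s value property holds the execution ID:

{
    "@odata.context": "https://<your-environment>.dynamics.com/data/$metadata#Edm.String",
    "value": "<execution ID>"
}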

2. DataManagementDefinitionGroups-ImportFromPackageAsync

Now let’s talk about the second import API. This is an asynchronous API designed to handle large volumes of data.

This API accepts three additional parameters.

FailOnError : This parameter lets you control whether the import should fail if any error occurs during the import process.

RunAsynchWithBatch : This gives you control to move the execution to a batch job for long-running queries and large data files.

ThresholdToRunInBatch : This again optimizes the process by defining the threshold at which the import switches to batch execution.
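
Since this action is not covered in the standard Microsoft documentation, treat the body below as an assumption: it extends the ImportFromPackage body with the three parameters above, and the exact casing of the extra parameter names is not confirmed:

POST https://<your-environment>.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackageAsync
Authorization: Bearer <access token>
Content-Type: application/json

{
    "packageUrl": "<BlobUrl used for the upload>",
    "definitionGroupId": "MyImportProject",
    "executionId": "",
    "execute": true,
    "overwrite": true,
    "legalEntityId": "USMF",
    "failOnError": true,
    "runAsynchWithBatch": true,
    "thresholdToRunInBatch": 1000
}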

3. DataManagementDefinitionGroups-ImportFromDMTPackage

This API was mostly used to build the DMT tool for AX 2009 to D365 data migration.

This is an asynchronous API that lets you define a specific batch group to optimize the data import.

At the batch group level you can control priority scheduling, so you can manage your system resources to give higher priority to specific files or interfaces.
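
This action is not in the standard documentation either, so the sketch below is hypothetical: it assumes the same body as the async variant plus a batch group parameter, and the name batchGroupId is my assumption rather than a confirmed contract:

POST https://<your-environment>.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromDMTPackage
Authorization: Bearer <access token>
Content-Type: application/json

{
    "packageUrl": "<BlobUrl used for the upload>",
    "definitionGroupId": "MyImportProject",
    "executionId": "",
    "execute": true,
    "overwrite": true,
    "legalEntityId": "USMF",
    "batchGroupId": "<your batch group>"
}

(batchGroupId here is a hypothetical parameter name, shown only to illustrate where the batch group would plug in.)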
