Data Management Concepts in D365 F&O: A Comprehensive Guide

Rumman Ansari, Software Engineer, 2024-07-28


Data Management Concepts

The data management framework consists of the following concepts:

  • Data entities - A data entity is a conceptual abstraction and encapsulation of one or more underlying tables. A data entity represents a common data concept or functionality, for example, customers or vendors. Data entities are intended to be easily understood by users who are familiar with business concepts. After data entities are created, you can reuse them through the Excel add-in, use them to define import/export packages, or use them for integrations.
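
Because each public entity is also exposed through the standard OData endpoint, you can read it with a few lines of code. Below is a minimal sketch in Python, assuming you already have an Azure AD bearer token and an environment URL; CustomersV3 is a public entity, and the selected field names are taken from it (verify them against your environment).

```python
# Minimal sketch: reading a data entity over the standard OData endpoint.
# BASE_URL and TOKEN are placeholders; acquire the token via OAuth beforehand.
import requests

BASE_URL = "https://yourenv.operations.dynamics.com"   # hypothetical environment
TOKEN = "<azure-ad-bearer-token>"

resp = requests.get(
    f"{BASE_URL}/data/CustomersV3",
    params={"$top": 5, "$select": "CustomerAccount,OrganizationName"},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
for customer in resp.json()["value"]:
    print(customer["CustomerAccount"], customer["OrganizationName"])
```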

  • Data project - A data project is created automatically when you select Import or Export in the Data management workspace, and it contains at least one job. A data project holds the configured data entities, including their mapping and default processing options. It lets users configure the entities that should be part of the project, define the format that is used for each entity, define the mapping from the source file to staging, and specify the default processing options.

  • Data job - A data job contains an execution instance of the data project: the uploaded files, the schedule or recurrence information, and the processing options to use for the job. A data job performs the actual import or export operation.

A data job is created when the import or export operation is performed: it creates an execution instance of the data project and runs it. Each data project can have one or more data jobs. For a one-off, unplanned import or export, such as a single file imported once, there is typically a single data project with a single data job. If the same data project is used multiple times with different data, each run is a separate data job for that one project. Each data job, in turn, can have one or more job histories.

For instance, a data job might be run multiple times after errors have been fixed. The job history tells you details such as the time taken to run the job, the number of records processed, and any errors encountered during processing.
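
To make these one-to-many relationships concrete, here is a small illustrative sketch in Python. It is not any D365 API, just plain dataclasses mirroring the hierarchy described above; all names are invented.

```python
# Illustrative data model: a project owns jobs, and each job owns its histories.
from dataclasses import dataclass, field

@dataclass
class JobHistory:
    execution_id: str
    duration_seconds: int
    records_processed: int
    errors: list[str] = field(default_factory=list)

@dataclass
class DataJob:
    job_id: str
    uploaded_files: list[str]
    histories: list[JobHistory] = field(default_factory=list)  # one or more per job

@dataclass
class DataProject:
    name: str
    entities: list[str]                                   # configured data entities
    jobs: list[DataJob] = field(default_factory=list)     # one or more per project

# One project run twice with different files results in two jobs:
project = DataProject("CustomerImport", ["Customers", "Customer groups"])
project.jobs.append(DataJob("job-1", ["customers_batch1.csv"]))
project.jobs.append(DataJob("job-2", ["customers_batch2.csv"]))
```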

  • Job history - Once a job has been run, you can view its job history, which contains the run history for each execution of the data job and the history of the data movement from source to staging and from staging to target.

The Job history tab in the Data management workspace shows all job histories for the import and export data projects. From Job history, you can view the run history for the source-to-staging and staging-to-target steps. Each project can have multiple jobs, which in turn have executions. By using the job history, you can view the execution details, such as the time it took to run the job and the number of records that were processed.
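
If you want to check an execution's status programmatically instead of in the workspace, the data management package REST API exposes a GetExecutionSummaryStatus action. The following is a hedged polling sketch; the environment URL, token, and execution ID are placeholders, and the status values shown are the ones the API typically reports.

```python
# Poll the GetExecutionSummaryStatus action until a data job finishes.
import time
import requests

BASE_URL = "https://yourenv.operations.dynamics.com"   # hypothetical environment
TOKEN = "<azure-ad-bearer-token>"
ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus")

def wait_for_job(execution_id: str, poll_seconds: int = 10) -> str:
    """Return the final status once the job leaves 'NotRun'/'Executing'."""
    while True:
        resp = requests.post(
            BASE_URL + ACTION,
            json={"executionId": execution_id},
            headers={"Authorization": f"Bearer {TOKEN}"},
        )
        resp.raise_for_status()
        status = resp.json()["value"]  # e.g. Succeeded, PartiallySucceeded, Failed
        if status not in ("NotRun", "Executing"):
            return status
        time.sleep(poll_seconds)
```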

  • Data package - A data package is a single compressed file that contains a data project manifest and the data files. It is generated from a data job and is used to import or export multiple files together with the manifest. Data packages are key concepts for many application lifecycle management scenarios, such as copying configurations and data migration. Once a data project is defined, including its data entities and the mapping and sequencing between those entities, you can create a data package and use it to move the definition of the data project from one environment to another.

Users can generate a data package from a data job. To create a data package, go to the Data management workspace, load the project that you want to create the data package for, and then select Download. This generates a zip file.

Figure: Data Management Concepts - Download

The zip file contains the package header and the manifest. The manifest defines the settings of the data project. The data package can be used to copy the settings of your data project from one environment to another.
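
As a quick sanity check, you can inspect a downloaded package with a few lines of Python. The file names used here (Manifest.xml, PackageHeader.xml) follow the layout that framework-generated packages typically use; treat them as assumptions and check the actual archive listing first.

```python
# Peek inside a downloaded data package zip file.
import zipfile

with zipfile.ZipFile("CustomerImport.zip") as package:
    print(package.namelist())        # e.g. ['Manifest.xml', 'PackageHeader.xml', 'Customers.xlsx']
    with package.open("Manifest.xml") as manifest:
        print(manifest.read()[:500]) # entity list, mapping, and sequencing settings
```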

Data management platform

By using the data management framework, you can quickly migrate reference, master, and document data from legacy or external systems. The framework imports data into a staging area, where you can run basic data-quality and validation operations and cleanse the data before it is moved to the target.

The data management platform also has features that allow you to map data from input to the target and do pre- and post-processing on data. If you export data, the source is Finance and Operations apps, and if you import data, the target is Finance and Operations apps.

The framework is intended to help you quickly migrate data by using the following features:

  • You can select only the entities you need to migrate.

  • If an import error occurs, you can skip the affected records and proceed with only the good data, then fix and import the bad data later. The reported errors make it quick to find the bad records.

  • You can move data entities straight from one Finance and Operations apps system to another, without having to go through Excel or XML (see the package API sketch after this list).

  • Data imports can be scheduled by using a batch job, which gives you flexibility over when the import runs. For example, you can migrate customer groups, customers, vendors, and other data entities in the system at any time.
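
As referenced above, moving data between environments can also be scripted with the data management package REST API (ExportToPackage, GetExportedPackageUrl, ImportFromPackage). The sketch below assumes two environment URLs, valid tokens, a data project named CustomerImport, and the USMF legal entity; all of these are illustrative.

```python
# Hedged sketch: export a package from one environment, import it into another.
import uuid
import requests

SOURCE_URL = "https://source-env.operations.dynamics.com"   # hypothetical
TARGET_URL = "https://target-env.operations.dynamics.com"   # hypothetical
SOURCE_TOKEN = "<token-for-source-environment>"
TARGET_TOKEN = "<token-for-target-environment>"

def act(base_url: str, token: str, name: str, payload: dict) -> dict:
    """Call a data management package API action and return its JSON body."""
    resp = requests.post(
        f"{base_url}/data/DataManagementDefinitionGroups/"
        f"Microsoft.Dynamics.DataEntities.{name}",
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()

# 1. Export the data project as a package from the source environment.
execution_id = str(uuid.uuid4())
act(SOURCE_URL, SOURCE_TOKEN, "ExportToPackage", {
    "definitionGroupId": "CustomerImport",   # the data project name (assumed)
    "packageName": "CustomerImport.zip",
    "executionId": execution_id,
    "reExecute": False,
    "legalEntityId": "USMF",
})

# 2. Fetch the exported package URL. In practice, poll GetExecutionSummaryStatus
#    first so the export has actually finished before asking for the URL.
package_url = act(SOURCE_URL, SOURCE_TOKEN, "GetExportedPackageUrl",
                  {"executionId": execution_id})["value"]

# 3. Import into the target. Depending on storage access, you may need to
#    download the file and re-upload it to the target's blob storage first.
act(TARGET_URL, TARGET_TOKEN, "ImportFromPackage", {
    "packageUrl": package_url,
    "definitionGroupId": "CustomerImport",
    "executionId": "",        # blank lets the framework generate one
    "execute": True,
    "overwrite": True,
    "legalEntityId": "USMF",
})
```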

The data management framework supports using data entities in the following core data management scenarios:

  • Data migration - You can migrate reference, master, and document data from legacy or external systems.

  • Set up and copy configurations - You can use this scenario to copy configuration between companies or environments and to configure processes or modules by using the Lifecycle Services (LCS) environment.

  • Integration - Use this scenario when you need real-time, service-based integration or asynchronous integration. Real-time integration has no staging area; requests are processed directly by the services layer, as in the sketch that follows this list.
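
For the real-time pattern, a record is written directly through an entity's OData service with no staging step. A minimal sketch, assuming the public CustomerGroups entity and illustrative field values:

```python
# Real-time, service-based write: no staging area, handled by the services layer.
import requests

BASE_URL = "https://yourenv.operations.dynamics.com"   # hypothetical environment
TOKEN = "<azure-ad-bearer-token>"

resp = requests.post(
    f"{BASE_URL}/data/CustomerGroups",
    json={
        "dataAreaId": "USMF",          # legal entity (assumed)
        "CustomerGroupId": "NEW",
        "Description": "Created via a real-time OData call",
    },
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()                # expect 201 Created on success
```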

Data migration

This is an initial or unplanned data load that is performed manually through the user interface. A typical scenario is a functional user who has data in a source, such as an Excel workbook, that must be imported from a legacy system into Finance and Operations apps during data migration.

Data flow

The first step is to load the file from the source into central storage, such as Azure Blob storage. The data import/export framework then picks up the data from central storage and pushes it into the staging tables of Finance and Operations apps. From staging, the data is moved to the target by using the data entities that have been defined. This movement can run either row by row or as set-based operations across the entire underlying tables of each data entity. The sequence and order in which the target tables are populated can be controlled by using the entity sequencing feature of Finance and Operations apps. The sketch below walks through this flow end to end.
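
Here is a hedged end-to-end sketch of that import flow using the data management package REST API: request a writable URL in central storage, upload the package, start the import (source to staging to target), and check the status. URLs, project name, and legal entity are placeholders, and the GetAzureWriteUrl response parsing reflects the commonly documented shape (a JSON string carrying BlobId and BlobUrl).

```python
# End-to-end import: upload to central storage, then source -> staging -> target.
import json
import uuid
import requests

BASE_URL = "https://yourenv.operations.dynamics.com"   # hypothetical environment
TOKEN = "<azure-ad-bearer-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
ACTIONS = (f"{BASE_URL}/data/DataManagementDefinitionGroups/"
           "Microsoft.Dynamics.DataEntities")

def action(name: str, payload: dict) -> dict:
    resp = requests.post(f"{ACTIONS}.{name}", json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

# 1. Ask the framework for a writable blob URL in central (Azure) storage.
write_info = json.loads(
    action("GetAzureWriteUrl", {"uniqueFileName": f"pkg_{uuid.uuid4()}"})["value"]
)

# 2. Upload the data package file to that URL.
with open("CustomerImport.zip", "rb") as f:
    requests.put(write_info["BlobUrl"], data=f,
                 headers={"x-ms-blob-type": "BlockBlob"}).raise_for_status()

# 3. Start the import; the framework copies source -> staging, validates, then
#    moves staging -> target using the entity mappings and sequencing.
execution_id = action("ImportFromPackage", {
    "packageUrl": write_info["BlobUrl"],
    "definitionGroupId": "CustomerImport",   # existing data project (assumed)
    "executionId": "",                       # blank lets the framework assign one
    "execute": True,
    "overwrite": True,
    "legalEntityId": "USMF",
})["value"]

# 4. Check progress (see the earlier GetExecutionSummaryStatus sketch).
print(action("GetExecutionSummaryStatus", {"executionId": execution_id})["value"])
```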
