FieldOne Sky to Dynamics 365 for Field Service Migration Tool

Introduction

The Dynamics 365 for Field Service Migration Tool is provided to help customers using FieldOne Sky move to Dynamics 365 for Field Service. This tool assists with migrating solutions, mobile projects, and data required for Dynamics 365 for Field Service. Use of this tool in conjunction with the documentation provided will help speed up and simplify the migration process.

There are certain requirements to perform this process. To start, the following three environments should be identified:

  • Solution Source: The solution source is the environment where unmanaged customizations exist that need to be migrated.
  • Data Source: The data source is the environment where data exists that needs to be migrated and may be the same environment as the solution source.
  • Target: The target is the environment where the upgraded solution and data will be migrated.

The following version requirements exist for source and target environments:

  • Source:
    • Dynamics CRM 7.1 or later
    • FieldOne Sky 5.1.0107.611 or greater
    • Resco MobileCRM Woodford 9.1.4.0 (you must upgrade Woodford to 9.1.4.0 and publish the mobile project to version 9.1 before migrating)
  • Target:
    • Dynamics 365 for Customer Engagement Fall 2016 release or later

Once the CRM organizations are finalized, the process outlined below should be performed to do the migration:

This primarily includes:

  1. Uninstall the Field One Sky solution
  2. Install the Field Service solution
  3. Transform unmanaged CRM solutions from the Source organization (and/or)
  4. Transform Resco mobile projects from the Source organization
  5. Transform and migrate all data from the Source to the Target organization

It is important to uninstall the Field One Sky solution and all of its dependencies because, in Field Service, all the schema names have been renamed, and some metadata and components are deprecated and not available in the Field Service solution. Installing Field Service provides the schema required to migrate Field One Sky customizations to Field Service.

Solution and mobile migrations transform all the schema customizations in the CRM organization from Field One Sky metadata to Field Service. While preparing the solutions that have to be transformed and migrated via this tool, refer to the companion document "Dynamics 365 for Field Service Migration Guide".

Data migration transforms and migrates data from all entities in the Source CRM organization to the Target CRM organization.

Custom code such as plugins, workflow assemblies, JavaScript, integrations, and portals will need to be reviewed by a developer to ensure these components are utilizing the latest schema from Dynamics 365 for Field Service. It is recommended to perform a user acceptance-testing event to ensure all functionality is working prior to go live.

Field One Sky Migration Tool Tour

To start using the migration tool, execute the Dynamics 365 for Field Service Migration Tool executable (Dynamics 365 for Field Service Migration Tool.exe) available in the tool release kit. The following tool UI opens:

Figure 1 Dynamics 365 for Field Service Migration Tool

Tool Ribbon and Tabular Items

Most of the UI options are self-explanatory. Below is some quick information about each option, in the order in which they should be performed; detailed information about these options is available in the next sections:

  • Select(/Change): This is the first step to be performed before starting any migration activities. By clicking Select, you identify a folder location (referred to as the 'Project') where all logs and the migration cache will be saved. It is important to use the same folder per migration, as all project-related configurations are stored under this directory.
  • Connect(/Change) Source: Connect to the Source organization, or change the previously selected Source organization.
  • Connect(/Change) Target: Connect to the Target organization, or change the previously selected Target organization.
  • Manage Target Organization UI: Once CRM connections are established, the target organization has to be analyzed. If business processes, global audit, duplicate detection, or state transition rules are enabled, they have to be disabled/deactivated first, as these can cause issues with the migration process. Field One Sky and Field Service version information is also available on this UI.

Figure 2 Manage Target Organization layout

  • Uninstall Field One Sky solution and dependencies: The Field One Sky solution and all its managed and unmanaged dependencies must be removed first to provide a clean environment for the Field Service solution installation.

Figure 3 Uninstall Dependencies layout

  • Install Field Service: The Field Service solution zip file is included in this tool release. Clicking this button installs that version of Field Service in the Target CRM organization.
  • Create(/Update) Maps: Once the Source CRM organization has the Field One Sky solution installed and the Target CRM organization has the Field Service solution installed, mappings must be generated. These cover all the schema name changes as well as any metadata deprecations between the Field One Sky and Field Service solutions.
  • Transform Solutions: Any unmanaged custom solutions related to Field One Sky should be transformed and imported into the Target CRM org. The transformed schemas comply with the Field Service solution metadata and should import into the Target org.

Figure 4 Transform Solutions layout

  • Transform Mobile Projects: Once solution schemas are transformed and imported into the Target CRM org, all Resco mobile projects can be transformed and imported. This step is optional and needed only if your organization has Resco mobile projects.

Figure 5 Transform Mobile Project layout.

  • Migrate Data: This migrates all the data from the Source organization to the Target organization. This step is important because uninstalling the Field One Sky solution also deletes its data; this step brings all the data from the Source org to the Target org.

Figure 6 Migrate Data layout

  • Post Data Migration Tasks: This runs the required business logic that would usually be performed by plugins and workflows. It is necessary to run these tasks because plugins and workflows are disabled while migrating data from the Source org to the Target org. Note that post data migration tasks should be run only once, after data migration is completed and finalized; multiple runs would introduce inconsistent and duplicate data.

Figure 7 Post Data Migration layout.

Menu Items

Menu item configurations are important to the migration activity. Below is a high-level description of the menu items: File, Edit, View, and Help.

File: On clicking the File menu, the following configurations can be performed:

Figure 8 File options

  • Open Directory: Opens the present working directory.
  • Open Recent: Lists recently used working directories. This can be used to easily switch between projects. Note that when changing project directories, you will have to reconnect to the source and target.
  • Close: Closes the current working directory.
  • Exit: Exits the tool.

Figure 9 File options

Edit: On clicking the Edit menu the following options are available

Figure 10 Edit options

  • Project Settings/Project Advanced Settings: These are project-level configurable settings to handle data migration effectively. All configurations are explained in detail in the next sections of this document.
  • Options: The log level is selected from the list: LogAlways, Critical, Error, Warning, Informational, Verbose.

View: On clicking the View menu, the following options are available:

Figure 11 View options

  • Working Directory: Opens the present working directory.
  • Working Data Cache: Opens the present data cache.
  • Working Data Logs: Opens the present data logs (for data migration).
  • App Logs: Opens the present app logs. These logs have all the information related to establishing CRM connections and performing solution and mobile migrations.

Figure 12 View options description.

Help:

Figure 13 Help options

  • About: Information about the Field Service Migration Tool.
  • Documentation: Opens the tool documentation and links to access the documents.

Data Migration Settings

Data Migration Configuration

  • Override CreatedOn (default: Enabled): When this option is enabled, each record's createdon field holds the same value as in the source organization record instead of the current timestamp. This helps preserve createdon information in the target environment.
  • Override Owner (default: Enabled): When this option is enabled, each record's owner field holds the same value as in the source organization record instead of the user performing the migration. This helps preserve owner information in the target environment.
  • Retry Count (default: 2): The number of passes to attempt when migrating data. At least one attempt is required to migrate the data. When the data contains complex relationships, it can be beneficial to increase the retry count, as an additional pass may be required to complete the data migration process.
  • Clear Cache (default: Disabled): During the data migration process, a cache is maintained that tracks progress, which is helpful in case of failures. When this option is enabled, the cache is not maintained in the project directory.

Figure 14 Data Migration Settings
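The Retry Count setting exists because records can reference other records that have not been migrated yet. A minimal Python sketch (illustrative only, not the tool's implementation) of why a second pass resolves such records:

```python
# Conceptual sketch of multi-pass migration: a record whose lookup target
# has not been migrated yet fails on the first pass and succeeds on a
# later pass, once the referenced record exists. Record shapes below are
# illustrative, not the tool's internal data model.

def migrate_with_retries(records, retry_count):
    """Attempt each record on up to retry_count passes; a record succeeds
    only once every record it references has already been migrated."""
    migrated = set()
    for _ in range(retry_count):
        for record in records:
            if record["id"] in migrated:
                continue
            # A record can be created only after its lookup targets exist.
            if all(ref in migrated for ref in record["refs"]):
                migrated.add(record["id"])
    return migrated

# "B" references "A"; processed in this order, B fails on pass 1 because
# A does not exist yet, then succeeds on pass 2.
records = [
    {"id": "B", "refs": ["A"]},  # child record, references A
    {"id": "A", "refs": []},     # parent record, no references
]
assert migrate_with_retries(records, retry_count=1) == {"A"}
assert migrate_with_retries(records, retry_count=2) == {"A", "B"}
```

This is why deeply chained relationships may need a retry count above the default: each additional pass resolves one more level of not-yet-migrated references.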

Configuration for multiple runs

Following configuration needs to be considered each time you run the data migration.

  • Migrate modified records only (default: Disabled): Migrates data that has been modified on or after the date specified and/or on or before the date specified. If the "modified on or before" date is left blank, it defaults to the current date. This can be useful for keeping a live production environment's data in sync with the target environment during the migration process.
  • Resume from the previous data migration (default: Disabled): Continues the data migration process from the previous run. During the data migration process, a cache is maintained that tracks progress; if the process is interrupted, the migration can resume from the cache. This feature will not work if you have disabled the cache setting.
  • Ignore Fatal Exceptions and Continue to Next Iteration (default: Enabled): Allows the data migration process to continue with the next iterations even if the current iteration had a fatal exception for a given entity.

Figure 15 Data Migration configuration for multi-run.
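The date window used by "Migrate modified records only" can be sketched as follows (assumed behavior inferred from the description above, not the tool's code):

```python
# Illustrative sketch of the "Migrate modified records only" window:
# a record is included when its modifiedon value falls within
# [from_date, to_date], and a blank "modified on or before" bound
# defaults to the current date.
from datetime import datetime

def in_migration_window(modified_on, from_date, to_date=None):
    if to_date is None:  # blank upper bound defaults to "now"
        to_date = datetime.now()
    return from_date <= modified_on <= to_date

cutoff = datetime(2017, 1, 1)
assert in_migration_window(datetime(2017, 3, 15), cutoff)      # changed after cutoff
assert not in_migration_window(datetime(2016, 12, 1), cutoff)  # unchanged since cutoff
```

Running the migration once in full, then re-running with this option enabled and a cutoff of the first run's start time, is the pattern the setting's description suggests for keeping a live source in sync.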

Performance Configuration

  • Job Max Threads (default: 4): Controls the degree of parallelization for steps being processed within a job (i.e., how many entity steps will be processed simultaneously). Setting this to 1 can be helpful for debugging. Be careful not to increase this setting too high, since each step also parallelizes operations; doing so could actually degrade performance as the process becomes starved for resources. Generally, a setting between 4 and 8 has proven most effective on average client processor hardware.
  • Service Connection Limit (default: 1000): Increases the maximum remote connection limit on the ServicePoint for the source/target organization endpoints. The .NET default for each endpoint/process is 2; if this value is not increased, it will throttle the degree of parallelization at the client transport layer. Reference ServicePoint.ConnectionLimit.
  • Service Connection Timeout (default: 1200000): Allows increasing the default timeout duration (2 minutes) on the client channel. The default value is the millisecond (ms) equivalent of 20 minutes. Reference ServiceProxy<TService>.Timeout.
  • Service Expect100Continue (default: False): False disables the default 100-Continue transport behavior on the service point manager for the source/target organization endpoints; True enables it. Disabling this behavior is an optimization because the client sends the entire payload in a single round-trip, rather than first sending a partial payload to verify the connection before sending the remaining payload, which results in multiple round-trips. The latter is beneficial for client/server scenarios with a pessimistic approach where requests are often expected to be rejected. Reference ServicePoint.Expect100Continue.
  • Service UseNagleAlgorithm (default: False): False disables the default use of the Nagle algorithm, avoiding unnecessary client packet buffering for a process with inherently high request volumes and potentially large request payloads; True enables the Nagle algorithm. Reference ServicePoint.UseNagleAlgorithm.
  • Step Execute Multiple Size (default: 10): The number of requests batched together in a single ExecuteMultipleRequest. Sending requests in batches reduces the number of round-trips between client and server, lowering the cumulative latency penalty, and also avoids reaching API limits. DO NOT increase this value above 10 unless your organization has had the maximum allowed concurrent batch request setting increased; otherwise, your organization will be throttled to 2 concurrent requests and receive a "Server Busy" exception. Reference run-time limitations. Any value greater than zero causes requests to be batched; values equal to (or less than) zero revert to parallelization of individual organization requests.
  • Step Max Records to Retrieve (default: 15000): Controls the maximum number of rows to be returned from the source organization when retrieving data during a migration job. When this setting is higher than the "Step Records Per Page" setting, it allows the process to fetch more data than is possible via a single Dynamics 365 query request, which can be beneficial by reducing the overhead of parallelizing smaller datasets. For example, if the max records to retrieve is set to 15000 and the records per page setting is 5000, then three query requests to fetch sequential pages of data will be performed prior to transformation and submission of the parallelized requests to the target organization. Note: higher values mean more entity data will be held in memory and could lead to an out-of-memory exception. Also, progress state is saved after submitting data to the target; thus, higher values mean more progress could be lost in flight if a migration is cancelled or terminated prematurely.
  • Step Records Per Page (default: 5000): Controls the page size of rows to be returned from the source organization by the query expression when retrieving data during a migration job. The maximum value is 5000.
  • Step Max Threads (default: 8): Not currently used. This setting was intended to control the degree of parallelism within an entity step for individual data requests being submitted to the target organization. Currently, each entity step uses unconstrained parallelism, allowing the .NET framework to manage partitioning and use of the thread pool.
  • Step Throttled Max Records to Retrieve (default: 500): Same as the "Step Max Records to Retrieve" setting, but applied as a throttle to the post-stage migration entities such as annotations/attachments, which are expected to have a larger payload size. The throttled value is a defensive measure to avoid out-of-memory exceptions and to allow progress to be saved more often, given the longer transmit/process duration of larger payloads.
  • Step Max Records Per Page (default: 10): Same as the "Step Records Per Page" setting, but applied as a throttle to the post-stage migration entities such as annotations/attachments, which are expected to have a larger payload size. The throttled value is a defensive measure against exceeding response length limits bound to the service endpoint and/or client channel. For example, a single attachment record could be as large as 32 MB.

Figure 16 Data Migration Performance Configuration
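The interaction between the paging and batching settings can be sanity-checked with simple arithmetic. A small Python sketch (illustrative only, using the documented defaults) mirrors the worked example in "Step Max Records to Retrieve":

```python
# Back-of-the-envelope math for the paging/batching settings: 15000 rows
# at 5000 rows per page means three sequential source-side queries, and at
# 10 requests per ExecuteMultiple batch, those rows are submitted to the
# target as 1500 batched calls.
import math

def migration_round_trips(total_rows, records_per_page=5000, batch_size=10):
    pages = math.ceil(total_rows / records_per_page)  # source-side query requests
    batches = math.ceil(total_rows / batch_size)      # target-side ExecuteMultiple calls
    return pages, batches

pages, batches = migration_round_trips(15000)
assert pages == 3
assert batches == 1500
```

The same arithmetic explains the throttled post-stage settings: with 500 records to retrieve and 10 records per page, each cycle moves a much smaller (but much heavier) slice of annotation/attachment data.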

Logging

The tool supports both flat-file and SQL logging.

File Logging

The two logs below are very useful during different migration phases.

  • App Logs: These logs have all the information related to establishing CRM connections, uninstalling the Field One Sky solutions and their dependencies, installing Field Service, generating maps, and transforming solutions and mobile projects.
  • Working Data Logs: These logs have all the information related to the data migration phase.

Navigation to Logs: As shown in the picture, navigate from the View menu to Working Data Logs and App Logs for the corresponding data migration and app logs.

Figure 17 Navigation from View menu to logs

Configuring Logging Level: As shown in the picture below, different levels of logging can be configured in Options under the Edit menu.

Figure 18 Configuring logging level.

SQL Logging

SQL logging can be configured and enabled for data migration. This logging is optional and disabled by default. SQL logging records all the Warning and Error messages that are also logged in the text log file located under the "Working Log Data" folder. The purpose of this logging is to give the user the option to store data migration logs and retrieve all the error messages using SQL queries. It is easier to fetch all the error messages for a particular entity via a SQL query than to browse through different sets of file-based log messages.

SQL logging can be enabled and configured in SQL Connection Settings under Project Settings in the Edit menu. By default, the SQL logging option is disabled, with field values defaulting to the local database.

A database instance can be set up in two different ways:

  • Using the Logging.mdf file in the SQLLogging folder, available with the release. This creates a local database instance on the machine.
  • Setting up your own database instance and providing its connection details for logging.

SETTING UP DATABASE USING logging.mdf

  • Navigate to Server Explorer under the View menu in Visual Studio, if installed; otherwise, install SQLLocalDB.msi first.
  • Right-click on Data Connections and click 'Add Connection' to add a new database connection.
  • Browse to the database file available under the <Migration Tool Release Folder>\SQLLogging folder.

Figure 19 Configuring database using logging.mdf

Click on Test Connection to ensure the connection is established successfully.

SETTING UP YOUR OWN DATABASE

  • Navigate to SQL Server Object Explorer in Visual Studio, if installed; otherwise, install SQLLocalDB.msi first.
  • Right-click on SQL Server to add a new SQL Server, and connect to the (localdb)\MSSQLLocalDB server.
  • To add a new database, right-click on the server and click New Query.
  • Copy and paste the CreateSemanticLoggingDatabase.sql query from the SQLLogging folder into the new query window.
  • By default, executing this query creates a database named 'Logging'.
  • To create the database with a different name, replace 'Logging' with the new name in the query.
  • Click Refresh to see the results.
  • Right-click on the newly created database and click New Query to create a table in the database.
  • Copy and paste the CreateSemanticLoggingDatabaseObjects.sql query from the SQLLogging folder. By default, this creates a table named 'Traces'.
  • To create the table with a different name, replace 'Traces' with the new name in the query.
  • On execution, this creates the table. Refresh to see the results.

CONFIGURING SQL LOGGING

  • Navigate to SQL Connection Settings under Project Settings in the Edit menu.
  • By default, SQL logging is disabled, with field values defaulting to the local database (created using logging.mdf).
  • Provide the SQL Server name in SQL Instance Name.
  • Right-click on the server and click Properties to get the connection string.
  • Enter the name of the table created in the database in Table Name, and save the changes.

Figure 20 Configuring SQL Logging.

ANALYSING LOGS

After completing the migration, to view the generated logs, right-click on the table name and click Show Table Data.

Figure 21 Traces Table

Logging table sample

Figure 22 Logging Table

Logs stored in the table capture the following details:

  • Id: A unique id for this log message. This is an identity column in the table.
  • InstanceName: The name of the instance that logged the message.
  • ProviderId: A unique identifier for the event source that logged the message.
  • ProviderName: The friendly name of the event source that logged the message.
  • EventId: A unique identifier for this event type. This id is specified using the EventId property of the Event attribute that decorates the log method in your custom event source.
  • EventKeywords: An integer value that represents the value of the Keywords property of the Event attribute that decorates the log method in your custom event source. If multiple keywords are assigned to a log method, this value represents the result of an OR of the keyword values.
  • Level: The integer value of the Level property of the Event attribute that decorates the log method in your custom event source. See the EventLevel enumeration on MSDN.
  • Opcode: The value of the Opcode property of the Event attribute that decorates the log method in your custom event source. Valid values are defined by the EventOpcode enumeration.
  • Task: A task identifier. You can optionally assign task identifiers to the log messages in your custom event source by using the Task property of the Event attribute that decorates the log method in your event source.
  • Timestamp: A timestamp that records when the log message was written. The timestamp includes a UTC offset value.
  • Version: A version number. You can optionally assign a version number to the log messages in your custom event source by using the Version property of the Event attribute that decorates the log method in your event source.
  • ProcessId: The identifier assigned to the process where the event was raised. This value is a nullable integer. It is available only when the application is running in full trust mode.
  • ThreadId: The identifier assigned to the thread that raised this event. This value is a nullable integer. It is available only when the application is running in full trust mode.
  • ActivityId: A GUID identifier for the current activity in the process for which the event is involved.
  • RelatedActivityId: A GUID identifier for a related activity in a different process that is related to the current process.
  • FormattedMessage: The formatted message written to the log. It is the value of the Message property of the Event attribute that decorates your log messages in your custom event source, with the placeholders replaced by values from the event payload.
  • Payload: The parameter values passed to the custom log method in your custom event source. These are displayed in JSON format. Any custom log methods that have no parameters will show "{}".

Figure 23 Logging Table Attributes

Querying Logging Table: The generated logs can be further queried for specific information. To add a new query, under Tables, right-click on the table name and click New Query.

Sample query 1: Select * from Traces1 where Payload like '%entityName%' and Payload like '%systemuserroles%'

Figure 24 Query result for query 1

The above query provides all the Warning/Error log messages for the "systemuserroles" entity.

Sample query 2: Select * from Traces1 where Payload like '%entityName%' and Payload like '%systemuserroles%' and Level = 3

Figure 25 Query result for query 2

The above query provides all the Error (Level = 3) log messages for the "systemuserroles" entity.
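Such filters can also be composed programmatically. A small hypothetical Python helper that builds the same LIKE-filter queries as the samples above (the table name Traces1 and the entityName payload key mirror the samples; the helper itself is not part of the tool):

```python
# Hypothetical helper that composes the LIKE-filter query used in the
# sample queries above, for a given entity and an optional log level.
def traces_query(entity, table="Traces1", level=None):
    query = (f"Select * from {table} where Payload like '%entityName%' "
             f"and Payload like '%{entity}%'")
    if level is not None:
        query += f" and Level = {level}"
    return query

# Reproduces sample query 1 and sample query 2 exactly.
assert traces_query("systemuserroles") == (
    "Select * from Traces1 where Payload like '%entityName%' "
    "and Payload like '%systemuserroles%'")
assert traces_query("systemuserroles", level=3).endswith("and Level = 3")
```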

NOTE: If you have multiple projects configured on the same machine and want to enable SQL logging for each project, it is advised to create a separate database instance per project; then, in the project configuration (Edit > Project Settings), you can provide the database name as well as the connection string details.

Migration Process

The sections below explain each step to be performed during migration in detail:

Project Management

A Project is a customer-specific directory location where all the configurations related to the migration from FieldOne Sky to Dynamics 365 for Field Service are stored. On selecting the project folder, the migration tool creates a repository with a set of files and folders:

Figure 26 Project Management Directory

Creating a separate project directory for each customer keeps all the migration-related information, such as migrated solutions, the data migration cache, project settings, and logs, in one place and allows the tool to continue from the last processed migration step. It is therefore very important to use the same project directory if you must run the migration tool multiple times to migrate the same customer environment. Treat multiple environments such as DEV, TEST, PPE, and PROD as different environments, and create a separate project for each.

Connection Management

Source Organization

The source organization for solutions and mobile migration should be a development environment that contains the Field One Sky solution and all unmanaged customizations.

The source organization for data migration should be a production environment. The source organizations for both solutions and data must have the same schema. To ensure the schema is synchronized, it is recommended that ongoing development efforts are finalized, and a code freeze is put in place before starting the migration project.

Connect Source

Connects to the Source CRM organization. A green light is shown when the connection is established successfully. The user connecting to the organization must have the Read privilege on solutions in the CRM organization (the prvReadSolution privilege).

Figure 27 Connect to Source Dialog.

Figure 28

Once the connection to the Source CRM organization is established, you can still change the organization by clicking the button to change the Source CRM organization. You should update maps after changing the Source organization. Also, changing the source or target org for a project clears the previous task results stored in the project file, so you will not be able to see the migration results from the previous attempt.

Target Organization

Target Organization is the environment where the upgraded solution and data will be migrated.

Connect Target: Connects to the Target CRM organization. A green light is shown when the connection is established successfully. The user connecting to the organization must have the Read privilege on solutions in the CRM org (the prvReadSolution privilege).

Figure 29 Connect to target dialog.

Figure 30

Once the connection to the Target CRM organization is established, you can still change the organization by clicking the button to change the Target CRM organization. You should update maps after changing the Target organization. Also, changing the source or target org for a project clears the previous task results stored in the project file, so you will not be able to see the migration results from the previous attempt.

STEPS TO CONNECT

  • First, ensure that the deployment type of your CRM application is Online.
  • If you are using a CRM Online instance, you will need to determine the online region. Choose the online region from the dropdown depending on your Discovery Web Service URL.
  • After you choose the region for your CRM instance, enter the Office 365 user account and password that you use to log in to CRM Online.

Figure 31 Steps to connect

  • Select the Organization you would like to connect to from the list of available organizations.

Figure 32 CRM Organizations

NOTE: When the working directory is not changed and 'Display list of available organizations' is unchecked, the user will be connected to the previously connected organization.

Analyze Target

This performs analysis on the connected target Dynamics 365 organization. It checks for activated processes and a few other enabled settings that may interfere with migration activities.

Migration activities require the target analysis to be performed as a prerequisite.

Figure 33 Analyze Target

The following features are covered under this step:

  • Plugins and workflows are custom business logic implemented to enhance or modify the standard features and behavior of CRM. These business processes should be deactivated to avoid interference during the migration activity and reactivated once the migration is completed. It should be assumed that the data being migrated from the source is already in the desired state; thus, plugin/workflow logic may at best be redundant and at worst cause data integrity issues.
  • The auditing feature tracks changes made to data in CRM. It also tracks each time a user logs into the system. If auditing is enabled, CRM automatically creates logs for the changes that are tracked. For migration purposes, global auditing should be disabled.
  • Duplicate detection services allow you to create rules that check for duplicate records so that you can decide which records to keep, merge, or delete. For migration purposes, this feature should be disabled; you can re-enable it on completion of the migration activity.
  • State transition rules are an optional additional level of filtering that defines what the status reason value can be changed to for each status reason. Defining a limited list of valid options can make it easier for people to choose the correct next status reason for a record when there are many combinations of valid status reason values. If any such rule is detected, it must be disabled before moving ahead with the migration work. State transition rules can only be modified in the Dynamics 365 web app administrative area; they cannot be modified via the tool directly.

NOTE: Any of the actions performed under the Analyze section are saved in your project home folder in a file named processactivations.json. The tool can enable/reactivate the business processes, global audit, and duplicate detection rules that were disabled/deactivated via the migration tool and recorded in processactivations.json.
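The bookkeeping this note describes can be sketched as a simple save/restore round trip. The JSON schema below is hypothetical; the tool's actual processactivations.json format may differ:

```python
# Hedged sketch of saving the ids of deactivated processes to a JSON file
# so they can be reactivated after migration. The {"deactivated": [...]}
# schema and the GUID strings are illustrative, not the tool's format.
import json
import os
import tempfile

def save_deactivated(path, process_ids):
    with open(path, "w") as f:
        json.dump({"deactivated": process_ids}, f)

def load_deactivated(path):
    with open(path) as f:
        return json.load(f)["deactivated"]

path = os.path.join(tempfile.gettempdir(), "processactivations.json")
save_deactivated(path, ["workflow-guid-1", "plugin-step-guid-2"])
assert load_deactivated(path) == ["workflow-guid-1", "plugin-step-guid-2"]
```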

Uninstall Dependencies

Uninstall Field One Sky from Target CRM Org:

Figure 34 Uninstall Dependencies

To uninstall the Field One Sky solution from the target, all the dependencies, managed as well as unmanaged components/solutions, of FieldOneSky_patch, FieldOneSky_base, and FieldOneSky should be uninstalled.

If the 'Uninstall managed solution dependencies' checkbox in the 'Uninstall Dependencies' window is checked, the tool uninstalls all the dependencies; otherwise, it only lists the managed dependencies without uninstalling them and uninstalls all unmanaged dependencies.

The 'Uninstall' button is enabled only if the target organization has Field One Sky solution installed in it.

After all Field One Sky solution dependencies are handled, only the FieldOneSky_patch and FieldOneSky_base solutions are uninstalled via this tool; the Field One Sky solution itself must be uninstalled manually from the target CRM UI.

Once Field One Sky is uninstalled manually, click on the Analyze button to refresh the results.

If for some reason, all the solutions must be uninstalled manually, then the following order must be followed:

  • FieldOneSky_patch
  • FieldOneSky_base
  • Field One Sky

The following component type dependencies are handled by the tool's uninstall feature:

  • SystemForm
  • SavedQuery
  • Attribute
  • EntityRelationship
  • OptionSet
  • SdkMessageProcessingStep
  • SdkMessageProcessingStepImage
  • Workflow
  • Report
  • EmailTemplate
  • KbArticleTemplate
  • MailMergeTemplate
  • ContractTemplate
  • ConnectionRole
  • ServiceEndpoint

For any other component type, check the application logs to see the cause of the uninstall failure. The log message for any unsupported component type is:

Component type "<component type>" not supported for deletion. DependentId={<Dependent GUID value>}, RequiredId={<Required GUID value>}, RequiredType={<Required Type>}

You must manually uninstall the component with the given DependentId via the CRM UI, and then run the tool's uninstall feature again.

Install Field Service Solution

Once the Field One Sky solution is completely uninstalled from the target CRM organization, this action installs the Dynamics 365 for Field Service v6.0 solution in the target CRM organization.

Figure 35 Install Field Service Solution

Create Maps

The Custom Metadata Mapping file contains the mappings from Field One Sky to Field Service, along with mappings for any custom and system schema read from the specified Source environment. This mapping file is used for migrating solutions, mobile projects, and data.

Clicking this button connects to the Source and Target organizations and generates the Custom Metadata Mapping file in the Project Directory.

If a mapping file already exists in the Project Directory, the button is displayed as 'Update Maps' and updates the existing file. A couple of scenarios where you need to perform Update Maps:

  • Source/Target CRM Organization is changed
  • Any schema change is performed in Source CRM Organization

Figure 36 Create Maps

Transform Solutions

The following transformations have to be performed:

Solution Transformation

Figure 37 Solution Transformation

The tool assists in reading unmanaged CRM solutions from the Source organization and transforming their content to be compatible with the Dynamics 365 for Field Service schema.

Source solutions can be selected in the tool in two ways:

  • The 'Load' button fetches all the unmanaged solutions from the Source organization.
  • The 'Add File' button lets you browse to an unmanaged solution file that may not be present in the Source CRM organization but must be transformed.

Multiple solutions can be selected via the 'Load' button or the 'Add File' option.

The 'Transform' button migrates the selected solutions, and the progress can be viewed under the 'Solution Transform Status' window. This window preloads the details of the last transformed solution in the project.

After a successful migration, the transformed solutions are located in the Project directory, which can be opened directly via the 'View Working Directory' hyperlink. The transformed solutions should be imported manually into the target organization.

Deprecated References lists all references to the deprecated Workflow, Form, View, and Chart components (System Form, Saved Query, and Saved Query Visualization) of the solution. On import of that solution in the target, the listed references may throw an exception. Although deprecated references are handled by the Migration tool, you may well import transformed solutions without observing any issue. In general, transformed solutions with Workflow and Chart components have higher chances of reporting issues and may require removing the deprecated references in the source solution.

Mobile Solution Transformation

The tool helps to transform the Resco Mobile projects from the Source organization.

The 'Transform' button migrates the selected mobile projects, and the progress can be viewed under the 'Solution Transform Status' window. This window preloads the details of the last transformed project.

After successful migration, the transformed solutions are located at Project directory, which can be navigated directly with the help of 'View Working Directory' hyperlink.

The transformed solutions should be imported manually in the target organization.

Figure 38 Transform Mobile Solutions

Data Migration

The tool performs data migration from the Source org to the Target org in multiple phases. The first phase migrates as much data as possible from Source to Target. On subsequent migration attempts, the following configurations come in handy:

  • Resume: Use this if the previous data migration run was interrupted, either intentionally (by clicking the Cancel button on the Data Migration screen while migration was in progress) or by a fatal error during migration. Selecting Resume starts the migration from where the previous run left off, using the cached progress state.
  • Migrate modified records On/After date: Use this when the first phase of data migration is complete and you want to migrate only records that were created/updated after the previous data migration phase started. Select a date prior to the start of the last data migration phase. This option can be used along with the Resume option to continue data migration from the point where the last run was interrupted.
  • Migrate modified records On/Before date: Use this along with 'Migrate modified records On/After date' to provide a date range. Data migration is attempted only on records meeting the date range criteria.
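The On/After and On/Before options amount to a modifiedon filter applied to each entity's retrieval query. As an illustrative sketch (not the tool's actual implementation), the equivalent FetchXML condition block could be generated like this:

```python
def date_range_filter(on_after=None, on_before=None):
    """Build a FetchXML <filter> restricting records by modifiedon.

    Mirrors the 'Migrate modified records On/After date' and 'On/Before date'
    options: either bound may be omitted. Dates are 'YYYY-MM-DD' strings.
    """
    conditions = []
    if on_after:
        conditions.append(
            '<condition attribute="modifiedon" operator="on-or-after" value="%s" />' % on_after)
    if on_before:
        conditions.append(
            '<condition attribute="modifiedon" operator="on-or-before" value="%s" />' % on_before)
    if not conditions:
        return ""  # no date restriction: all records participate
    return '<filter type="and">%s</filter>' % "".join(conditions)

print(date_range_filter(on_after="2017-01-01", on_before="2017-03-31"))
```

The `on-or-after` and `on-or-before` operators are standard FetchXML condition operators; the filter would be injected into each entity's retrieval query alongside any user-supplied FetchXml.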

Figure 39 Data Migration

Data Migration is divided in two steps:

Load Entities

This step loads all the entity definitions, user mappings, optionset metadata, and other information required to perform data migration. Clicking the Load Entities button displays all the entities and relationships in the source organization with their corresponding record counts. Entities with no data are not displayed in the Entities list. This screen is dockable from the main UI and can be expanded as needed.

Clicking the Load Entities button counts the existing data for all entities and relationships, which can take time to complete. On subsequent runs, if resuming, the counts come from the cached state.
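For reference, a record count like the one Load Entities performs can be expressed as an aggregate FetchXML query. This is an illustrative fragment for the account entity; the tool's exact queries may differ:

```xml
<fetch version="1.0" output-format="xml-platform" mapping="logical" aggregate="true">
  <entity name="account">
    <attribute name="accountid" alias="recordcount" aggregate="count" />
  </entity>
</fetch>
```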

Figure 40 Load Entities Table

In the above table, the following details are captured:

Status

Status of the load entity operation

IsIncluded

The first column's checkbox defines whether the corresponding entity is considered for data migration. If you do not want to migrate a particular entity, uncheck its checkbox. This can also be used to re-process the migration for a single entity at a time.

Entity Name

Name of the Entity

Type

Shows the type, Entity or Relationship

Stage

The Data Migration classifies the entities into stages and processes the Stages in the below order:

  • Security
  • Main
  • Post Entities

In each stage, entities are further classified based on priority.

Priority

Entities are processed in the stages based on the priority, which can be configured for each entity. The default value is calculated during creation of Metadata maps from CRM Organizations. This is an editable field and priority for the entity can be configured based on your understanding of the environment. If you want a particular entity to be processed after another entity, you can change the priority. Once you change the priority, the value would be preserved in Project configuration and the same value would be used for subsequent migrations.
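The processing order described above is a two-level sort: first by stage (Security, Main, Post Entities), then by the configurable priority within each stage. A minimal sketch, with assumed data shapes (the real tool works on entity descriptor objects):

```python
# Stage order as documented: Security, then Main, then Post Entities.
STAGE_ORDER = {"Security": 0, "Main": 1, "Post Entities": 2}

def processing_order(entities):
    """Sort (name, stage, priority) tuples by stage, then by numeric priority.

    Lower priority values are processed first, matching the editable
    Priority column in the Load Entities table.
    """
    return sorted(entities, key=lambda e: (STAGE_ORDER[e[1]], e[2]))

jobs = [("msdyn_workorder", "Main", 20),
        ("systemuser", "Security", 10),
        ("team", "Security", 5),
        ("annotation", "Post Entities", 50),
        ("account", "Main", 10)]

for name, stage, priority in processing_order(jobs):
    print(stage, name)
```

Editing the Priority column simply changes the second sort key, which is why changing a priority lets one entity be processed after another within the same stage.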

Changes Only

When the 'Migrate modified records On/After date' option is checked, this value is 'Yes'.

Iterations

The number of iterations performed to migrate the entity's records.

Records

Total number of records for the entity which are to be migrated.

Requests

The total number of requests made for the entity across all iterations.
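Since records are submitted in ExecuteMultiple batches ('Step Execute Multiple Size', default 10, per the Retrieve Multiple paging parameters later in this document), the request count can be estimated as iterations times the number of batches. This is an illustrative estimate of how the column relates to the other columns, not the tool's exact formula:

```python
import math

def estimated_requests(records, iterations, batch_size=10):
    """Estimate total requests for an entity: one ExecuteMultiple request
    per batch of records, repeated for each migration iteration.
    batch_size defaults to the 'Step Execute Multiple Size' setting."""
    return iterations * math.ceil(records / batch_size)

print(estimated_requests(records=1234, iterations=3))  # 372
```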

FetchXml

FetchXml selects the subset of the entity's records to be migrated in the Data Migration job. This is an optional, editable field. Once the Load Entities results are displayed, you can enter a value in this field and it will be used in subsequent migrations. The value is stored in the project cache.

The following example FetchXml fetches account records whose address1_city contains 'Hyderabad'; only those Account records participate in Data Migration, not all records:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="account">
    <attribute name="name" />
    <attribute name="primarycontactid" />
    <attribute name="telephone1" />
    <attribute name="accountid" />
    <order attribute="name" descending="false" />
    <filter type="and">
      <condition attribute="address1_city" operator="like" value="%Hyderabad%" />
    </filter>
  </entity>
</fetch>

Figure 41 Loaded Entities Properties

After all the entities are loaded, the consolidated result of the Load Entities operation is presented to the right of the Migration Entities tab. Below is an example of the Data Migration table:

Figure 42

Iteration Results

This window preloads the last migration job result for the project. The entities selected on the Migration Entities tab start processing when you click the 'Start Migration Job' button.

Each entity iteration result is displayed in the grid, and the consolidated results are displayed to the right after the migration job.

Figure 43 Iteration Results

Column headers can be dragged to reorder, clicked to sort, and dragged into the header area to apply a grouping; groups can be collapsed/expanded. This helps visualize the progress and results when many entities are being migrated.

The grid can also copy selected rows to the clipboard with Ctrl+C for pasting into Excel. This is helpful if you want to keep the results for later review or for your records.

For any errors/warnings that occurred during the migration job, review the Working Data logs folder under View menu.

In case of a Fatal Error, the entire data migration halts; checking the Data logs will give the possible reason for the failure. A Fatal Error is different from the errors/warnings seen while migrating a single entity record: it halts the complete data migration process. You can always resume the data migration from the point it was interrupted, whether by a Fatal Error or by clicking the Cancel button during the last run, provided caching is enabled in the data migration configuration.

If you enable the 'Ignore Fatal Error and Continue To Next Iteration' option in the data migration configuration, the process continues with the next iteration even if the current iteration hit a Fatal Error for a given entity.

Post Data Migration Tasks

Before data migration, Business Processes, Global Audit, Duplicate Detection rules, and State Transition Rules are analyzed and disabled on the Target org. As a result, some operations that would normally run during the record create/update operations of the Data Migration phase are not performed. These necessary operations are performed in the Post Data Migration Tasks.

Post Data Migration Tasks are enabled only after at least one successful data migration phase. Running them multiple times may result in inconsistent and duplicate data on the Target org, so run them only once, after the data migration is complete and final.

Implementation Details

Connection Management

Connect to the Source and Target CRM organizations. A green light indicates that the connection was established successfully. You must choose Online (Office 365) CRM organizations. The user connecting to an organization must have the Read privilege on solutions (the prvReadSolution privilege). Once connected, you can still switch organizations by clicking the corresponding button.

Connection options

First, ensure that the deployment type of your CRM application is Online. If you are using a CRM Online instance, determine your Online Region: choose it from the dropdown below based on your Discovery Web service URL.

The following table lists the Web service URLs for the worldwide Microsoft Dynamics 365 (online) data centers.

Figure 44 CRM Webservice URLs

After you choose the region for your CRM instance, enter the Office 365 user account and password that you use to sign in to CRM Online.

Figure 45 CRM Region

Select the Organization you would like to connect to from the list of available organizations.

Note: If 'Display list of available organizations' is unchecked, the user is connected to the previously connected organization, if any.

Figure 46 CRM Organizations

Troubleshooting

If you are unable to connect to Dynamics CRM with a new connection or if an existing connection is now failing, there are some troubleshooting steps that you can take:

Plug-in Registration Tool – You can download the Dynamics CRM SDK and use the Plug-in Registration Tool to test connectivity through the CRM API to your Online instance.

  • The user connecting to the organization must have the Read privilege on solutions (the prvReadSolution privilege). Check whether the user has this privilege.
  • Verify the username and password; the password is case sensitive.

Uninstall Dependencies

Solution Dependency

A dependency record contains data about how a solution component depends on another solution component. A solution component cannot be deleted while another solution component depends on it.

Solutions are made of solution components. You'll use the Solutions area in Microsoft Dynamics 365 to create or add solution components. You can perform these actions programmatically by using messages that create or update solution components that include a SolutionUniqueName parameter.

Solution components often depend on other solution components. You can't delete any solution component that has dependencies on another solution component. For example, a customized ribbon typically requires image or script web resources to display icons and perform actions using scripts. If the customized ribbon is in the solution, the specific web resources it uses are required. Before you can delete the web resources you must remove references to them in the customized ribbon.

Calculating Solution Dependency

The tool fetches the component dependencies that prevent the Field One Sky solution from being uninstalled; it deletes/modifies unmanaged components and optionally uninstalls managed solutions that contain dependent components.

If the dependent component is a Form or a View, the tool removes the field through which the component is listed as dependent.

Uninstalling dependency

  • First, it tracks all the managed dependencies.
  • Then, deletes the Workflow components by first deleting the dependencies of it (if any) and then the component itself.
  • Next, deletes all other unmanaged component types by first deleting the dependencies of it (if any) and then the component itself.
  • After unmanaged dependencies are removed, the tool uninstalls any related managed solutions, including the unmanaged/managed dependencies of those solutions. A few managed components, such as System Forms and System Views, are not deleted; only the dependent references are removed from them.
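The deletion order above, dependencies first and the component itself last, is a depth-first traversal of the dependency graph. A simplified sketch (function and data names are hypothetical, not the tool's API):

```python
def delete_with_dependencies(component, get_dependents, delete, deleted=None):
    """Recursively delete everything that depends on `component`, then the
    component itself, so no delete ever violates a dependency.

    get_dependents(c) -> list of components that depend on c
    delete(c)         -> performs the actual delete call
    """
    if deleted is None:
        deleted = set()
    if component in deleted:
        return deleted
    for dependent in get_dependents(component):
        delete_with_dependencies(dependent, get_dependents, delete, deleted)
    delete(component)
    deleted.add(component)
    return deleted

# Toy graph: a form depends on an attribute, a ribbon depends on a web resource.
graph = {"f1_attribute": ["main_form"], "main_form": [], "webres": ["ribbon"], "ribbon": []}
order = []
delete_with_dependencies("f1_attribute", lambda c: graph[c], order.append)
print(order)  # prints ['main_form', 'f1_attribute']
```

The `deleted` set guards against re-deleting shared dependencies, which matters when several components depend on the same web resource or attribute.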

Known issues

Missing Dependency:

Missing dependencies in solution.xml arise because many components (forms, saved queries, and charts) that existed in Field One Sky are deprecated in Field Service. When preparing the solution to be transformed, if these components are unchanged at solution-creation time, CRM assumes they will be available in the Target organization; however, these components are deprecated there and are not available. The deprecated components/references are listed at the end of solution transformation. It is suggested that you update the solution by removing the deprecated forms, saved queries, and charts to avoid the MissingDependency issue.

Below is a snapshot of one such transformation.

Figure 47

InArgument tag:

Multiple formats under the 'InArgument' tag may contain an occurrence of 'f1_' in a solution. Only the .EntityReference and .OptionSetValue formats are handled by the tool; anything of String type under WorkflowPropertyType has to be fixed manually in the transformed solution.

Mapping

Mapping associates the source component to its corresponding target component.

F1 to FS mapping contains the entity mappings of the Field One Sky and Field Service solutions. The Custom Metadata Mapping file contains the mappings from Field One Sky to Field Service, along with mappings for any custom and system schema read from the specified Source environment. Each entity in the Custom Metadata Mapping file has mapping information for entity properties, attributes, OneToMany relationships, and ManyToMany relationships. This mapping file is used for migrating solutions, mobile projects, and data.

To generate maps, the tool combines the paths to the core entity metadata mapping, OptionSet metadata mapping, and web resource metadata mapping files in the current temporary directory.

All metadata maps repositories implement IFileRepository and derive from FileRepository.

Mapping Computation

It computes the mappings from Field One Sky to Field Service, plus mappings of any custom and system schema read from the specified Source environment, and saves them into the Custom Metadata Mapping file. Custom entity metadata mappings are generated based on CRM organization metadata and the provided core mappings.

For example, to create attribute mappings:

  • A list of attributes that should never be used during metadata mapping is prepared. These are the excluded attributes for the entity.
  • Source organization metadata is merged with the core entity metadata mappings to produce a union set of mappings.
  • A composite set of attribute mappings is collected for the entity (core, existing in source, non-excluded, and custom).
  • Attribute metadata mappings for core attributes not available in the Source are removed from the Custom Metadata mappings.
  • The list of deprecated attributes is found and overridden using the old logical name.
  • The custom map is added to the composite collection of maps.
  • Any remaining core attribute maps not found in the source metadata are filtered out.
  • The attribute metadata mappings for the entity are checked for excluded attributes.
  • The mappings are reset to the list enriched with source/target metadata, without the excluded attributes.
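The composition steps above can be pictured as set operations over attribute maps. An illustrative sketch only; the data shapes are assumptions, and the real tool operates on CRM metadata objects rather than plain dictionaries:

```python
def compose_attribute_maps(core_maps, source_attributes, excluded, custom_maps):
    """Build the composite attribute map for one entity:
    - start from core (F1 -> FS) maps that actually exist in source,
    - drop excluded attributes,
    - add any explicit custom maps,
    - map remaining source attributes to themselves (custom/system schema).
    core_maps / custom_maps: {old_name: new_name}; source_attributes: set of names.
    """
    composite = {old: new for old, new in core_maps.items()
                 if old in source_attributes and old not in excluded}
    composite.update(custom_maps)
    for attr in source_attributes:
        if attr not in composite and attr not in excluded:
            composite[attr] = attr  # custom/system schema keeps its name
    return composite

maps = compose_attribute_maps(
    core_maps={"f1_status": "msdyn_status", "f1_gone": "msdyn_gone"},
    source_attributes={"f1_status", "name", "secret"},
    excluded={"secret"},
    custom_maps={})
print(maps)
```

Note how `f1_gone` is dropped (a core map with no matching source attribute) and `secret` is filtered out (an excluded attribute), mirroring the removal and exclusion steps above.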

Update Maps

If mapping file already exists in the Project Directory, it is displayed as 'Update Maps', which updates the existing one.

The existing Custom Metadata Mapping file is updated to keep the environment's data in sync. Also, the mapping file should be updated when the Deprecated entities are added/updated in DeprecatedExceptionMetadataMappings.json.

Difference between F1 to FS mapping and Custom Metadata Mapping

F1 to FS mapping contains the entity mappings of Field One Sky and Field Service solutions whereas Custom Metadata Mapping holds the mappings of any custom schema and system schema that is read from the specified Source environment, in addition to F1 to FS mappings.

Metadata Maps Helper Tools

The target environments will be upgraded to CRM 8.2 and Field Service 6.2. During the migration process there will be a minimum of two target CRM environments created. To complete the Field Service migration two available Sandbox instances will be required. Please contact support or your Microsoft account team if you require additional Sandbox instances for the tenant.

  • You can write LINQ queries in LINQpad to check or modify the mappings.

To download LINQPad, go to https://www.linqpad.net/Download.aspx.

Sample Query: UpdateOptionSetMapping.linq, shown below, retrieves and updates optionset mappings.


<Query Kind="Program">
  <Reference>C:\Users\v-saimta\Source\Repos\Rangers-F1Migration\Src\bin\UnitTest\Microsoft.Dynamics365.Migration.Solution.dll</Reference>
  <Reference>C:\Users\v-saimta\Source\Repos\Rangers-F1Migration\Src\bin\UnitTest\Microsoft.Pfe.Xrm.Core.dll</Reference>
  <Reference>C:\Users\v-saimta\Source\Repos\Rangers-F1Migration\Src\bin\UnitTest\Microsoft.Xrm.Sdk.dll</Reference>
  <Reference>C:\Users\v-saimta\Source\Repos\Rangers-F1Migration\Src\bin\UnitTest\Microsoft.Xrm.Tooling.Connector.dll</Reference>
  <Reference>C:\Users\v-saimta\Source\Repos\Rangers-F1Migration\Src\bin\UnitTest\Newtonsoft.Json.dll</Reference>
  <Namespace>Microsoft.Dynamics365.Migration</Namespace>
  <Namespace>Microsoft.Dynamics365.Migration.Solution</Namespace>
  <Namespace>Microsoft.Dynamics365.Migration.Solution.MetadataMapping</Namespace>
  <Namespace>Microsoft.Dynamics365.Migration.Solution.XmlParser.CRM</Namespace>
  <Namespace>Microsoft.Xrm.Sdk.Messages</Namespace>
  <Namespace>Microsoft.Xrm.Sdk.Metadata</Namespace>
  <Namespace>Microsoft.Xrm.Sdk.Query</Namespace>
  <Namespace>Microsoft.Xrm.Tooling.Connector</Namespace>
  <Namespace>Newtonsoft.Json</Namespace>
</Query>


public List<Entity> entities; // generic parameter restored; not used in Main

void Main()
{
    // Connect to the source organization and retrieve the f1_resourcetype global optionset.
    var svc = new CrmServiceClient("william@psa365.onmicrosoft.com", CrmServiceClient.MakeSecureString(Util.GetPassword("josh")), "NorthAmerica", "josh1", true, isOffice365: true);
    var sourceRequest = new RetrieveOptionSetRequest()
    {
        Name = "f1_resourcetype",
        RequestId = Guid.NewGuid()
    };
    var sourceResponse = (RetrieveOptionSetResponse)svc.Execute(sourceRequest);

    // Connect to the target organization and retrieve the bookableresource.resourcetype attribute.
    var target = new CrmServiceClient("william@psa365.onmicrosoft.com", CrmServiceClient.MakeSecureString(Util.GetPassword("josh")), "NorthAmerica", "joshdemo", true, isOffice365: true);
    var targetRequest = new RetrieveAttributeRequest()
    {
        EntityLogicalName = "bookableresource",
        LogicalName = "resourcetype",
        RequestId = Guid.NewGuid()
    };
    var targetResponse = (RetrieveAttributeResponse)target.Execute(targetRequest);
    // targetResponse.Dump();

    // Source option values (Field One Sky).
    OptionSetMetadata optionMetadata = (OptionSetMetadata)sourceResponse.OptionSetMetadata;
    var oldOption1 = optionMetadata.Options.FirstOrDefault(x => x.Value == 690970000);
    var oldOption2 = optionMetadata.Options.FirstOrDefault(x => x.Value == 690970001);
    var oldOption3 = optionMetadata.Options.FirstOrDefault(x => x.Value == 690970002);
    var oldOption4 = optionMetadata.Options.FirstOrDefault(x => x.Value == 690970003);
    var oldOption5 = optionMetadata.Options.FirstOrDefault(x => x.Value == 690970004);
    var oldOption6 = optionMetadata.Options.FirstOrDefault(x => x.Value == 690970005);

    // Target option values (Field Service).
    PicklistAttributeMetadata targetAttribute = (PicklistAttributeMetadata)targetResponse.AttributeMetadata;
    var newOption1 = targetAttribute.OptionSet.Options.FirstOrDefault(x => x.Value == 3);
    var newOption2 = targetAttribute.OptionSet.Options.FirstOrDefault(x => x.Value == 5);
    var newOption3 = targetAttribute.OptionSet.Options.FirstOrDefault(x => x.Value == 2);
    var newOption4 = targetAttribute.OptionSet.Options.FirstOrDefault(x => x.Value == 4);
    var newOption5 = targetAttribute.OptionSet.Options.FirstOrDefault(x => x.Value == 1);
    var newOption6 = targetAttribute.OptionSet.Options.FirstOrDefault(x => x.Value == 6);
    // targetAttribute.Dump();

    // Load the existing optionset mappings (generic parameter restored from the garbled original).
    var settings = new SolutionMigrationSettings();
    var optionsets = Microsoft.Dynamics365.Migration.Solution.Helper.GetObjectFromJsonFile<List<OptionSetMetadataMapping>>(@"C:\Users\v-saimta\Source\Repos\Rangers-F1Migration\Src\FieldService\Data\Metadata Mappings\OptionSetMetadataMappings.json");

    var optionSet = optionsets.FirstOrDefault(e => e.Old.Name == "f1_resourcetype");
    // optionSet.Dump();
    // return; // early return left in the original; it must be removed for the updates below to run
    optionSet.IsDeprecated = false;
    optionSet.New = targetAttribute.OptionSet;

    // Replace the option-value mappings with the new source/target pairs.
    var f1resourcemappings = optionSet.OptionMetadataMappings;
    // Array.Resize(ref f1resourcemappings, f1resourcemappings.Length + 1);
    f1resourcemappings.SetValue(new OptionMetadataMapping(oldOption1, newOption1, false), 0);
    f1resourcemappings.SetValue(new OptionMetadataMapping(oldOption2, newOption2, false), 1);
    f1resourcemappings.SetValue(new OptionMetadataMapping(oldOption3, newOption3, false), 2);
    f1resourcemappings.SetValue(new OptionMetadataMapping(oldOption4, newOption4, false), 3);
    f1resourcemappings.SetValue(new OptionMetadataMapping(oldOption5, newOption5, false), 4);
    f1resourcemappings.SetValue(new OptionMetadataMapping(oldOption6, newOption6, false), 5);

    optionsets.FirstOrDefault(e => e.Old.Name == "f1_resourcetype").OptionMetadataMappings = f1resourcemappings;

    // Serialize the updated mappings to disk.
    var filePath = @"C:\Users\v-saimta\Desktop\updatedoptions.json";
    var jsonSettings = new JsonSerializerSettings
    {
        TypeNameHandling = TypeNameHandling.Auto,
        ReferenceLoopHandling = ReferenceLoopHandling.Ignore
    };
    File.WriteAllText(filePath, JsonConvert.SerializeObject(optionsets, jsonSettings));
}
  • You can view, edit, and format JSON data in JSON Buddy, but given the size of the Custom Metadata mappings it may be very difficult to tweak values using a JSON editor. To download, go to http://www.json-buddy.com/

Transform Solution

The tool assists in reading unmanaged CRM solutions and Resco Mobile projects from the Source organization and transforming their content to be compatible with the Dynamics 365 for Field Service schema.

Transform CRM Solutions

  • The tool transforms a CRM solution and saves the zip file in the working directory.
  • Optionally, it exports the solution.
  • Extract/unpack: the solution zip file is extracted to the working folder.
  • Parse/Transform:
    • Checks the solution version from the Solution.xml file and reads the entity names from the <RootComponent> tags.
    • Checks the solution for a SiteMap.xml file.
    • Parses and transforms all the files into the target working folder, using the same folder structure as the source.
  • Pack: the transformed solution is packed; a zip file is created in the target working folder.
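The extract → transform → pack pipeline can be sketched with standard zip handling. The schema-renaming step below is a trivial text-substitution placeholder for the tool's real transformation logic:

```python
import zipfile

def transform_solution(src_zip, dst_zip, rename):
    """Unpack a solution zip, apply a text-level transform to every file,
    and repack it with the same folder structure.

    `rename` maps old schema names to new ones (a stand-in for the real
    parse/transform step, which works on the XML structure).
    """
    with zipfile.ZipFile(src_zip) as zin, \
         zipfile.ZipFile(dst_zip, "w", zipfile.ZIP_DEFLATED) as zout:
        for info in zin.infolist():
            data = zin.read(info.filename)
            try:
                text = data.decode("utf-8")
                for old, new in rename.items():
                    text = text.replace(old, new)
                data = text.encode("utf-8")
            except UnicodeDecodeError:
                pass  # leave binary entries (icons, assemblies) untouched
            zout.writestr(info.filename, data)
```

Writing every entry back with its original name preserves the source folder structure, matching the Pack step above.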

Transform Mobile Solutions:

To Transform entity mobile solutions from the Source environment:

  • Resets loaded solutions and any command executions
  • Handles changes to the project's source organization which should invalidate the last mobile transform result and reset all related collections.
  • Updates an existing file transform result progress or replaces previous progress.
  • Same steps of transform solution are followed.

Note: Browse option to add the solution file in tool is not available for Mobile Solutions.

Solution and Mobile migrations migrate all the schema customizations, which keeps the target in sync with the customer's environment. All schema must be migrated to the target environment before Data Migration can be performed.

Data Migration

The Data Migration classifies the entities into stages and processes the Stages in the below order. In each stage, entities are further classified based on priority,

  • Security
  • Main
  • Post Entities

Migration is performed in three iterations. The first iteration includes the required fields and the Owner. For each entity, attribute data is migrated according to the attribute type.

The tool then converts the collection of entity records into a collection of requests (Create, Update, Upsert, or Associate) based on the entity definition type. Next, it processes the requests as batches and submits the data. Finally, it saves the state of the migration.

During the data migration process, a cache is maintained that tracks the data migration process. If the process is interrupted, the data migration can resume from the cache.
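The batch-and-checkpoint flow described above can be sketched as follows. This is illustrative only; the request and cache shapes are assumptions, not the tool's actual format:

```python
import json

def migrate_entity(records, submit_batch, cache_path, batch_size=10):
    """Submit records in ExecuteMultiple-style batches, checkpointing the
    count of migrated records after each batch so an interrupted run can
    resume from the cache instead of starting over."""
    try:
        with open(cache_path) as f:
            done = json.load(f)["done"]  # resume point from a previous run
    except FileNotFoundError:
        done = 0  # no cache: start from the beginning
    for start in range(done, len(records), batch_size):
        submit_batch(records[start:start + batch_size])
        done = min(start + batch_size, len(records))
        with open(cache_path, "w") as f:
            json.dump({"done": done}, f)  # checkpoint after every batch
    return done
```

Because the checkpoint is written after each batch, a cancel or fatal error loses at most one batch of progress, which is what makes the Resume option cheap.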

The following paging parameter values give the best performance:

  Setting                                   Default Value
  Job Max Threads                           4
  Service Connection Limit                  1000
  Service Connection Timeout                1200000
  Service Expect100Continue                 False
  Service UseNagleAlgorithm                 False
  Step Execute Multiple Size                10
  Step Max Records to Retrieve              15000
  Step Records Per Page                     5000
  Step Max Threads                          8
  Step Throttled Max Records to Retrieve    500
  Step Max Records Per Page                 10

Figure 48 Retrieve Multiple Paging Parameters

Post Data Migration Tasks

Following are the necessary tasks which are carried out during this post data migration phase:

  1. For each active msdyn_agreement record, instantiate the Field Service - Mark Agreement as Expired workflow.
  2. For each msdyn_agreementinvoicesetup record, instantiate the Field Service - Generate Agreement Invoice Dates workflow by updating the msdyn_PostponeGenerationUntil field to UtcNow.

Known Issues

Specific conditional checks the tool does

Solution.xml

If any of the entities below is present in the source solution, for any CRM version, the migration is stopped. Do not include these entities in the solution:

  • bookableresource
  • bookableresourcebooking
  • bookableresourcebookingheader
  • bookableresourcecategory
  • bookableresourcecategoryassn
  • bookableresourcegroup
  • bookingstatus
  • characteristic
  • ratingmodel
  • ratingvalue
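A pre-flight check equivalent to this rule can be sketched by scanning the solution's Solution.xml root components for the blocked entity names. The regex parsing here is a simplification; the tool reads the <RootComponent> tags properly:

```python
import re

BLOCKED_ENTITIES = {
    "bookableresource", "bookableresourcebooking", "bookableresourcebookingheader",
    "bookableresourcecategory", "bookableresourcecategoryassn", "bookableresourcegroup",
    "bookingstatus", "characteristic", "ratingmodel", "ratingvalue",
}

def blocked_root_components(solution_xml):
    """Return blocked entity names referenced by <RootComponent> tags.
    Migration should be stopped if this returns anything."""
    names = re.findall(r'<RootComponent[^>]*schemaName="([^"]+)"', solution_xml)
    return sorted({n.lower() for n in names} & BLOCKED_ENTITIES)

xml = '<RootComponents><RootComponent type="1" schemaName="BookingStatus" />' \
      '<RootComponent type="1" schemaName="account" /></RootComponents>'
print(blocked_root_components(xml))  # prints ['bookingstatus']
```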

SiteMap.xml

If the solution contains SiteMap.xml, the migration is stopped. Remove the file from the solution and retry; transformation of this component is not supported by the Migration tool.

Special treatment for particular entities/attributes

  • The Work Order Schedule entity has a System Status attribute, which is deprecated in the target. It also has a Schedule Sub-status attribute, a recommended lookup field in the source; in the target this is mapped to the Booking Status attribute, a required field on the Bookable Resource Booking entity. Where Schedule Sub-status is null, data migration uses the System Status field value instead.
  • For the Resource Type optionset, any custom value (outside the 690970000-690970005 range) is mapped during the data migration phase to the Generic option value, i.e., 1.
  • The booking status records (which are migrated from work order sub-status) have the Status field of all records set to Proposed.

    In the Field One Sky solution, for the Work Order Schedule Sub-status entity, the System Status attribute is used on the System Form; hence the value is one of the seven values in its Option Set.

    In the Field Service solution, for the Booking Status entity, the Status attribute is used on the System Form instead of Field Service Status; hence the value is one of the three values in its Option Set. The default value is Proposed.

    Therefore, the value of 'Field Service Status' ('System Status' in the source is mapped to 'Field Service Status' in the target) is used to populate the 'Status' field in the target. The 'Status' field for all migrated records is set to 'Committed' unless the 'Field Service Status' field is 'Cancelled', in which case the Status field is also set to 'Cancelled'.

  • For the Opportunity, Case, Quote, and Order entities, one more iteration is performed to update the closed records. The first three iterations migrate the record in the 'Active/Draft' state; the final iteration closes it through an SDK call.

    An extra iteration is performed to migrate the entities below with the corresponding states:

    Entity         State
    Opportunity    Won, Lost
    Case           Resolved
    Quote          Won, Closed
    Order          Fulfilled, Canceled

Exclusions

While preparing solutions for transformation, be aware that the following components are not transformed by the tool; any such customizations must be performed manually in the target org:

  • Sitemap
  • Reports
  • Custom Code such as plugins, workflow assemblies, JavaScript, integrations, and portals
  • Personal Saved Queries

Appendix

Field Service Solution Install

Issue: While installing Dynamics 365 for Field Service in the target organization, installation failed with a log message that the Opportunity entity is not enabled for mobile.

Fix: Update the Opportunity entity to "Enable for mobile" by selecting this property under the Opportunity entity, and try to install the solution again:

Figure 49

Solution import failure for Assembly content size

Issue: Transformed solution import failed with the CRM log message "The assembly content size 'x bytes' has exceeded the maximum value allowed for isolated plug-ins 'y bytes'."

Fix: Open an ICM ticket to increase the import size limit for isolated plug-ins to more than 'x' bytes, and then attempt to import the transformed solution again.

Solution import failure due to Field One Sky remnants left in the Target Org

Issue: While trying to import one of the transformed solutions, the import failed with the log message "The ribbon item 'ifms.incident.Button1.Button' is dependent on Web resource id='f1_icons/32x32/iconWorkOrder32'" for the incident entity.

Fix: Although Field One Sky was uninstalled successfully from the Target CRM org, some remnants were not removed during the solution uninstall process. To remove these remnants:

  • Create a solution with the Incident entity in the Target CRM org
  • Update the created solution to add the 'f1_' publisher and create a dummy 'f1_icons/32x32/iconWorkOrder32' web resource
  • Remove all Field One Sky references from all forms on the Incident entity
  • Use the Ribbon Editor to delete 'ifms.incident.Button1.Button' and publish the solution. Once the button is deleted successfully, try to import your transformed solution; you should no longer see this issue