
Migratortron-1000

Clone data or complete sites from one Contensis project to any other!

Contensis CMS Migration Tool

This tool is available as a REST API, a CLI, or as a library for use in your own projects.

Things we can migrate

  • Content types
  • Components
  • Content models
  • Content assets
  • Content entries
  • Site view nodes
  • A field in an entry

Entries

How it works:

Create a ContensisRepository connection to the source project and to each target project from the data supplied in the payload. Build a transformGuid function with the alias and project 'baked in' - this is required to reliably seed our deterministic guids throughout the process.
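For illustration only (the package's actual implementation may differ), a deterministic transform can be achieved by hashing each source guid into a namespace derived from the target alias and project, for example with UUID v5 (the alias and project values here are placeholders):

import { v5 as uuidv5 } from 'uuid';

// Illustrative sketch only: the same inputs always produce the same output guid,
// so re-running a migration can find entries it created previously.
const NAMESPACE = '6ba7b810-9dad-11d1-80b4-00c04fd430c8'; // any fixed uuid

const makeTransformGuid = (alias: string, projectId: string) => {
  // derive a namespace unique to the target alias/project
  const targetNamespace = uuidv5(`${alias}:${projectId}`, NAMESPACE);
  // hash each source guid into that namespace
  return (sourceGuid: string) => uuidv5(sourceGuid, targetNamespace);
};

const transformGuid = makeTransformGuid('example-cms', 'website');
transformGuid('e798df96-1de3-4b08-a270-3787b902a580'); // deterministic result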

HydrateContensisRepositories will hydrate each of these repositories with Projects, Content Types and Components from each Contensis instance, finding dependencies and examining relationships to build complete content Models.

GetEntries will search for entries and load them into the source repository, while searching for the same entries in each target repository (transforming any guids supplied in the query so we can match entries previously created by the tool). Each found entry will be examined for any dependent entries or asset entries and their guids returned. These dependent entries will also be searched for in each repository by their guid to ensure we have all migration dependencies loaded into each repository.
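A rough sketch of the dependency discovery idea (not the library's actual code): walk every field value in a found entry and collect any sys.id links, then search for those guids in each repository too.

// Illustrative only: recursively collect linked entry/asset guids from an entry's field values
const collectDependencyGuids = (value: unknown, found = new Set<string>()): Set<string> => {
  if (Array.isArray(value)) {
    value.forEach(v => collectDependencyGuids(v, found));
  } else if (value && typeof value === 'object') {
    const sysId = (value as { sys?: { id?: unknown } }).sys?.id;
    if (typeof sysId === 'string') found.add(sysId);
    Object.values(value).forEach(v => collectDependencyGuids(v, found));
  }
  return found;
};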

BuildEntries will create a MigrateEntry in each target repository for each entry in the source repository. This MigrateEntry contains two entries, a firstPassEntry and a finalEntry.

Each entry is built by looping through all fields in a matched content type and transforming or censoring the field value, depending on the field type and whether we are building an entry ready for a first pass. The same is true for any component fields found, including nested components.

A firstPassEntry has any found dependencies stripped out, while a finalEntry has only a few types of field tweaked to prevent errors when loading into Contensis. Each field of the built entry is examined for asset or entry links, and any guid found is transformed using the prebuilt transformGuid function attached to each target repository. The built entry is then compared to the source entry and a status is set: create, update or two-pass. We can keep track of the entry by its originalId or its transformed target id. BuildAssetEntries does a similar job for asset entries.
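The shape of what each target repository holds for one source entry can be pictured roughly like this (illustrative types, not the package's exact declarations):

// Illustrative types only
type Entry = Record<string, unknown>;
type MigrateStatus = 'create' | 'update' | 'two-pass';

interface MigrateEntry {
  originalId: string;     // guid of the entry in the source project
  targetId: string;       // guid produced by the target repository's transformGuid
  firstPassEntry: Entry;  // copy of the entry with dependency fields stripped out
  finalEntry: Entry;      // complete entry with all links transformed
  status: MigrateStatus;  // set after comparing the built entry with the source
}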

Language differences between source and target are handled by replacing the language keys inside each built content type or entry with the default language from the target project. So you can load content types and entries from a source project with en-GB language into a target project with es-ES language code, and expect everything to be created with es-ES language code set. The same language replacements are made when comparing source and target for differences to determine if we need to make any updates to existing content types or entries.
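A minimal sketch of that language substitution, assuming localised entry field values are keyed by language code:

// Illustrative only: re-key localised field values from the source language
// (e.g. en-GB) to the target project's default language (e.g. es-ES)
const replaceLanguageKeys = (
  fields: Record<string, Record<string, unknown>>,
  sourceLanguage: string,
  targetLanguage: string
) =>
  Object.fromEntries(
    Object.entries(fields).map(([fieldId, localised]) => {
      if (!(sourceLanguage in localised)) return [fieldId, localised];
      const { [sourceLanguage]: value, ...rest } = localised;
      return [fieldId, { ...rest, [targetLanguage]: value }];
    })
  );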

Two-pass migration status

Each field of every entry will be checked for dependencies and entries that do not already exist will be marked for a "two-pass" migration.

We will create the entry in an unpublished state with all dependency fields removed; this allows any potential dependencies of the entry to be created before we attempt to create the complete entry with all of its links in place.

After all entries marked for create or update have completed, we will make a second-pass over the entries marked for a "two-pass" migration, this time creating the final entry with all dependencies present. The entry will then be published.
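Put together, the sequence looks roughly like this (a sketch of the flow described above, reusing the illustrative MigrateEntry shape from earlier; createUnpublished, createAndPublish and updateAndPublish are hypothetical stand-ins for the relevant Management API calls):

// Illustrative sequence only
type MigrateApi = {
  createUnpublished(entry: Entry): Promise<void>;
  createAndPublish(entry: Entry): Promise<void>;
  updateAndPublish(entry: Entry): Promise<void>;
};

async function commitInTwoPasses(api: MigrateApi, entries: MigrateEntry[]) {
  // Pass 1: create stubs (unpublished, dependency fields removed) and
  // create/update everything that already has all of its dependencies
  for (const e of entries) {
    if (e.status === 'two-pass') await api.createUnpublished(e.firstPassEntry);
    else if (e.status === 'create') await api.createAndPublish(e.finalEntry);
    else if (e.status === 'update') await api.updateAndPublish(e.finalEntry);
  }
  // Pass 2: every dependency now exists, so complete and publish the stubs
  for (const e of entries) {
    if (e.status === 'two-pass') await api.updateAndPublish(e.finalEntry);
  }
}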

Entry Migration

If a download or commit flag has been provided the process will continue, otherwise the fetched repository data is mapped into a condensed response format and returned to the caller.

DownloadAssetContent will examine any asset entries found in the source repository and - by concatenating assetHostname from the payload with the sys.uri field - download any files to the local file system that have been assigned a create or update status in any of the target repositories.
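As a simple illustration (a hypothetical helper, not the package's code), each qualifying file is fetched from assetHostname plus the entry's sys.uri and written to disk:

import { writeFile } from 'fs/promises';

// Illustrative only: download one asset's file content to the local file system
async function downloadAsset(assetHostname: string, sysUri: string, localPath: string) {
  const res = await fetch(`${assetHostname}${sysUri}`);
  if (!res.ok) throw new Error(`Download failed (${res.status}) for ${sysUri}`);
  await writeFile(localPath, Buffer.from(await res.arrayBuffer()));
}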

If a commit flag has been provided the process will continue, otherwise the fetched repository data is mapped into a condensed response format and returned to the caller.

UploadAssetContent will examine any asset entries found in the source repository that have been assigned a create or update status in any of the target repositories and upload the asset, creating or updating the asset entry at the same time.

CommitEntries loads each Migrate entry in each target repository into a promise queue. The queue is then executed in parallel with two execution threads by default; this can be overridden by setting the concurrency option in the payload. Entries are loaded in order of their migrate status: new entries are created first, existing entries are updated second, and finally any stub entries created as part of a two-pass migration are updated with all their dependencies. All entries are published except for entry stubs.
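The queue behaviour can be pictured with a small concurrency-limited runner like this sketch (the package may use a queue library or its own implementation; the default of two parallel workers matches the description above):

// Illustrative only: run async tasks with a fixed number of parallel workers
async function runQueue<T>(tasks: (() => Promise<T>)[], concurrency = 2): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;
  const worker = async () => {
    while (next < tasks.length) {
      const index = next++; // each worker pulls the next unclaimed task
      results[index] = await tasks[index]();
    }
  };
  await Promise.all(Array.from({ length: Math.min(concurrency, tasks.length) }, worker));
  return results;
}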

The process currently continues when errors are encountered; it is often the case that an error loading an asset or entry early on will cause further errors later, as we then attempt to create entries containing missing dependencies.

Calls to the Management API go through the fetch wrapper enterprise-fetch, which employs its own timeout and retry mechanism. The retry policy in place will time out any call after 60s and retry any failed call 3 times; 404, 409 and 422 errors are not retried. We can also divert requests to a proxy such as Fiddler to debug the API requests made by the service.
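Expressed as a rough sketch (this illustrates the policy described above, not enterprise-fetch's actual configuration API):

// Illustrative only: 60s timeout, up to 3 retries, but never retry 404/409/422
const NO_RETRY = new Set([404, 409, 422]);

async function fetchWithRetry(url: string, init: RequestInit = {}, retries = 3): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    try {
      const res = await fetch(url, { ...init, signal: AbortSignal.timeout(60_000) });
      if (res.ok || NO_RETRY.has(res.status) || attempt >= retries) return res;
    } catch (err) {
      if (attempt >= retries) throw err; // timeouts and network errors are also retried
    }
  }
}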

Entry Dependencies

Note: any dependency below may also appear as an array in a repeatable form of the field.

  • Linked entry
  • Asset
  • Image
  • Component
    • Linked entry
    • Asset
    • Image
    • Nested Component
      • Linked entry
      • Asset
      • Image
  • Composer
    • Linked entry
    • Asset
    • Image
    • Component
      • Linked entry
      • Asset
      • Image
      • Nested Component
        • Linked entry
        • Asset
        • Image

Common Errors

500 Failed to create the entry

"Failed to create the entry" is often caused by the guid already existing somewhere in Contensis, usually a remnant of an old entry in one of the SQL tables. This will happen more often if you are deleting and re-loading the same entries again. If you are using transformGuids: false it is possible that entry already exists in another project.

The entry slug already exists

{
  "message": "There are validation errors creating the entry",
  "data": [
    {
      "field": "slug",
      "message": "The entry slug 'simple-entry' already exists for the language 'en-GB'"
    }
  ]
}

To fix this, you normally need to expose the entry title slug in the source content type and update it in the source entry to be unique.

Copying an entry field

This feature allows us to copy the contents of one entry field to another. This is useful, for example, when a field is named incorrectly, or was originally specified as one field type but we would like to curate and present the content differently in future.

Field copy transformation matrix

Copying field data directly from one field to another can only be done with the source and destination field types mentioned in the table below.

When we copy certain field types, a transformation is made to the data to make it compatible with the destination field type.

Copying a field will overwrite any data in the destination field; it will not preserve or respect any data that currently exists or has been manually entered.

Finer-grained control of the field data transformation (including field types not supported directly) is possible using a template.

source | destination | notes
------ | ----------- | -----
string | string |
string | stringArray |
string | canvas | Content is surrounded within a paragraph block (template can alter the source value)
string | richText |
string | richTextArray |
string | boolean | True if evaluates "truthy" (0, false or null would be false)
stringArray | stringArray |
stringArray | string | Multiples separated with newline
stringArray | canvas | ^
stringArray | richText | ^
stringArray | richTextArray |
richText | canvas |
richText | richText |
richText | richTextArray |
richText | string |
richText | stringArray |
richTextArray | richTextArray |
richTextArray | richText | Multiples separated with newline
richTextArray | canvas |
boolean | boolean |
boolean | string | "Yes" or "No"
boolean | stringArray | ^
boolean | integer | True = 1, false = 0
boolean | integerArray | ^
boolean | decimal | True = 1, false = 0
boolean | decimalArray | ^
integer | integer |
integer | integerArray |
integer | decimal |
integer | decimalArray |
integer | boolean | True if evaluates "truthy" (0, false or null would be false)
decimal | decimal |
decimal | decimalArray |
decimal | integer | Truncate any decimal precision (e.g. 44.9 = 44)
decimal | integerArray | ^
decimal | boolean | True if evaluates "truthy" (0, false or null would be false)
dateTime | dateTime |
dateTime | dateTimeArray |
image | image |
image | imageArray |
imageArray | imageArray |
imageArray | image |
component | component | Source and destination component must contain the same fields
component | componentArray | ^
component.<field type> | <field type> | Supports the field types mentioned above
componentArray.<field type> | <field type> | ^ at the first position in the array
<field type> | component.<field type> | Adds the field to existing component object or add new component with just this field
<field type> | componentArray.<field type> | ^ at the first position in the array
composer | <field type> | Not supported
<field type> | composer | Not supported
canvas | <field type> | Not supported

Key: ^ = as above

Apply a template

If your field type is not supported above, or you wish to modify the output value for the field, you can supply a LiquidJS template and make use of the "tags" and "filters" available in LiquidJS to perform custom transformations on your entry field data.

The result of parsing this template becomes the new value for the destination field in every entry.

Templates allow us to make some very precise adjustments to the field value we will update.

A number of variables are available to use in the Liquid template:

  • value - the value of the source field in the entry
  • existing_value - any existing value of the target field in the entry
  • target_value - the value that has been prepared to go into the destination field
  • entry - the entire entry object (if we need to reference another field in the entry)

Examples

These are simple examples of using and chaining LiquidJS filters

  • "{{ value | capitalize }}" will capitalise the first letter of the value
  • "{{ value | downcase }}" will lowercase the entire value
  • "{{ value | downcase | capitalize }}" will lowercase the entire value then capitalise the first letter
  • "{{ value >= 50 }}" using logic based on a source field value we can set a boolean to true or false

Use of LiquidJS tags is also available for more complex scenarios

Apply the template before any field transformation

A special variable called source_value is available (it holds the same data as value), but a template that uses it is parsed and rendered prior to any field transformations taking place. This is useful if you wish to alter the source field value before any internal transformations (e.g. before we convert the value to a canvas field).

Using source_value means the target_value and value variables are not available.

Examples

  • "<h1>{{ source_value }}</h1>" allows us to surround our source_value with some text before it is converted into the destination field type (e.g. canvas)
  • "{{ source_value | remove: ".aspx" }}" will remove any instance of .aspx from our source value

Transform Composer content with a template

Because of the near-infinite flexibility provided by Composer field configurations, transforming part or all of a Composer field's contents into another field type can only be done by writing our own template to configure how each item in the Composer is "rendered" before the transformation result is added to the destination entry field.

Examples

If we have the following Composer content in JSON containing a number of different data types or "Composer items":

[
  {
    "type": "text",
    "value": "This is my plain text"
  },
  {
    "type": "markup",
    "value": "<p>This is rich <em>text</em> with some <strong>styling</strong></p>"
  },
  {
    "type": "quote",
    "value": {
      "source": "This is the source",
      "text": "This is a quote"
    }
  },
  {
    "type": "number",
    "value": 123456789
  },
  {
    "type": "boolean",
    "value": false
  },
  {
    "type": "location",
    "value": {
      "lat": 51.584151,
      "lon": -2.997664
    }
  },
  {
    "type": "list",
    "value": [
      "Plum",
      "Orange",
      "Banana"
    ]
  },
  {
    "type": "iconWithText",
    "value": {
      "icon": {
        "sys": {
          "id": "51639de0-a1e4-4352-b166-17f86e3558bf"
        }
      },
      "text": "This is my icon text"
    }
  },
  {
    "type": "asset",
    "value": {
      "sys": {
        "id": "e798df96-1de3-4b08-a270-3787b902a580"
      }
    }
  },
  {
    "type": "image",
    "value": {
      "altText": "A photo of Richard Saunders.",
      "asset": {
        "sys": {
          "id": "bc6435eb-c2e3-4cef-801f-b6061f9cdad6"
        }
      }
    }
  }
]

We could supply a template to pull out specific item types into our destination field

The example below will take the list field above and allow the content to be copied into any string type field

{% # use a "for" tag to iterate over our "value" variable (composer field) %}
  {% for c_item in value %}
{% # use an "if" tag to match a composer item type of list in the composer array %}
  {% if c_item.type == 'list' %}
{% # render any list from the composer field, use a "join" filter to convert the value array to a string %}
  {{ c_item.value | join: ', ' }}
{% # close the "if" tag %}
  {% endif %}
{% # close the "for" tag %}
  {% endfor %}

A shorthand example similar to the above uses only LiquidJS filters.

This takes the value (a composer item array), filters for composer items with a type of 'list', maps to just their 'value', takes the first 'list' found and joins its values into a comma-separated string.

{{ value | where: 'type', 'list' | map: 'value' | first | join: ', '  }}

So a composer field containing this JSON

  [{
    "type": "list",
    "value": [
      "Plum",
      "Orange",
      "Banana"
    ]
  }]

becomes Plum, Orange, Banana

Or we can render the same list field data ready to copy into a Rich text or Canvas field; we are free to decorate any value with the required markup so it is presented and transformed correctly.

{% for c_item in source_value %} {% if c_item.type == 'list' %}
<ul>
  {% for l_item in c_item.value %}
  <li>{{l_item}}</li>
  {% endfor %}
</ul>
{% endif %} {% endfor %}

Transforming Composer content to Canvas

To transform the above Composer content into a Canvas field, we need to "render" each Composer item we require as a very simple HTML representation. This becomes the value we pass to the HTML parser, which in turn produces the JSON that allows the Canvas content to be stored in Contensis.

The same approach can be applied to any source field we wish to convert to Canvas content.

We must use the source_value variable in the template instead of the value variable, as the template needs to alter the source value and be applied before the process transforms the value into Canvas.

If the source field (or composer item value) is already a rich text field containing existing markup, we don't need to do any special rendering before it is parsed and converted to Canvas content.

{% for c_item in source_value %}
  {% if c_item.type == 'markup' %}{{ c_item.value }}
  {% elsif c_item.type == 'quote' %}
    <p>{{ c_item.value.source }}</p>
    <blockquote>{{ c_item.value.text }}</blockquote>
  {% elsif c_item.type == 'image' %}
    <img src='{{ c_item.value.asset.sys.uri }}' alt='{{ c_item.value.altText }}'/>
  {% else %}<p>{{ c_item.value | join: '</p><p>' | json }}</p>
  {% endif %}
{% endfor %}

Embedding Component data in Canvas

We can curate and store Component data inline with Canvas content in the Contensis editor.

Component data will likely be encountered as part of a parent composer field when converting long-form composer-curated content to Canvas.

Following on from the examples above, we have a component in the composer data of type iconWithText. We also need to know the API id of the component (which can be found in the composer field definition in the Content type editor), as the component API id is often different from how it is named in the composer field.

{% for c_item in source_value %}
  {% # ... %}
  {% elsif c_item.type == 'iconWithText' %}
    {{ c_item.value | canvas_component: 'iconWithText' }}
  {% # ... %}
  {% endif %}
{% endfor %}

In the above template, when we encounter a composer item with a type of iconWithText we render it by applying the custom filter canvas_component to the composer item value, supplying the component API id as an argument to the filter.

Rendering the component with the custom filter will produce an output that will allow the component (and its content) to be parsed and stored inline in Canvas field content:

<div class='component' data-component='iconWithText' data-component-value='{&quot;icon&quot;:{&quot;sys&quot;:{&quot;id&quot;:&quot;51639de0-a1e4-4352-b166-17f86e3558bf&quot;,&quot;dataFormat&quot;:&quot;entry&quot;,&quot;contentTypeId&quot;:&quot;icon&quot;}},&quot;text&quot;:&quot;This is my icon text&quot;}'></div>

Further component customisation

If you need to customise the component output for the canvas content any further, you can forgo the suggested custom filter and instead render the component data as markup yourself, following the example output above.

{% for c_item in source_value %}
  {% # ... %}
  {% elsif c_item.type == 'iconWithText' %}
<div class='component' data-component='iconWithText' data-component-value='{{ c_item.value | html_encode }}'></div>
  {% # ... %}
  {% endif %}
{% endfor %}

Another custom filter, html_encode, is used here; it is provided to help render the data-component-value attribute with the correct encoding so it can be parsed and embedded into the canvas content.

Curate redundant components as canvas

If it is preferred for any reason, instead of embedding component data inline in the canvas content, you could stop using the component field and have the content curated, stored and rendered from regular canvas content blocks going forward.

You would use a template to render the data from each component field wrapped in simple, appropriate markup, so that after the field data has been copied and converted to canvas it is represented within regular canvas content blocks in Contensis.

Embedding Entry (and asset) links in Canvas

Continuing the example above, we can easily embed an inline entry link from every matched composer item into the canvas content by applying the custom filter canvas_entry.

{% for c_item in source_value %}
  {% # ... %}
  {% elsif c_item.type == 'entry' %}
    {{ c_item.value | canvas_entry }}
  {% # ... %}
  {% endif %}
{% endfor %}

This produces output similar to the HTML below, which can be parsed and saved inside the canvas content.

<a class='inline-entry' data-entry='{&quot;sys&quot;:{&quot;id&quot;:&quot;eee9129e-70fc-4f70-b641-01e160af2438&quot;,&quot;dataFormat&quot;:&quot;entry&quot;,&quot;contentTypeId&quot;:&quot;person&quot;}}'></a>

Another example of embedding an entry link into canvas: here we could be converting existing rich text content to canvas and need to link/append a certain entry at the bottom of every entry's canvas content.

{{ source_value }}
{{ entry.tsAndCs | canvas_entry }}

If we need to hard code a specific entry id into the canvas after a rich text field:

{{ source_value }}
{% capture link_entry %}
{ "sys": { "id": "eee9129e-70fc-4f70-b641-01e160af2438", "contentTypeId": "person" } }
{% endcapture %}
{{ link_entry | from_json | canvas_entry }}

In the final output we apply two custom filters to our link_entry: from_json allows us to use a capture tag to hardcode our own JSON and parse it into an object that can be read normally within the template (something which cannot be done natively in LiquidJS).

Applying the canvas_entry filter then converts the parsed JSON object into markup that is valid for loading into canvas content.

{{ source_value }}
{{ '{ "sys": { "id": "eee9129e-70fc-4f70-b641-01e160af2438", "contentTypeId": "person" } }' | from_json | canvas_entry }}

We can achieve the same effect by applying the filter chain to a hardcoded valid JSON string

Embedding Images in Canvas

Continuing with the composer example above, we can embed an existing image into the canvas content by applying the custom filter canvas_image.

We also need to ensure we have supplied the option to query the Delivery API, as entries returned from a Management API search do not contain the image uri in image fields, whereas the Delivery API does.

{% for c_item in source_value %}
  {% # ... %}
  {% elsif c_item.type == 'image' %}
    {{ c_item.value | canvas_image }}
  {% # ... %}
  {% endif %}
{% endfor %}

This produces output similar to the HTML below, which can be parsed and saved inside the canvas content.

<img src='/image-library/people-images/richard-saunders-blog-image.x67b5a698.png' altText='A photo of Richard Saunders.'/>

Images from existing, external or hardcoded content can be added to the canvas by rendering the image details out as valid markup, including an <img /> tag with a completed src="" attribute.

Complete composer example

{% for c_item in source_value %}
  {% if c_item.type == 'markup' %}{{ c_item.value }}
  {% elsif c_item.type == 'quote' %}
    <p>{{ c_item.value.source }}</p>
    <blockquote>{{ c_item.value.text }}</blockquote>
  {% elsif c_item.type == 'iconWithText' %}{{ c_item.value | canvas_component: 'iconWithText' }}
  {% elsif c_item.type == 'image' %}{{ c_item.value | canvas_image }}
  {% elsif c_item.type == 'asset' or c_item.type == 'entry' %}{{ c_item.value | canvas_entry }}
  {% elsif c_item.type == 'boolean' and c_item.value %}Boolean: Yes
  {% else %}<p>{{ c_item.value | join: '</p><p>' | json }}</p>
  {% endif %}
{% endfor %}

Concatenate multiple entry fields

We can utilise a LiquidJS template to concatenate multiple field values together and copy the result to a destination field.

In the example below we copy the value of the source field to the destination field, but also append any existing value if one exists.

{{value}}{% if existing_value %} - {{existing_value}}{% endif %}

Or we can refer to other fields in the entry variable

{{entry.text}}{% if entry.heading %} - {{entry.heading}}{% endif %}

Development

  • Clone the repository
  • npm install
  • npm run build
  • npm test

Using the REST API

  • npm start will build the project and start the api
  • npm run proxy for debugging network calls - this will do the same as npm start except http calls will be passed through a local proxy
  • npm run mock for debugging mocked tests - same as npm start except network calls are disabled and all network responses will be served from recorded mock data
  • npm run record for recording mocks for tests - same as npm start except network calls are captured and saved into the ./nock/recorded/ folder
  • npm run debug for development with hot-reloading - runs tsc in watch mode and starts the api with nodemon
  • npm test will run all mocha test scripts named ./**/*.spec.js

Failed tests

NetConnectNotAllowedError: Nock: Disallowed net connect for "localhost:3001... This error means the underlying network request could not be found in the list of mock requests. It usually means the code has changed in a way that affects the network calls made to the Management API. If the change was intentional, the failing tests must be "recorded" again by using the npm run record script and then making the same call to the API; this records all the network requests made during the API call and saves them, overwriting the existing mock data in the ./nock/recorded folder.

If the changes made did not warrant a knock-on change to the Management API calls, then a bug might have been introduced. Change your code and rebuild the project (npm run build, npm test) each time until all tests pass with the existing mock data (or network calls).

Debug individual tests or specs by either adding describe.only( to the failing test's describe( function, or by changing the npm test script in package.json to run only the failing test(s). When the test eventually passes, revert the package.json change and run all tests again.
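For example (the spec name here is made up):

// Focus mocha on a single failing spec while debugging, then revert before committing
describe.only('entries are migrated with transformed guids', () => {
  // ...existing tests
});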

Attach a NodeJS debugger to the tests as they run by adding --inspect-brk=9229 to the mocha ... part of the test script in package.json. Each time you run npm test, NodeJS will wait for a debugger to attach before continuing; once a debugger is attached, the process will begin, hitting any breakpoints set in code.

Publish version to npm

You can only do this with a "clean" working copy of the project (i.e. there are no uncommitted or modified git-tracked files in the project)

  • npm version {minor|patch|v0.0.0}
    • this will trigger "pre" scripts in package.json
    • build and test the project
    • perform the version bump
    • stage and commit the version-bumped files to git
  • npm publish
    • will publish the new version of the built package to npm.js
  • !! Push the git commits made by the scripts to the origin repository !!

Install

npm i migratortron