Tray Embedded / Advanced Topics / Data Mapping / Hardcoded example

Hardcoded example

The following solution describes a scenario in which:

  • Single customer data objects from a source service are coming into a webhook-triggered workflow,

  • You want to map the fields in that data so it matches the field names (keys) in your destination service.

The incoming data is in the following format:

data-mapping-incoming-webhook-data

While the destination data (in this case we are using Airtable as a generic placeholder for any service you might be mapping data to, such as Salesforce, Hubspot or Marketo) is in the following format:

data-mapping-airtable-destination-format

So you can see that we want to map the following fields:

  Service 1   Service 2
  name        account_name
  id          stripe_id
  phone       home_phone
  email       email_address
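To illustrate what this mapping does to a record, here is a minimal Python sketch. The field names come from the table above; the function and sample payload are illustrative, not Tray's implementation:

```python
# Hardcoded mapping from Service 1 (source) field names
# to Service 2 (destination) field names, per the table above.
FIELD_MAP = {
    "name": "account_name",
    "id": "stripe_id",
    "phone": "home_phone",
    "email": "email_address",
}

def map_record(record: dict) -> dict:
    """Rename each known key to its destination name; unmapped keys are dropped."""
    return {dest: record[src] for src, dest in FIELD_MAP.items() if src in record}

# Hypothetical incoming webhook payload for a single customer.
incoming = {
    "name": "Jane Smith",
    "id": "cus_123",
    "phone": "555-0100",
    "email": "jane@example.com",
}
print(map_record(incoming))
```

The destination record now uses the Service 2 keys, so it can be passed directly to the destination connector step.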

Creating a data mapping Solution

There are two main steps involved in creating a data mapping solution:

  1. Create the source workflow with the 'Mapping' project config and run some test data through it

  2. Turn this workflow into a solution, and make the data mapping config available for End Users to configure

1 - Create and test the workflow

The webhook-triggered workflow for this solution is very simple:

data-mapper-webhook-to-airtable-workflow

The following is a break-down of the three connector steps involved:

2 - Creating the Solution

In the Solution Editor, to make the mapping option available to your End Users, make sure the 'Use data mapping' box is ticked.

Then click on the pencil icon for the 'Mapping' config:

data-mapper-hardcoded-solution-editor-2

You can then set up the hardcoded mapping for the two services:

Remember that the field names for Service 2 must exactly match the column names in your Airtable database:

data-mapping-airtable-destination-format-zoomed

Please see the note on determining service field names below for more guidance on establishing the field name requirements for the source and destination services.

3 - End User activates instance

The final stage of the process is when the End User activates an Instance via the Config Wizard. They can then choose the mapping based on the hardcoded fields you have made available to them:

data-mapper-hardcoded-config-wizard

Implementation notes

Determining service field names

When you are mapping from one service to another, some investigation is required to determine what these field names are.

So in a case where we are mapping customers from Stripe to Salesforce:

sf-stripe

We can look at the debug output of the 'List customers' operation for the Stripe connector to find that 'name', 'id', 'phone' and 'currency' are the field names used by Stripe:

stripe-list-customers-output

To determine the fields that need to be mapped to in Salesforce we inspect the output for the Create Record (Account) operation:

salesforce-create-record-account-output

Note that Stripe_ID__c and Currency__c are custom fields added specifically for mapping from Stripe.

You can then use this to build the list for Service 2:

salesforce-service-2-mappings
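Putting the discovered field names together, the resulting Service 1 → Service 2 pairings might look like the following Python sketch. Note the assumptions: 'Name' and 'Phone' are standard Salesforce Account fields and are used here illustratively, while Stripe_ID__c and Currency__c are the custom fields mentioned above (the '__c' suffix is how Salesforce marks custom fields):

```python
# Stripe customer fields (from the 'List customers' debug output)
# paired with Salesforce Account fields. Stripe_ID__c and Currency__c
# are the custom fields added for this mapping; Name and Phone are
# assumed standard Account fields.
STRIPE_TO_SALESFORCE = {
    "name": "Name",
    "id": "Stripe_ID__c",
    "phone": "Phone",
    "currency": "Currency__c",
}

# Hypothetical Stripe customer record for illustration.
customer = {"name": "Acme Ltd", "id": "cus_456", "phone": "555-0199", "currency": "gbp"}
account = {sf: customer[stripe] for stripe, sf in STRIPE_TO_SALESFORCE.items()}
print(account)
```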

Input schema requirements

The first example on this page, mapping data to Airtable, is very simple because the Airtable 'Create records' operation accepts data as a simple list of fields.

However, some services, such as Salesforce, will only accept data as key / value pairs.

In this case you will need to add a second data mapper step which makes use of the 'Map objects to list' operation:

sf-map-key-value-pairs

And the output from the second data mapper can then be used as the input for the fields when creating a record in Salesforce:

sf-key-value-pairs-create-account

As demonstrated by the debug output from the 'Map key/value pairs' step:

map-key-value-pairs-debug
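The transformation performed by this second data mapper step can be sketched as follows. This is illustrative Python, not Tray's internals: it simply turns a flat object into the list of key / value pairs that services such as Salesforce expect.

```python
def to_key_value_pairs(obj: dict) -> list:
    """Convert a flat object into a list of {'key': ..., 'value': ...} pairs."""
    return [{"key": k, "value": v} for k, v in obj.items()]

# Hypothetical output of the first (field-renaming) data mapper step.
mapped = {"Name": "Acme Ltd", "Stripe_ID__c": "cus_456"}
print(to_key_value_pairs(mapped))
# [{'key': 'Name', 'value': 'Acme Ltd'}, {'key': 'Stripe_ID__c', 'value': 'cus_456'}]
```

Each pair in the resulting list can then feed the fields input of the Salesforce 'Create Record' step.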

Mapping nested data

The above example shows you how to deal with simple 'flat' data payloads.

However, it is likely that you will be receiving payloads of 'nested' data, where fields are contained within objects, such as the following:

nested-payload

To deal with this, when converting the Mapping to config in your Source Workflow using the Data Mapper 'Map objects' operation:

nested-mapping-set-config

You can simply use dot notation to specify the object and the fields within:

nested-mapping-dot-notation

These mappings can then be made available to your End Users in the Config Wizard.

In Service 1:

nested-mapping-service-1

And Service 2:

nested-mapping-service-2

The following debug output shows this at work:

nested-mapping-debug
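Dot notation can be read as a path into the nested payload. The lookup works roughly like this (an illustrative Python sketch with a made-up payload shape; the helper is not part of Tray):

```python
def get_by_path(payload: dict, path: str):
    """Follow a dot-notation path (e.g. 'customer.contact.email') into nested objects."""
    value = payload
    for part in path.split("."):
        value = value[part]
    return value

# Hypothetical nested webhook payload.
payload = {
    "customer": {
        "name": "Jane Smith",
        "contact": {"email": "jane@example.com", "phone": "555-0100"},
    }
}
print(get_by_path(payload, "customer.contact.email"))  # jane@example.com
```

Each dot descends one level into the objects, which is why 'customer.contact.email' in the mapping config resolves to the email field buried two objects deep.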