
Data transformation guide

Overview

When working with data in Tray, you will often need to carry out certain transformations.

Examples of this are:

  • Filtering data to exclude certain results

  • Enriching data by combining datasets or pulling in key additional information from other sources

  • Formatting data (such as date information) so that it is accepted by the destination service / database

  • Mapping data when you need to rename fields or values to meet the naming conventions of a destination service

  • Structuring data so that it satisfies the input schema of the destination service / database

Filtering data

Include / Exclude items with particular values

For example you may wish to include / exclude all entries in a list where id = null:

[Image: filter-include-exclude]

In this case you can use the List Helpers 'filter' operation (using either 'inclusive' or 'exclusive' as the filter type):

Or you could use the JSON transformer = and != operators:
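The logic of both filter types can be sketched in plain Python (illustrative only, with hypothetical sample data; it is not the Tray connector itself):

```python
# Hypothetical list of records, some with a null (None) id
entries = [
    {"id": 1, "name": "Alice"},
    {"id": None, "name": "Bob"},
    {"id": 3, "name": "Carol"},
]

# 'Exclusive' filter: drop entries where id is null
without_null_ids = [e for e in entries if e["id"] is not None]

# 'Inclusive' filter: keep only entries where id is null
only_null_ids = [e for e in entries if e["id"] is None]
```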

Get a list of all available values for a particular field

For example you may wish to get all the possible values for 'title' in a 'staff' dataset:

[Image: get-available-values]

For this you could use the List Helpers 'Pluck' operation:

Or you could use dot notation with the JSON Transformer:
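The effect of 'Pluck' can be sketched in Python (an illustrative sketch with hypothetical sample data, not the connector itself):

```python
# Hypothetical 'staff' dataset
staff = [
    {"name": "Ana", "title": "Engineer"},
    {"name": "Ben", "title": "Designer"},
    {"name": "Cal", "title": "Engineer"},
]

# Pluck every 'title', then de-duplicate while preserving order
titles = [person["title"] for person in staff]
unique_titles = list(dict.fromkeys(titles))
```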

Formatting Data

Setting date format

There may be times when you receive a date in a particular format, which is then not accepted by a service which you need to pass the date into.

For example you may receive a date in the 'date' format:

'Fri Aug 27 2021' (ddd MMM DD YYYY)

But the destination service only accepts it in the dateTime format:

'2021-08-27T12:00:44.000+0000' (yyyy-MM-dd'T'HH:mm:ss.SSSZ)

Since a dateTime refers to a specific time of day, you may also need to set a dateTime range in order to cover a whole date.

The example below shows how to get all Salesforce opportunities created on a particular date:
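The date handling behind such a query can be sketched in Python (illustrative only; in Tray the equivalent is configured via the date connectors in the UI):

```python
from datetime import datetime, timedelta

# Parse the incoming 'date' format ('Fri Aug 27 2021')
raw = "Fri Aug 27 2021"
day = datetime.strptime(raw, "%a %b %d %Y")

# Build a dateTime range covering that whole day,
# e.g. for querying opportunities created on that date
start = day.strftime("%Y-%m-%dT%H:%M:%S.000+0000")
end = (day + timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%S.000+0000")
```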

Working with Unix timestamps

When working with Unix timestamps, there are some very quick and easy functions you can use with the JSON transformer:
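The underlying conversions are simple either way; a Python sketch of converting a Unix timestamp to an ISO 8601 string and back (the timestamp value here is hypothetical):

```python
from datetime import datetime, timezone

# Convert a Unix timestamp (in seconds) to an ISO 8601 string
ts = 1630065644
iso = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

# And convert it back again
round_trip = int(datetime.fromisoformat(iso).timestamp())
```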

Transform one field in each object

When you have an array of data you may need to perform a transformation such as the following where you have to convert SKU from a string to a number:

[Image: sku-string-to-int]

This can be done using the Text Helpers connector in combination with a Loop, Object Helpers and Data Storage (please download and import the sample project to check all step settings and setup):

Or could be done using the JSON transformer:
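The per-object conversion itself can be sketched in Python (illustrative, with hypothetical sample data; in Tray you would use the connector steps described above):

```python
# Hypothetical orders where 'sku' arrives as a string
orders = [
    {"sku": "10045", "qty": 2},
    {"sku": "10046", "qty": 1},
]

# Convert the SKU in each object from a string to a number,
# leaving every other field untouched
converted = [{**order, "sku": int(order["sku"])} for order in orders]
```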

Mapping Data

Joining fields/values for each item

For example you may wish to concatenate 'first_name' and 'last_name' to get 'full_name':

[Image: join-values-together]

For this you could use the Loop Connector and Data Storage 'Append to list' operation:

Or you could use the JSON Transformer '&' operator:
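The concatenation logic itself looks like this in Python (an illustrative sketch with hypothetical sample data):

```python
# Hypothetical contacts with separate name fields
people = [
    {"first_name": "Ada", "last_name": "Lovelace"},
    {"first_name": "Alan", "last_name": "Turing"},
]

# Join first_name and last_name into a new full_name field on each item
with_full_names = [
    {**p, "full_name": f"{p['first_name']} {p['last_name']}"} for p in people
]
```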

Simple field mapping

For example you may wish to rename fields so they match the schema of the destination service:

[Image: simple-map--fields]

For this you could use the Loop Connector and Data Storage 'Append to list' operation:

Or you could use the JSON Transformer:
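A field-renaming pass can be sketched in Python (the field names here are hypothetical; in Tray you would use the connector steps described above):

```python
# Hypothetical mapping from source field names to destination field names
field_map = {"fname": "first_name", "sname": "last_name"}

records = [{"fname": "Ada", "sname": "Lovelace"}]

# Rename each field according to the mapping; unmapped fields pass through
mapped = [{field_map.get(k, k): v for k, v in r.items()} for r in records]
```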

Simple value mapping

There may be times when you need to map certain values, as per the following example where you want to map all American states to 'US' and all European countries to 'Europe':

[Image: simple-map-values]

You could do this with the Data Mapper:

Or using the JSON transformer:
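Value mapping boils down to a lookup table; an illustrative Python sketch with hypothetical sample data:

```python
# Hypothetical lookup table; unlisted values pass through unchanged
region_map = {
    "Texas": "US",
    "California": "US",
    "France": "Europe",
    "Germany": "Europe",
}

contacts = [
    {"name": "Ana", "region": "Texas"},
    {"name": "Ben", "region": "France"},
]

# Replace each region value via the lookup table
mapped = [{**c, "region": region_map.get(c["region"], c["region"])} for c in contacts]
```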

Enriching data

Merge two datasets

You may need to merge two datasets which contain information about the same entities, such as:

[Image: merge-datasets]

This can be done using the JSON transformer 'merge' function:
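The merge logic can be sketched in Python, assuming both hypothetical datasets share an 'id' key (illustrative only, not the transformer itself):

```python
# Two hypothetical datasets describing the same people, keyed by 'id'
emails = [
    {"id": 1, "email": "ada@example.com"},
    {"id": 2, "email": "alan@example.com"},
]
titles = [
    {"id": 1, "title": "Engineer"},
    {"id": 2, "title": "Mathematician"},
]

# Index the second dataset by id, then merge each matching pair of records
titles_by_id = {t["id"]: t for t in titles}
merged = [{**e, **titles_by_id.get(e["id"], {})} for e in emails]
```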

Structuring data

Flattening nested data (single object)

Often services return data in a very complex 'nested' format.

You may need to 'flatten' this data into a single layer contained within one object, in order to send to another service, or to make the data more accessible to later steps in your workflow:

[Image: tidy-to-simple-object]

You could achieve this with the JSON transformer:
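The flattening itself can be sketched in Python (an illustrative recursive sketch over hypothetical data; key names are joined with '_' as one possible convention):

```python
# Hypothetical nested record returned by a service
nested = {
    "name": "Ada",
    "address": {"city": "London", "country": {"code": "GB"}},
}

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into one layer, joining keys with '_'."""
    flat = {}
    for key, value in obj.items():
        full_key = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, f"{full_key}_"))
        else:
            flat[full_key] = value
    return flat

flat = flatten(nested)
```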

Flattening nested data (list)

When sending multiple items of data in lists, some services, particularly SQL databases, require data to be sent in a 'flat' format.

So you may have to carry out a transformation such as the following:

[Image: flatten-data]

You could achieve this with the Data Mapper:

Or with the JSON transformer:
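Applied to a list, the same idea is a per-row flatten; an illustrative Python sketch with hypothetical rows:

```python
# Hypothetical rows with one level of nesting
rows = [
    {"name": "Ada", "address": {"city": "London"}},
    {"name": "Alan", "address": {"city": "Manchester"}},
]

def flatten_row(row):
    """Flatten one level of nesting in a row, joining keys with '_'."""
    flat = {}
    for key, value in row.items():
        if isinstance(value, dict):
            for inner_key, inner_value in value.items():
                flat[f"{key}_{inner_key}"] = inner_value
        else:
            flat[key] = value
    return flat

flat_rows = [flatten_row(r) for r in rows]
```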

Satisfying input schema

Many services have specific input schema requirements, especially for running batch updates.

For example for the 'batch update' operation, Salesforce would require a transformation such as the following:

[Image: satisfy-input-schema]

This example shows you how to 'hardcode' the fields to be dealt with in your Salesforce payload.

Please see the Satisfying input schema (changing payloads) example below for a more advanced implementation which automatically picks up the fields present in your payload.

You could solve this by using the Loop Connector and Data storage:

Or you could use the JSON transformer dot notation:
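The restructuring can be sketched in Python; note the record data and the target shape (an 'id' plus a grouped 'fields' object) are hypothetical stand-ins here, not Salesforce's actual schema:

```python
# Hypothetical incoming records, e.g. from an earlier workflow step
records = [
    {"Id": "0061", "StageName": "Closed Won", "Amount": 500},
    {"Id": "0062", "StageName": "Prospecting", "Amount": 200},
]

# Hardcode the fields to restructure into a hypothetical batch-update shape:
# one 'id' per record, with the updatable fields grouped separately
payload = [
    {"id": r["Id"], "fields": {"StageName": r["StageName"], "Amount": r["Amount"]}}
    for r in records
]
```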

Sending payloads to SQL DBs

Generally speaking you will need to 'flatten' data in order to send it to SQL DBs, as in the examples given above.

There may be occasions when you receive payloads of data where the same fields are not always present.

As in the example below, some staff members have location data while others don't, and some have a title while others don't:

[Image: simple-changing-payload]

When you need to send this data to a SQL database this will often present no issue, as your tables will be correctly set up to handle nullable values.

However, you will need to carry out a transformation if either of the following is the case:

  • The names of the fields are not the same as they are in your destination DB

  • There are some nested values within the data

In this case you can use the above dot notation methods explained in 'Flattening nested data'.

You can simply list out all of the potential mappings using the Data Mapper:

Or using the JSON transformer:
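Listing out the potential mappings with safe defaults can be sketched in Python (the field and column names are hypothetical; missing fields become None, which arrive as SQL NULLs):

```python
# Hypothetical staff records where 'location' and 'title' are not always present
staff = [
    {"name": "Ana", "location": {"city": "Austin"}, "title": "Engineer"},
    {"name": "Ben"},
]

# Map to flat column names, defaulting any missing fields to None
rows = [
    {
        "staff_name": s["name"],
        "staff_city": s.get("location", {}).get("city"),
        "staff_title": s.get("title"),
    }
    for s in staff
]
```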

Delete one field in each object

When you have an array of data you may need to perform a transformation such as the following where you have to delete the 'Description' from each object:

[Image: transform-delete-function]

This can be done using the JSON transformer:
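The per-object deletion can be sketched in Python (illustrative, with hypothetical sample data):

```python
# Hypothetical products, each carrying a 'Description' field to remove
products = [
    {"sku": 1, "Description": "old", "price": 10},
    {"sku": 2, "Description": "old", "price": 20},
]

# Drop 'Description' from each object, keeping everything else
trimmed = [{k: v for k, v in p.items() if k != "Description"} for p in products]
```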

Add (zip) extra fields into each object

In order to satisfy input requirements for certain services you may need to add extra fields to each object, such as the following example when working with Slack modals:

[Image: zip-extra-fields]

Please see the following video for a detailed walkthrough of this use case:
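The zip operation itself pairs up two lists and merges each pair; an illustrative Python sketch with hypothetical data:

```python
# Hypothetical base items and the extra fields to zip into each one
items = [{"name": "Ana"}, {"name": "Ben"}]
extras = [{"index": 0}, {"index": 1}]

# Zip the two lists together, merging each pair of objects
combined = [{**item, **extra} for item, extra in zip(items, extras)]
```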

Managing dynamic data (Advanced)

Satisfying input schema (changing payloads)

The above Satisfying input schema example shows how to restructure a Salesforce payload when you know exactly what fields are coming through.

However there may be times when the fields which come through are not always the same, or you have made an adjustment to the list of fields.

The following screenshot shows a very simple example where 'Phone' is present in one payload but not in another:

[Image: sfdc-dynamic-input-schema]

We can take care of this using JSONata dot notation and a few built-in functions:
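The dynamic version of the restructuring can be sketched in Python: instead of hardcoding the field list, build each output from whatever fields are actually present (the payloads and the 'id'/'fields' target shape are hypothetical stand-ins):

```python
# Hypothetical payloads where 'Phone' is present in one record but not another
payloads = [
    {"Id": "0061", "Phone": "555-0100", "Name": "Acme"},
    {"Id": "0062", "Name": "Globex"},
]

# Pick up whichever fields each record contains, rather than a fixed list
restructured = [
    {"id": p["Id"], "fields": {k: v for k, v in p.items() if k != "Id"}}
    for p in payloads
]
```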