
Managing API limits

Pulling data and pagination

When using certain operations, such as 'Find Records', you will see 'pagination' options which allow you to pull results from Salesforce in batches.

This means you can limit the number of API calls you make - e.g. you can retrieve 3 batches of 2,000 records which match your criteria, instead of making 6,000 individual calls, one for each record.
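For context, this is the mechanism the pagination options wrap: Salesforce's REST query endpoint returns results a page at a time, with a nextRecordsUrl pointing at the next batch. Below is a minimal Python sketch of that loop - the instance URL, access token, API version and query are placeholder assumptions, not values from the workflow below:

```python
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder - assumption
HEADERS = {"Authorization": "Bearer <access token>"}     # placeholder - assumption

def fetch_warm_leads():
    """Yield all leads rated 'Warm', one page (up to 2,000 records) per API call."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v57.0/query",
        headers=HEADERS,
        params={"q": "SELECT Id, Email FROM Lead WHERE Rating = 'Warm'"},
    )
    resp.raise_for_status()
    page = resp.json()
    while True:
        yield from page["records"]        # one batch of results
        if page["done"]:                  # no nextRecordsUrl left to follow
            break
        resp = requests.get(INSTANCE_URL + page["nextRecordsUrl"], headers=HEADERS)
        resp.raise_for_status()
        page = resp.json()

for lead in fetch_warm_leads():
    pass  # e.g. send each lead on to the third-party service
```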

The following workflow shows a basic pagination system where we are pulling batches of leads from Salesforce with a rating of 'warm' in order to loop through them and send them to another third-party service:

For a full explanation and pre-built pagination workflow please see our 'Paginate through Salesforce Records' template:

Paginate through Salesforce Records

Processes Salesforce records in batches when a query has returned multiple pages of results


You can adjust this template to your exact needs any time you need to paginate through batches of pulled SFDC records.

Pushing data and batch / bulk operations

As mentioned above, when you are pushing large batches of data to Salesforce, you will need to control the rate at which you do so in order to avoid exceeding your API limits. There are two main approaches:

  • 'Bulk upsert' is effectively two operations in one: if a record of a particular type (e.g. 'lead') is found it will be updated; if it is not found, a new record of that type (e.g. 'lead') will be created

  • 'Batch Update' / 'Batch Create' can be used together when, in the case where a record of a particular type (e.g. 'account') is not found, you wish to create a record of a different type (e.g. 'lead')

Bulk upsert

PLEASE NOTE: The 'Bulk upsert' operation only accepts data in CSV format

This demo imagines a scenario where, for a particular Salesforce Account, you have pulled together information on a number of that Account's contacts (i.e. the people who work for that company) from several sources (e.g. enrichment data from ZoomInfo and Clearbit), and you want to achieve the following:

  • upload this contact information to Salesforce using a limited number of API calls

  • ensure that, if a contact doesn't exist in Salesforce, the upsert operation auto-creates a new contact using the information provided

The following screenshot shows that we are picking the workflow up from the point where we have pulled the contact data together into data storage.

We then take the following basic steps:

  1. Fetch the records and count the total number

  2. If the total number of records found is only 1, use the single 'Create/update record' operation

  3. If the total number of records found is more than 1, create a CSV to store the contacts in (this is because data must be in CSV format for the Bulk upsert operation)

  4. Export the CSV file and upload the data to Salesforce using 'Bulk upsert'

In detail, the steps being followed are:

IMPORTANT!: When implementing a solution like the above, you would need to make sure that you do not overload data storage (the limit for data stored under a single key is 400K). So if you are processing thousands of records, you would likely use a Callable Workflow to send them off for sub-processing.
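For reference, the same build-a-CSV-and-upload flow can be sketched directly against Salesforce's Bulk API 2.0. This is illustrative only - the instance URL, token, API version, sample contacts, and the assumption that Email is configured as an external ID field on Contact are all placeholders:

```python
import csv
import io
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder - assumption
HEADERS = {"Authorization": "Bearer <access token>",     # placeholder - assumption
           "Content-Type": "application/json"}

contacts = [  # contact data pulled together from your enrichment sources
    {"Email": "ada@example.com", "FirstName": "Ada", "LastName": "Lovelace"},
    {"Email": "alan@example.com", "FirstName": "Alan", "LastName": "Turing"},
]

# 1. Build the CSV in memory - Bulk upsert only accepts CSV
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Email", "FirstName", "LastName"])
writer.writeheader()
writer.writerows(contacts)

# 2. Create the ingest job - one job instead of one API call per record.
#    Assumes Email is configured as an external ID field on Contact.
job = requests.post(
    f"{INSTANCE_URL}/services/data/v57.0/jobs/ingest",
    headers=HEADERS,
    json={"object": "Contact", "operation": "upsert",
          "externalIdFieldName": "Email", "contentType": "CSV",
          "lineEnding": "CRLF"},  # Python's csv module writes CRLF by default
).json()

# 3. Upload the CSV data to the job
requests.put(
    f"{INSTANCE_URL}/services/data/v57.0/jobs/ingest/{job['id']}/batches",
    headers={"Authorization": HEADERS["Authorization"],
             "Content-Type": "text/csv"},
    data=buf.getvalue(),
).raise_for_status()

# 4. Mark the upload complete so Salesforce queues the job for processing
requests.patch(
    f"{INSTANCE_URL}/services/data/v57.0/jobs/ingest/{job['id']}",
    headers=HEADERS,
    json={"state": "UploadComplete"},
).raise_for_status()
```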

Polling a bulk upsert

When a bulk operation is used, Salesforce does not process the data immediately; instead it starts a bulk data load job. The time this job takes to finish depends on the resources available in your Salesforce instance.

When you run a bulk data operation, it returns a Job ID (shown as just id in the connector step output).

This job ID can then be used to poll for the status of the job. Only when the job shows a status of JobComplete has the data been successfully processed in Salesforce.

The following workflow shows a Bulk upsert job being started, pulling a CSV file from a trigger:

The Repeat polling call step uses the 'Loop Forever' operation.

The Poll SF - check job status step uses the 'Get job info' operation. It pulls in the 'Job ID' using the $.steps.salesforce-1.id jsonpath.

On each iteration of the Loop, we check if the job has succeeded by looking at the state field. If the job has completed it will show a state of JobComplete:

Is job complete? is a boolean step which checks if $.steps.salesforce-2.state is equal to JobComplete.

IMPORTANT!: As per Salesforce's API docs, the completed status could also be UploadComplete. The following screenshot from the Salesforce docs shows the different job statuses you might check for:

The TRUE branch of the boolean contains the Break loop step (referring to the correct loop - 'loop-1' in this case).

The FALSE branch of the boolean has the Delay and repeat loop step:

You can set the delay to be e.g. 1 minute before the next check.

A key point here is that each check is an API call, so if you are running a large update job you don't want to be checking every 10 seconds!

IMPORTANT!: When using the 'Loop forever' operation to poll for a status you should factor in the possibility that a status will never be reached. To allow for this you should include a check on how long the loop has been running, as illustrated in our Loop Connector documentation.
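Pulled together, the whole polling pattern (Loop Forever, status check, delay, break, plus the max-runtime guard just mentioned) looks roughly like this in Python - the instance URL, token, and API version are placeholder assumptions:

```python
import time
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder - assumption
HEADERS = {"Authorization": "Bearer <access token>"}     # placeholder - assumption

def wait_for_job(job_id, delay_secs=60, max_wait_secs=3600):
    """Poll 'Get job info' until the bulk job finishes.

    Each check is one API call, so keep delay_secs generous for big jobs."""
    deadline = time.monotonic() + max_wait_secs
    while time.monotonic() < deadline:  # guard against a state that never arrives
        resp = requests.get(
            f"{INSTANCE_URL}/services/data/v57.0/jobs/ingest/{job_id}",
            headers=HEADERS,
        )
        resp.raise_for_status()
        info = resp.json()
        if info["state"] == "JobComplete":      # data fully processed
            return info
        if info["state"] in ("Failed", "Aborted"):
            raise RuntimeError(f"Bulk job {job_id} ended in state {info['state']}")
        time.sleep(delay_secs)                  # the 'Delay and repeat' step
    raise TimeoutError(f"Bulk job {job_id} still running after {max_wait_secs}s")
```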

Get Job Information

Add a final Salesforce connector step to your workflow. This is used to gather the job information (how many records were processed, which failed, time taken, etc.). Set the operation to 'Get job info' and the 'Job ID' to: $.steps.salesforce-1.id.

In your Debug panel you should see results similar to below:
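If you need the per-record outcomes rather than just the summary counts, the Bulk API also exposes the results as CSV. A hedged sketch, with the same placeholder credentials and API version as above:

```python
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder - assumption
HEADERS = {"Authorization": "Bearer <access token>"}     # placeholder - assumption

def fetch_failed_rows(job_id):
    """Return the per-record failures for a bulk job as CSV text
    (the successfulResults/ endpoint works the same way)."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v57.0/jobs/ingest/{job_id}/failedResults/",
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.text  # CSV including sf__Id and sf__Error columns
```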

Batch update / batch create

As mentioned above, there may be a bulk upload scenario where, if a record is not found, you want to create a record of a different type (which is not possible with 'Bulk upsert').

This section is a very simple demo of using a combination of the 'Batch update records' and 'Batch create records' operations to achieve this.

IMPORTANT! The Salesforce API will only accept batch update / create lists in an exact format, as explained in the above note on batch operations input structure. The example below shows a way of formatting your input using the Object Helpers 'JSON parse' operation.

IMPORTANT! When processing batches of data using Data Storage lists, you need to be aware that the storage limit under one key is 400K. One way to handle this is for batches to be sent for parallel processing to a callable workflow which uses 'current run' data storage that is cleared after each run (as discussed in our guide to workflow threads). Another approach might be to use the CSV Editor.

The scenario is that a callable workflow is receiving batches of data such as the following:

The idea is that all records with an object_id are pre-existing Salesforce Accounts that need to be updated with Phone and BillingCity.

Records without an object_id do not yet exist in Salesforce, and we need to turn them into Leads, which might become Accounts at a later stage.
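As a rough sketch of what this update-or-create split maps to in Salesforce's REST API (here via the sObject Collections endpoint, a close analogue of the batch operations), assuming placeholder credentials and an illustrative batch shape:

```python
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder - assumption
HEADERS = {"Authorization": "Bearer <access token>",     # placeholder - assumption
           "Content-Type": "application/json"}

batch = [  # illustrative shape of one incoming batch - field names are assumptions
    {"object_id": "0015g00000AbCdE", "name": "Acme", "phone": "555-0100", "city": "Leeds"},
    {"object_id": None, "name": "Globex", "phone": "555-0101", "city": "York"},
]

updates, creates = [], []
for rec in batch:
    if rec["object_id"]:
        # Pre-existing Account: update Phone and BillingCity
        updates.append({"attributes": {"type": "Account"}, "id": rec["object_id"],
                        "Phone": rec["phone"], "BillingCity": rec["city"]})
    else:
        # Not yet in Salesforce: create a Lead instead
        # (LastName is required on Lead - the company name stands in here)
        creates.append({"attributes": {"type": "Lead"}, "Company": rec["name"],
                        "LastName": rec["name"], "Phone": rec["phone"]})

url = f"{INSTANCE_URL}/services/data/v57.0/composite/sobjects"
if updates:
    requests.patch(url, headers=HEADERS,
                   json={"allOrNone": False, "records": updates}).raise_for_status()
if creates:
    requests.post(url, headers=HEADERS,
                  json={"allOrNone": False, "records": creates}).raise_for_status()
```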

The screenshot below shows a callable workflow which is receiving these batches of records:

The data being received by the Callable Trigger is in JSON format:

The following steps are taken through the course of the workflow: