Pushing data in batches (chunking)

When sending batches of data to create or update records, it is important to note that services will often have a limit on the number of records that can be processed in one call.

For example, the Salesforce 'batch create' operation only allows you to create 200 records at a time.

So if we try to batch create 1000 records, we will see the following error:

To work around this, we can use the List Helpers 'Chunk' operation to divide the 1000 records into batches of 200.
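As a rough illustration of what the 'Chunk' operation does, the following sketch splits a list into fixed-size batches in plain Python (the record fields shown are hypothetical; the 200-record limit comes from the Salesforce example above):

```python
def chunk(records, size):
    """Split a list into consecutive batches of at most `size` items."""
    return [records[i:i + size] for i in range(0, len(records), size)]

# 1000 hypothetical records, chunked to respect the 200-record limit
records = [{"Name": f"Account {n}"} for n in range(1000)]
batches = chunk(records, 200)
# len(batches) is 5, and each batch holds 200 records
```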

The resulting 5 batches of 200 records can then be looped through.

Each batch of 200 can then be transformed to meet the input schema requirements, and passed to Salesforce 'batch create':
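The transform-and-send loop can be sketched as follows. This is a minimal illustration, not the real connector API: `transform` and `salesforce_batch_create` are stand-ins for the workflow steps, and the field mapping is assumed.

```python
def transform(record):
    # Map an internal field name onto the (assumed) Salesforce input schema.
    return {"Name": record["name"]}

def salesforce_batch_create(payload):
    # Placeholder for the Salesforce 'batch create' step in the workflow.
    return {"created": len(payload)}

records = [{"name": f"Account {n}"} for n in range(1000)]
batches = [records[i:i + 200] for i in range(0, len(records), 200)]

results = []
for batch in batches:  # loop through the 5 batches of 200
    payload = [transform(r) for r in batch]
    results.append(salesforce_batch_create(payload))

total_created = sum(r["created"] for r in results)
```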

If your processing requirements are complex, or you are dealing with large volumes of data, it is best practice to send the chunked batches to a callable workflow, which enables efficient parallel processing.
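The parallel pattern can be sketched like this, assuming each batch can be handed to an independent worker (the analogue of a callable workflow). `process_batch` is a hypothetical stand-in for the called workflow:

```python
from concurrent.futures import ThreadPoolExecutor

def process_batch(batch):
    # Placeholder for the callable workflow that creates one batch of records.
    return len(batch)

records = list(range(1000))
batches = [records[i:i + 200] for i in range(0, len(records), 200)]

# Batches are processed concurrently instead of one after another.
with ThreadPoolExecutor(max_workers=5) as pool:
    created_counts = list(pool.map(process_batch, batches))
```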