The Salesforce batch sink is responsible for using the Salesforce API to upsert Salesforce objects. The sink should handle large batches of data (~10 GB) and all object types: contacts, campaigns, opportunities, leads, and custom objects. Users should be able to upload a subset of fields.
...
Section | User Configuration Label | Description | Default | User Widget | Early Validations
---|---|---|---|---|---
Authentication | Username | Salesforce username | | Text Box | Try to log in to the Bulk API with the given credentials.
 | Password | Salesforce password | | Password |
 | Consumer Key | Consumer Key from the connected app | | Text Box |
 | Consumer Secret | Consumer Secret from the connected app | | Password |
 | Login Url | The login URL is different for Salesforce sandbox runs, which is why the user needs this option. | https://login.salesforce.com/services/oauth2/token | Text Box |
Advanced | SObject | Name of the Salesforce sObject, e.g. Contact, Campaign, Opportunity. | | Text Box | Check that an sObject with the given name exists in the Bulk API.
 | Maximum bytes per batch | If the batch data is larger than the given number of bytes, split the batch. | 10,000,000 [2] | Text Box | Fail if greater than 10,000,000. [3]
 | Maximum records per batch | If there are more than the given number of records, split the batch. | 10,000 [2] | Text Box | Fail if greater than 10,000. [4]
 | Error handling | The Bulk API returns success results per row, so this is necessary [1] (unlike for source plugins). Possible values: "Skip on error" - ignores any reports about records not inserted and simply prints an error log. | Skip on error | Select |
...
STEP 3. Split data into batches
Here we create multiple strings with CSV data. Each of them contains a multitude of records; every string represents the data for a separate batch.
Splitting data into batches is done considering these 3 factors:
- User configurations "Maximum bytes per batch" and "Maximum records per batch" must be obeyed.
- Bulk API limitations must be obeyed. Here is the list of them:
  - A. Batches for data loads can consist of a single CSV file that is no larger than 10 MB.
  - B. A batch can contain a maximum of 10,000 records.
  - C. A batch can contain a maximum of 10,000,000 characters for all the data in a batch.
  - D. A field can contain a maximum of 32,000 characters.
  - E. A record can contain a maximum of 5,000 fields.
  - F. A record can contain a maximum of 400,000 characters for all its fields.
  - G. A batch must contain some content or an error occurs.
- A, B, C - controlled by splitting the data correctly.
- E - checked during schema validation.
- D, F, G - if the data coming from the sink exceeds these limits, simply let the batch fail; there is nothing we can do about them. This also depends on how many records come to a specific mapper. For more details see <TODO>.
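As an illustration of the per-record limits (D and F above) that the sink cannot fix by re-batching, here is a minimal sketch of a check that decides whether a single record fits within them. The class and method names are hypothetical, not part of the plugin's actual design, which simply lets such batches fail.

```java
import java.util.List;

// Illustrative check of the Bulk API per-record limits:
// D - a field can contain at most 32,000 characters;
// F - a record can contain at most 400,000 characters across all fields.
public class RecordLimitCheck {
    static final int MAX_FIELD_CHARS = 32_000;
    static final int MAX_RECORD_CHARS = 400_000;

    /** Returns true if every field and the record as a whole fit the Bulk API limits. */
    public static boolean fitsBulkApiLimits(List<String> fieldValues) {
        int totalChars = 0;
        for (String value : fieldValues) {
            if (value.length() > MAX_FIELD_CHARS) {
                return false;  // limit D violated: single field too long
            }
            totalChars += value.length();
        }
        return totalChars <= MAX_RECORD_CHARS;  // limit F
    }
}
```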
STEP 4. Add batches to the job
Pass a ByteArrayInputStream of the CSV string to the Bulk API and ask it to create a batch using it.
STEP 5. Close Job
Ask the Bulk API to close the job. This means that no more batches will be expected by Salesforce.
...
Id,Success,Created,Error
fa4t2fggee,true,true,
rqewetrter,true,true,
,false,false,Field 'Name' is required and cannot be empty for sObject 'Contact'
gre3jvd245,true,true,
Records which have success=false are considered erroneous. Erroneous records are processed according to the user configuration "Error handling". For more information see the section "User configurations".
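A minimal sketch of scanning such a batch result CSV for erroneous rows might look like the following. The class and method names are illustrative; the real plugin would read the result stream returned by the Bulk API.

```java
import java.util.ArrayList;
import java.util.List;

// Scans a Bulk API batch result CSV (Id,Success,Created,Error) and collects
// the error messages of rows whose Success column is false.
public class BatchResultChecker {

    public static List<String> collectErrors(String resultCsv) {
        List<String> errors = new ArrayList<>();
        String[] lines = resultCsv.split("\n");
        // Skip the header line: Id,Success,Created,Error
        for (int i = 1; i < lines.length; i++) {
            // Limit 4 keeps any commas inside the error message intact.
            String[] cols = lines[i].split(",", 4);
            if (cols.length == 4 && !Boolean.parseBoolean(cols[1])) {
                errors.add(cols[3]);
            }
        }
        return errors;
    }
}
```

Depending on the configured error handling strategy, the caller would either log these messages ("Skip on error") or fail the pipeline.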
...
RecordWriter#constructor
Establish a connection to the Salesforce Bulk API.
RecordWriter#write
Append a record to a StringBuilder. If, according to our batching policy, the record needs to go into a new batch, we submit the CSV string to the Salesforce job as a separate batch. After that a new string is created and the record is appended to it.
For information on how we calculate batches please see Splitting data into batches.
RecordWriter#close
- Submit the current CSV StringBuilder as a batch to the Salesforce job.
- Wait for the completion of EVERY batch which was submitted by the current mapper.
- Check the results for every record in EVERY batch (submitted by the current mapper) and act on them according to the error handling strategy configured by the user.
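The write/close lifecycle above can be sketched as follows. This is a simplified, assumption-laden illustration: the Consumer callback stands in for the Bulk API batch submission, and the class name is hypothetical.

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

// Illustrative RecordWriter lifecycle: records accumulate in a StringBuilder
// and are submitted as a batch whenever the configured limits would be
// exceeded; close() submits whatever is left.
public class CsvBatchWriter {
    private final long maxBytesPerBatch;
    private final int maxRecordsPerBatch;
    private final String header;
    private final Consumer<String> submitBatch;  // stands in for the Bulk API call
    private StringBuilder current;
    private int recordCount;

    public CsvBatchWriter(String header, long maxBytesPerBatch, int maxRecordsPerBatch,
                          Consumer<String> submitBatch) {
        this.header = header;
        this.maxBytesPerBatch = maxBytesPerBatch;
        this.maxRecordsPerBatch = maxRecordsPerBatch;
        this.submitBatch = submitBatch;
        this.current = new StringBuilder(header).append('\n');
    }

    public void write(String csvRecord) {
        long recordBytes = (csvRecord + "\n").getBytes(StandardCharsets.UTF_8).length;
        long currentBytes = current.toString().getBytes(StandardCharsets.UTF_8).length;
        // Start a new batch if appending this record would break either limit.
        if (recordCount > 0
            && (recordCount + 1 > maxRecordsPerBatch
                || currentBytes + recordBytes > maxBytesPerBatch)) {
            flush();
        }
        current.append(csvRecord).append('\n');
        recordCount++;
    }

    public void close() {
        if (recordCount > 0) {
            flush();  // an empty batch would be a Bulk API error, so skip it
        }
    }

    private void flush() {
        submitBatch.accept(current.toString());
        current = new StringBuilder(header).append('\n');
        recordCount = 0;
    }
}
```

The waiting-for-completion and result-checking steps are omitted here; they would follow the final flush in close().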
Other points
...
Some points on the behavior if the schema contains fields which are not present, or are not editable (creatable), in the target sObject.
Let's consider a case where a user wants to copy Contacts from one Salesforce instance to another. This is done by simply connecting the Salesforce Source and Sink in an ETL pipeline.
...
Validating schema
An sObject contains a lot of fields which cannot be inserted (non-creatable fields), like Id, isDeleted, LastModifiedDate, and many other fields which are often auto-generated.
These fields are different for every sObject. The good news is that we can query the Salesforce SOAP API and check whether any field of an sObject is creatable.
Based on the above, we do early validation and check whether the schema contains fields which are not present or not creatable in the target sObject.
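To illustrate the idea, here is a hedged sketch of filtering a schema's fields against describe metadata. The Map stands in for the result of a Salesforce SOAP describe call, and the class and method names are hypothetical; the actual plugin performs this as early validation rather than at runtime.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Keeps only schema fields that exist and are creatable in the target sObject.
// sObjectFieldCreatable maps field name -> creatable flag, as would be derived
// from a describeSObject call; absent keys mean the field does not exist.
public class SchemaFieldFilter {

    public static List<String> creatableFields(List<String> schemaFields,
                                               Map<String, Boolean> sObjectFieldCreatable) {
        List<String> kept = new ArrayList<>();
        for (String field : schemaFields) {
            if (Boolean.TRUE.equals(sObjectFieldCreatable.get(field))) {
                kept.add(field);
            } else {
                // This is where early validation would flag the offending field.
                System.out.println("Field not present or not creatable in target sObject: " + field);
            }
        }
        return kept;
    }
}
```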
Converting fields
We will have to convert logical types like date, datetime, and time from long to the string formats accepted by Salesforce. Other types won't require conversion.
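A minimal sketch of these conversions, assuming CDAP-style logical representations (date as days since epoch, datetime as epoch milliseconds, time as milliseconds of day); the method names and exact representations are assumptions, not the plugin's confirmed API.

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalTime;
import java.time.format.DateTimeFormatter;

// Converts epoch-based logical values into ISO-8601 strings that Salesforce
// accepts in CSV batches.
public class LogicalTypeConverter {

    /** date: days since epoch -> "yyyy-MM-dd" */
    public static String convertDate(int daysSinceEpoch) {
        return LocalDate.ofEpochDay(daysSinceEpoch).toString();
    }

    /** datetime: milliseconds since epoch -> ISO-8601 UTC instant */
    public static String convertDatetime(long epochMillis) {
        return DateTimeFormatter.ISO_INSTANT.format(Instant.ofEpochMilli(epochMillis));
    }

    /** time: milliseconds of day -> "HH:mm:ss.SSS'Z'" */
    public static String convertTime(long millisOfDay) {
        return LocalTime.ofNanoOfDay(millisOfDay * 1_000_000L)
            .format(DateTimeFormatter.ofPattern("HH:mm:ss.SSS'Z'"));
    }
}
```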
...