database - Mule: Inserting 470,000 records into Salesforce, which only allows 200 records per iteration


I have a flow in Mule that gets records from a DB, around 470,000 records. These records have to be put into Salesforce. Salesforce only allows me to insert 200 records per iteration! Unfortunately I cannot fetch all of these records in one DB call, as that overloads the machine's memory. The idea is to use the "foreach" component in Mule and keep looping until the DB call returns fewer than 200 records.

How can this be accomplished?

My flow config so far:

<foreach batchSize="200" doc:name="For Each" counterVariableName="foreachCount">
    <jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="select200records" queryTimeout="-1" connector-ref="postgresConnector" doc:name="Database">
        <jdbc-ee:query key="select200records" value="SELECT * FROM parties LIMIT 200 OFFSET #[variable:offset]"/>
    </jdbc-ee:outbound-endpoint>
    <set-variable variableName="dbPayload" value="#[payload]" doc:name="Variable"/>
    <scripting:component doc:name="Script">
        <scripting:script engine="jython"><![CDATA[result = len(payload)]]></scripting:script>
    </scripting:component>
....
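To make the paging logic concrete, here is a small illustrative sketch in plain Python (not Mule) of the loop the flow above implements: keep issuing LIMIT/OFFSET queries until a page comes back with fewer than `page_size` rows. The `fetch_page` callable is a hypothetical stand-in for the JDBC outbound endpoint.

```python
def paged_fetch(fetch_page, page_size=200):
    """Yield pages of rows until the source is exhausted."""
    offset = 0
    while True:
        page = fetch_page(limit=page_size, offset=offset)
        if not page:
            break
        yield page
        if len(page) < page_size:  # short page means we hit the last one
            break
        offset += page_size

# Example: an in-memory "table" of 470 rows standing in for the parties table
rows = list(range(470))

def fetch_page(limit, offset):
    return rows[offset:offset + limit]

pages = list(paged_fetch(fetch_page))  # 3 pages: 200, 200, 70 rows
```

The same termination test (page length < batch size) is what the Jython `result = len(payload)` line in the flow is presumably feeding into.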

Salesforce has a variety of Bulk API limits:

Batches for data loads can consist of a single CSV or XML file that can be no larger than 10 MB.

  • A batch can contain a maximum of 10,000 records.
  • A batch can contain a maximum of 10,000,000 characters for all the data in a batch.
  • A field can contain a maximum of 32,000 characters.
  • A record can contain a maximum of 5,000 fields.
  • A record can contain a maximum of 400,000 characters for all its fields.
  • A batch must contain some content or an error occurs.
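As a rough client-side sanity check, the limits above can be encoded before a batch is submitted. This is a hedged sketch: records are modelled as plain dicts of field name to string value, and the 10 MB file-size limit is not checked here since it depends on the serialized CSV/XML form.

```python
# Documented Bulk API batch limits (from the list above)
MAX_RECORDS_PER_BATCH = 10_000
MAX_CHARS_PER_BATCH = 10_000_000
MAX_CHARS_PER_FIELD = 32_000
MAX_FIELDS_PER_RECORD = 5_000
MAX_CHARS_PER_RECORD = 400_000

def batch_violations(records):
    """Return a list of human-readable limit violations for a batch."""
    problems = []
    if not records:
        problems.append("batch must contain content")
    if len(records) > MAX_RECORDS_PER_BATCH:
        problems.append("too many records in batch")
    total_chars = 0
    for i, rec in enumerate(records):
        if len(rec) > MAX_FIELDS_PER_RECORD:
            problems.append("record %d: too many fields" % i)
        rec_chars = sum(len(str(v)) for v in rec.values())
        total_chars += rec_chars
        if rec_chars > MAX_CHARS_PER_RECORD:
            problems.append("record %d: too many characters" % i)
        for name, value in rec.items():
            if len(str(value)) > MAX_CHARS_PER_FIELD:
                problems.append("record %d, field %s: value too long" % (i, name))
    if total_chars > MAX_CHARS_PER_BATCH:
        problems.append("too many characters in batch")
    return problems
```

For example, `batch_violations([])` reports the empty-batch error, while a batch with a single small record passes cleanly.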

If you are using the Mule Salesforce connector, there is a bulk option, if I recall correctly, that allows more than the default batch size.
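Whichever operation your connector version exposes, the client-side chunking is the same idea: split the full result set into batches of at most 10,000 records (the Bulk API limit above) rather than 200-record synchronous calls. A minimal Python sketch, independent of Mule:

```python
def chunk(records, size=10_000):
    """Split a list of records into consecutive batches of at most `size`."""
    return [records[i:i + size] for i in range(0, len(records), size)]

# 470,000 records -> 47 full batches of 10,000
batches = chunk(list(range(470_000)))
```

At 10,000 records per batch, the 470,000-record load needs only 47 batches instead of 2,350 calls of 200 records each.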

