Why am I double charged on DML limits in a process builder?

This question is a follow-up to another question, where the accepted answer states that for each Update Records action in a process, the following limits are consumed:

As you can therefore conclude, it is 1 query and 1 DML statement per
chunked transaction.

I was running some benchmark tests with process builder in a brand new dev org that has no other automations running. The process itself is super simple, firing on every account insert/update and updating 1 field:

[Screenshot: the process definition]

I was then inserting a list of 200 accounts:

List<Account> accounts = new List<Account>();
for(Integer i = 0; i < 200; i++){
    Account a = new Account(Name='Account '+i);
    accounts.add(a);
}
insert accounts;
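
If you want the exact counters without reading the raw debug log, the same benchmark can be instrumented with the `Limits` class (a sketch; the counters are cumulative per transaction, so taking a snapshot before and after the insert yields deltas that include whatever the process consumes):

```apex
// Snapshot limit counters before the insert (run as Anonymous Apex).
Integer dmlBefore  = Limits.getDmlStatements();
Integer rowsBefore = Limits.getDmlRows();
Integer soqlBefore = Limits.getQueries();

List<Account> accounts = new List<Account>();
for (Integer i = 0; i < 200; i++) {
    accounts.add(new Account(Name = 'Account ' + i));
}
insert accounts;

// The deltas cover the insert itself plus anything the process fired.
System.debug('DML statements: ' + (Limits.getDmlStatements() - dmlBefore));
System.debug('DML rows: '       + (Limits.getDmlRows()       - rowsBefore));
System.debug('SOQL queries: '   + (Limits.getQueries()       - soqlBefore));
```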

The logs show the following:

[Screenshot: debug log limit usage]

The initial insert uses 1 DML statement and 200 DML rows; the rest is caused by the process. 1 SOQL query and 200 query rows are in line with what's expected. However, that leaves 2 DML statements and 200 DML rows unaccounted for.

The execution log is showing the following:

[Screenshot: execution log]

At the very end, a DML statement is consumed with no additional rows.

I then ran the same test inserting 199 accounts: the process consumed 1 DML statement and 199 DML rows. Testing with 201 accounts, I got 3 DML statements and 201 DML rows. I see similar results around the 400, 600, … thresholds.

To me, it looks like, on reaching the 200-record mark, Salesforce already enqueues the next chunk of records, which consumes a DML statement, without checking whether there really are more records to process; then, if there are any, that is one more DML statement. Can anyone shed some light on this?

Answer

Your assumption is correct. Salesforce processes records in batches of 200. There are a few other situations you should pay attention to:

  1. If you push 201 records, it will consume two DML statements.
  2. If you push a List<SObject> containing multiple object types, for example Accounts and Contacts at the same time, a new batch starts each time a different object type is encountered in the list. So remember to sort the list by type and take that into account when counting limits. For example, a 100-record list of Contacts and Accounts arranged so the types alternate (Acc1, Con1, Acc2, Con2, …) will hit limits as well, because each type switch starts a new batch.
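
Point 2 can be sketched as follows (the records are hypothetical; the commented chunk counts assume one chunk per contiguous run of a single sObject type):

```apex
// Interleaved types: each type switch starts a new chunk,
// so this ordering is the worst case.
List<SObject> mixed = new List<SObject>{
    new Account(Name = 'Acc1'), new Contact(LastName = 'Con1'),
    new Account(Name = 'Acc2'), new Contact(LastName = 'Con2')
};
// insert mixed; // 4 chunks: Account, Contact, Account, Contact

// Grouping records by type first keeps each type contiguous,
// so the same data needs only one chunk per type.
List<Account> accs = new List<Account>();
List<Contact> cons = new List<Contact>();
for (SObject s : mixed) {
    if (s instanceof Account) { accs.add((Account) s); }
    else                      { cons.add((Contact) s); }
}
List<SObject> grouped = new List<SObject>();
grouped.addAll(accs);
grouped.addAll(cons);
// insert grouped; // 2 chunks: Account, Contact
```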

https://help.salesforce.com/articleView?id=process_limits.htm&type=5 <- here we can read that each Update Records action in a process consumes 1 DML statement and 1 SOQL query.

Sorry for missing a good point in my previous comment. To follow it step by step:
1. You insert all 200 accounts (DML rows should be 200 and DML statements 1).
2. Then, for each batch, the process triggers and updates them (DML rows should be 400 and DML statements 2).
3. My guess is that the process then triggers again: it performs the update, but since there are no changes, it just consumes the DML statement and nothing else. Could you add a condition so the process does not fire when the field it updates already holds the new value?
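
For step 3, one way to express such a criteria in Process Builder (the field name and value here are hypothetical) is a formula that is only true while the field does not yet hold the value the process writes, so the process stops firing once the update has been applied:

```
[Account].My_Field__c <> 'desired value'
```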

Attribution
Source: Link, Question Author: Robin De Bondt, Answer Author: Marcin Trofiniak
