Which one is better: Loop with SOQL or create a List then Loop?

Some of the code I read in triggers is like this:

for(Account a : [SELECT Id FROM Account WHERE Id IN :mapAccount.keySet()]) {

and some is like this:

List<Account> listAccount = [SELECT Id FROM Account WHERE Id IN :mapAccount.keySet()];

for(Account a : listAccount) {

Are these the same from a performance point of view? Which one is better?


Both forms are tools, with preferred use cases. You wouldn’t prefer to use a hammer when a screwdriver is better, nor would you prefer a screwdriver when a hammer is better. Generally speaking, prefer the SOQL for loop when memory is at a premium, and the list assignment form when CPU time is at a premium.
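
As a quick sketch of the two forms that guideline refers to (the Contact query here is just an illustration, not from the original question):

```apex
// SOQL for loop: rows are fetched in chunks as the loop runs,
// so peak heap stays low (good when memory is at a premium).
for(Contact c : [SELECT Email FROM Contact WHERE Email != null]) {
    System.debug(c.Email);
}

// List assignment: all rows are materialized up front, which costs
// heap but avoids the per-chunk iterator overhead of the form above
// (good when CPU time is at a premium).
List<Contact> contacts = [SELECT Email FROM Contact WHERE Email != null];
for(Contact c : contacts) {
    System.debug(c.Email);
}
```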

Let me give you a concrete example. Let’s say you want to encrypt a bunch of attachments. Your first attempt might look like this:

Attachment[] records = [SELECT Body FROM Attachment];
for(Attachment record : records) {
    record.Body = encrypt(record.Body);
}
update records;

This works if you have less than 6 MB of files to process at once. However, if a user uploads three 5 MB files, you'll crash, even if you wrote a Batchable class and use a scope size larger than 2.
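
For reference, you can inspect heap consumption against the context's ceiling at run time using the Limits class (a minimal sketch; the 6 MB figure is the synchronous limit, while asynchronous contexts such as batch Apex get 12 MB):

```apex
// Compare current heap usage against this execution context's heap ceiling.
System.debug('Heap used:  ' + Limits.getHeapSize());      // bytes currently allocated
System.debug('Heap limit: ' + Limits.getLimitHeapSize()); // 6 MB sync, 12 MB async
```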

However, in this rare case, a DML-inside-loop approach exists that would theoretically let you handle up to Limits.getLimitDmlStatements() records:

for(Attachment record : [SELECT Body FROM Attachment LIMIT :Limits.getLimitDmlStatements()]) {
    record.Body = encrypt(record.Body); // Implementation not important
    update record;
}

This works because each record is loaded in memory only as needed by the iterator (lazy loading).
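
Related to that lazy loading: behind the scenes the SOQL for loop retrieves records in batches of 200, and Apex also offers a list-typed variant of the loop that hands you each 200-record chunk directly, so you can issue one DML statement per chunk instead of one per record. A sketch (encrypt is the hypothetical helper from the example above):

```apex
for(Attachment[] chunk : [SELECT Body FROM Attachment]) {
    for(Attachment record : chunk) {
        record.Body = encrypt(record.Body); // hypothetical helper, as above
    }
    update chunk; // one DML statement per 200-record chunk
}
```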

A second case also exists that I’ve found. You can search through the results of a query and abort early, and you’ll actually use fewer governor limits than you would normally. I don’t believe I’ve ever found a use case for this in business logic, but it may be worth knowing. For example:

for(Account record : [SELECT Name FROM Account ORDER BY Name ASC LIMIT 10000]) {
    if(record.Name == 'Breaker') {
        break;
    }
}
The debug logs read a bit confusingly: first you'll see that 10,000 rows were returned, but a smaller value is actually applied towards the cumulative limits (currently, multiples of 200). Here's an example from one of my dev orgs:

Execute Anonymous: for(account record:[select name from account]) break;

As you can see, I returned 614 rows, but I aborted early and was only charged for 200 rows. This isn’t something you can do with the list assignment query: you either get all the rows or you blow a governor limit.
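
You can observe this accounting yourself with Limits.getQueryRows() (a sketch; it assumes the org holds more than 200 Account rows, so the charged amount is visibly smaller than the LIMIT):

```apex
Integer before = Limits.getQueryRows();
for(Account record : [SELECT Name FROM Account LIMIT 1000]) {
    break; // abort on the very first record
}
// Charged in multiples of 200, not the full 1,000 the query could return.
System.debug('Rows charged: ' + (Limits.getQueryRows() - before));
```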

Conversely, let’s say you need to update 10,000 accounts quickly for some reason. Using the SOQL loop runs a greater risk of blowing the CPU limit because of the overhead of adding records to a list:

// No memory savings, but more CPU time is used
Account[] updates = new Account[0];
for(Account record : [SELECT Name FROM Account LIMIT 10000]) {
    record.Name = record.Name.toUpperCase();
    updates.add(record);
}
update updates;

// No memory savings, but the code runs faster, and is simpler
Account[] updates = [SELECT Name FROM Account LIMIT 10000];
for(Account record : updates) {
    record.Name = record.Name.toUpperCase();
}
update updates;

Keep in mind that this difference is really only visible in pages that update a ton of data, and in triggers when an organization is using a lot of API calls/imports/batches.

So, in conclusion, I’ll say this: the difference usually doesn’t matter, except that one form or the other tends to be clearer to read, depending on what you’re doing. Just use the following guideline: if you need to use the records from the query outside of the loop they’re being processed in, use the list assignment query; otherwise, use the SOQL for loop. If you do proper unit testing, you’ll find times when there are exceptions to this guideline.
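
To illustrate the guideline: when the records must survive past the loop, say, to be passed along to another method, the list assignment form is the natural fit (sendSummary is a hypothetical helper, not part of the original answer):

```apex
// Records are needed after the loop, so keep them in a list.
List<Opportunity> opps = [SELECT Amount FROM Opportunity WHERE IsClosed = false];
Decimal total = 0;
for(Opportunity o : opps) {
    total += (o.Amount == null ? 0 : o.Amount);
}
sendSummary(opps, total); // hypothetical method that reuses the same records
```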

Source: Link, Question Author: unidha, Answer Author: sfdcfox
