Limitations of lookup rollup summary techniques?

I have an existing trigger that updates parent/grandparent/great-grandparent records with several totals when a child record is updated. The object relationship looks like this:

Site
 Grade
  Classroom
   Student

All objects are related by lookup. Changes to a Student’s status cause totals at all levels above Student to change. The existing trigger is not reliable. I’d like to replace it with something more maintainable (and correct).

Leading contenders for replacement are:

  • Andrew Fawcett’s “Declarative Lookup Rollup Summaries” tool.
  • A Process Builder and Flow based rollup.
  • Some sort of scheduled job that runs multiple times per day.

I should note that as many as 3,000 student status changes could happen at (almost) the same time. These bulk updates are generally done with DemandTools.

Is this volume of changes likely to break either of the first two approaches? Andrew Fawcett’s summary tool is described as being able to handle 50,000 child relation records. Some limits of Process Builder/Flow are documented, but I’m not certain whether any apply to my situation.

Added:

I’m adding a bounty because I’d like to get a better idea of the limits of the options described and the likely failure modes. So what I’d like to know is:

  • Will “Declarative Lookup Rollup Summaries” handle an update of 3,000 records with summaries at multiple levels without issues? What’s the largest number of records you’ve successfully updated at once in conjunction with this tool?
  • How many updates can a Process Builder/Flow based summary support? 50? 500? 1,000? It has been suggested that this approach will almost certainly break at the volume I’m discussing, but it’s not clear where the limits are.
  • How often can one safely update the totals with a scheduled job? In total, there might be 20,000–30,000 records to summarize. Hourly? Every half hour? More frequently?

Answer

If you do not need “real time” updates, your best solution would be a batch process that cycles through the parent records and summarizes their children.
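A minimal sketch of that batch approach for a single level of the hierarchy, assuming custom objects named Classroom__c and Student__c, a Status__c picklist on Student, and a number field Active_Student_Count__c on Classroom (all of these API names are illustrative, not taken from the question):

```
public class ClassroomRollupBatch implements Database.Batchable<SObject>, Schedulable {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Cycle through every parent record; if you track modification
        // times, this could be narrowed to recently touched classrooms.
        return Database.getQueryLocator('SELECT Id FROM Classroom__c');
    }

    public void execute(Database.BatchableContext bc, List<Classroom__c> scope) {
        Map<Id, Classroom__c> toUpdate = new Map<Id, Classroom__c>();

        // One aggregate query per batch chunk keeps the rollup far
        // below the query-row and CPU governor limits.
        for (AggregateResult ar : [
                SELECT Classroom__c parentId, COUNT(Id) cnt
                FROM Student__c
                WHERE Classroom__c IN :scope AND Status__c = 'Active'
                GROUP BY Classroom__c]) {
            Id parentId = (Id) ar.get('parentId');
            Integer cnt = (Integer) ar.get('cnt');
            toUpdate.put(parentId,
                new Classroom__c(Id = parentId, Active_Student_Count__c = cnt));
        }

        // Parents with no matching children get reset to zero.
        for (Classroom__c c : scope) {
            if (!toUpdate.containsKey(c.Id)) {
                toUpdate.put(c.Id,
                    new Classroom__c(Id = c.Id, Active_Student_Count__c = 0));
            }
        }
        update toUpdate.values();
    }

    public void finish(Database.BatchableContext bc) {
        // Chain the next level here (a Grade batch summarizing Classroom,
        // then a Site batch summarizing Grade), so each level rolls up
        // from freshly computed child totals.
    }

    // Schedulable entry point, so the same class can run on a timer.
    public void execute(SchedulableContext sc) {
        Database.executeBatch(new ClassroomRollupBatch(), 200);
    }
}
```

At 20,000–30,000 records, an hourly cadence is a plausible starting point; scheduling it would look like:

```
// Cron fields: seconds minutes hours day-of-month month day-of-week
System.schedule('Hourly rollup', '0 0 * * * ?', new ClassroomRollupBatch());
```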

When you attempt to do this synchronously with triggers, you increase the time it takes to load the data and potentially introduce other problems.

Those other problems include CPU timeouts, “Too many query rows” limit errors, and so on.

Andrew’s technique is one of the better ones, but how well it performs is also a function of the rest of your environment and whatever other code runs alongside it. You can also run into multiple trigger executions when inserts cause updates and those updates cause further updates.
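One common way to tame those re-entrant executions is a static guard that remembers which records have already been rolled up in the current transaction (Apex statics reset between transactions). A sketch, again using the hypothetical Student__c object:

```
public class StudentRollupGuard {
    // Ids already rolled up in this transaction. A simple Boolean flag
    // would wrongly skip the later 200-record chunks of a large bulk
    // load, so individual Ids are tracked instead.
    private static Set<Id> processedIds = new Set<Id>();

    // Returns only the records not yet rolled up in this transaction.
    public static List<Student__c> unprocessed(List<Student__c> records) {
        List<Student__c> fresh = new List<Student__c>();
        for (Student__c s : records) {
            if (!processedIds.contains(s.Id)) {
                processedIds.add(s.Id);
                fresh.add(s);
            }
        }
        return fresh;
    }
}
```

In the Student trigger, summarize only StudentRollupGuard.unprocessed(Trigger.new), so the updates the rollup itself issues do not set off another full pass.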

Process Builder is not ready for this type of work and will lead to premature baldness; do not even try it.

This is a difficult question to answer because the choices are highly dependent on your org and its existing processes and code base.

Attribution
Source: Link, Question Author: Jagular, Answer Author: Eric
