How to debug hitting the CPU time limit

We have a very slow save happening in Salesforce. Most of the time it succeeds, so it is probably just under the CPU limit. If I enable debugging, the overhead of logging pushes it over the CPU limit every time. I opened a case with Salesforce and was told this is by design.

I’ve tried to work around this and collect logs anyway, looking for slow triggers, loops, etc. The problem I run into is that when I set my log levels to a sufficient level of detail (DEBUG), my logs get truncated and I can’t see the entire picture. In the Developer Console, I only get about 3 seconds’ worth of Apex execution; the rest is truncated.

Does anyone have any advice on the proper log levels to find the loops or slow-running processes? I’ve tried trial and error: I deactivated all processes, workflow field updates, and custom triggers, and I still hit the CPU limit because of a number of managed packages in the org. I can’t see the code in those managed packages, but I suspect they are interacting with each other and causing triggers to fire multiple times.

Answer

After a debug log has been generated (for the CPU time error), open that log in the Developer Console, then choose Debug -> Switch Perspective -> Analysis.
The Analysis perspective breaks the transaction down into a timeline and execution tree, which will help you understand which block of code consumes most of the CPU time.
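
If the log truncates before the interesting part, one workaround (a sketch of mine, not from the original answer) is to dial the Apex Code log level down to ERROR so almost nothing is written, and then emit your own CPU checkpoints at ERROR level from triggers you control. The trigger and object names here are hypothetical; the technique is just Limits.getCpuTime() plus an ERROR-level System.debug():

trigger AccountCpuProbe on Account (before update, after update) {
    // With the Apex Code log level set to ERROR, only ERROR-level debugs are
    // written, so the log stays small even on a long-running save.
    System.debug(LoggingLevel.ERROR,
        'CPU checkpoint ' + String.valueOf(Trigger.operationType) + ': ' +
        Limits.getCpuTime() + ' of ' + Limits.getLimitCpuTime() + ' ms');
}

Comparing consecutive checkpoint values shows roughly how much CPU time the managed-package triggers and automations running between them consumed, even though their code is invisible in the log.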

And if you want to see the debug output for a specific block, add your System.debug() statements there and follow them with a System.assert(false, ...) call.
All the debug statements around the System.assert() call will then show up in your debug log.
For example:

// Log the cumulative CPU time consumed so far in this transaction
System.debug('***getCpuTime()*** ' + Limits.getCpuTime());
// Fail deliberately so execution (and the log) stops right here
System.assert(false, 'Show debug');
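
This works because a failed System.assert() throws an uncatchable exception that halts the transaction on the spot, so the log ends right at your checkpoint instead of being truncated somewhere before it. Move the debug/assert pair progressively later through the save sequence until the reported getCpuTime() value jumps; the code running between two checkpoints with a large jump is your hotspot. Note that the failed assertion rolls back the whole save, so only do this in a sandbox.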

Attribution
Source: Link, Question Author: Daniel Hoechst, Answer Author: Reshma
