Does data referenced from the Platform Cache consume heap space?

Reading Platform Cache Limits, it appears that an individual request can obtain up to 1 MB of data from the Platform Cache:

Maximum local cache size for a partition, per-request 1,000 KB

Local cache is the application server’s in-memory container that the client interacts with during a request.

Does that 1 MB count against the 6 MB heap limit?

(My guess is yes, and yes I can write a test to confirm this; looking for a quick answer though if anyone else has explored this.)

Answer

When the data is loaded into memory, it consumes heap.

I uploaded a book (around 1.5 MB) to my org as an attachment, then split it into parts of 100 KB or less and stored them in the org’s cache. It turns out the heap is consumed while the file is held in memory, but putting the chunks into the cache doesn’t consume any more of it.

// Fill the cache with a ~1.5 MB book stored as an Attachment
System.debug(Limits.getHeapSize());
Attachment a = [SELECT Id, Body, BodyLength FROM Attachment LIMIT 1];

String bodystr = EncodingUtil.base64Encode(a.Body);
System.debug(Limits.getHeapSize());

// Size the chunks from the Base64 string itself -- it is ~33% longer than
// BodyLength, so computing partsize from BodyLength would drop the tail
Integer parts = 100;
Integer partsize = bodystr.length() / parts;

for (Integer i = 0; i < parts; i++) {
    Integer start = i * partsize;
    // the last chunk absorbs the division remainder
    Integer endx = (i == parts - 1) ? bodystr.length() : (i + 1) * partsize;
    Cache.Org.put('local.LimitTest.bigString' + i, bodystr.substring(start, endx));
}
System.debug(Limits.getHeapSize());
System.debug(Limits.getHeapSize());

18:53:45:002 USER_DEBUG [2]|DEBUG|1064

18:53:45:202 USER_DEBUG [9]|DEBUG|8256123 (+8255059)

18:53:45:574 USER_DEBUG [15]|DEBUG|8256667 (+544)

Notice how on the third call (after the loop) the increase in heap usage is minimal. Presumably Salesforce references the cached values internally rather than copying them onto the heap, or a put simply doesn’t count against the heap because you are only writing data, not reading it. Either way…

To retrieve it (in a separate transaction, of course):

// Retrieve it in chunks (separate transaction, so redeclare the chunk count)
System.debug(Limits.getHeapSize());
Integer parts = 100;
List<Object> pieces = new List<Object>();
for (Integer i = 0; i < parts; i++) {
    Object part = Cache.Org.get('local.LimitTest.bigString' + i);
    pieces.add(part);
}
System.debug(Limits.getHeapSize());

19:06:19:002 USER_DEBUG [19]|DEBUG|1067

18:50:54:426 USER_DEBUG [22]|DEBUG|4707383

When the file parts are read back into memory, the heap is consumed again. I believe that answers the question.


Why did you split the file?

If you try to store more than 100 KB in a single put call, you get a cache.ItemSizeLimitExceededException:

cache.ItemSizeLimitExceededException: Value exceeded maximum size limit (100KB): 107.87 KB, 107.9% of limit.
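Because of that per-item limit, any large string has to be chunked before caching and reassembled on the way out. A minimal sketch of a reusable helper, wrapping the same split-and-join approach used above (the class name `ChunkedCache` and its methods are hypothetical, and it assumes an org cache partition such as `local.LimitTest` in the key prefix):

```apex
// Hypothetical helper: splits a large string into sub-100 KB chunks
// for Platform Cache and joins them back on retrieval.
public class ChunkedCache {
    // Stay safely under the 100 KB per-item limit
    private static final Integer CHUNK_SIZE = 90 * 1024;

    public static void putChunked(String keyPrefix, String value) {
        Integer count = 0;
        for (Integer start = 0; start < value.length(); start += CHUNK_SIZE) {
            Integer endx = Math.min(start + CHUNK_SIZE, value.length());
            Cache.Org.put(keyPrefix + count, value.substring(start, endx));
            count++;
        }
        // Store the chunk count so the reader knows when to stop
        Cache.Org.put(keyPrefix + 'count', count);
    }

    public static String getChunked(String keyPrefix) {
        Integer count = (Integer) Cache.Org.get(keyPrefix + 'count');
        if (count == null) return null;
        List<String> pieces = new List<String>();
        for (Integer i = 0; i < count; i++) {
            pieces.add((String) Cache.Org.get(keyPrefix + i));
        }
        return String.join(pieces, '');
    }
}
```

Note that Platform Cache key names must be alphanumeric, so the chunk index and the `count` suffix are appended directly to the prefix rather than separated by punctuation.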

Attribution
Source: Link, Question Author: Keith C, Answer Author: Renato Oliveira
