How can I avoid Sites Limits?

There are limitations on bandwidth and processing time. Has anyone run up against these limits? How do people mitigate the likelihood of hitting them?

For the bandwidth limits, I suppose you could host images and other static files (non-JavaScript, of course), especially larger ones, on a non-Salesforce server. While possible, that isn't ideal, because then you have to pay for and maintain something else, and Salesforce is no longer the one-stop shop. (If it isn't in Salesforce, it doesn't exist, right?)
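As an illustration, offloading a large static asset just means referencing its external URL from the Visualforce page so the bytes never flow through Salesforce. This is a minimal sketch; the CDN domain and file path are hypothetical:

```xml
<apex:page>
    <!-- Served from an external host (hypothetical URL), so the image
         bytes do not count against the Sites bandwidth limit -->
    <img src="https://cdn.example.com/assets/hero-banner.png" alt="Hero banner"/>
</apex:page>
```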

For the processing time limits, caching can be used in some circumstances, but what if the pages are highly dynamic and involve a lot of user interaction? I don't want to optimize prematurely (it's the root of all evil, after all).
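For the pages that are cacheable, Visualforce supports declarative page caching through the `cache` and `expires` attributes on `<apex:page>`, which applies to Sites pages served to guest users. A minimal sketch:

```xml
<apex:page cache="true" expires="600">
    <!-- Rendered output is cached at the proxy layer for 600 seconds,
         so repeat requests skip server-side processing entirely -->
    <h1>Product Catalog</h1>
</apex:page>
```

This only helps for content that can safely be stale for the cache window, which is exactly why it falls short for highly interactive pages.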

While I do my best to avoid hitting them, these Salesforce limits on public-facing websites keep me up at night. 😉


I have hit those limits in the past due to the nature of the things we were trying to implement. Luckily, they are soft limits: a rep will get in touch to talk about increasing bandwidth for a fee. If you think you're going to get close, hedge your bets by letting the client know they may need to purchase extra capacity if their usage grows over time.

Source: Link, Question Author: Peter Knolle, Answer Author: ebt