I’m working on an org that will have around 25 million opportunities once data migration is complete. I’m trying to anticipate the issues we will face running batch processes against this data volume. We need to run a batch process that will select around 250k opportunities for processing. The query criteria filter on three fields:
- an indexed custom text field with a value that will identify around 400k records
- records where IsWon = true
- a picklist value (payment method) which will have low cardinality
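For concreteness, the query would look something like the sketch below. The custom field and picklist names are hypothetical placeholders, not our real API names:

```
SELECT Id
FROM Opportunity
WHERE Migration_Key__c = 'BATCH_A'
  AND IsWon = true
  AND Payment_Method__c = 'Card'
```

Here `Migration_Key__c` stands in for the indexed custom text field (~400k matches) and `Payment_Method__c` for the low-cardinality picklist.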
My understanding is that no index will be used in this scenario, because the number of records matched by each individual filter will exceed the selectivity threshold above which the query optimizer stops considering that index.
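To make the threshold concern concrete, here is a sketch of the selectivity formulas as I understand them from Salesforce's query optimization guidance (figures worth verifying against the current docs): a custom index is considered selective up to 10% of the first million records plus 5% of the remainder, capped at 333,333; a standard index up to 30% plus 15%, capped at 1,000,000.

```python
def custom_index_threshold(total_records: int) -> int:
    # 10% of the first 1M records + 5% of records beyond 1M, capped at 333,333
    first = min(total_records, 1_000_000)
    rest = max(total_records - 1_000_000, 0)
    return min(int(0.10 * first + 0.05 * rest), 333_333)

def standard_index_threshold(total_records: int) -> int:
    # 30% of the first 1M records + 15% beyond, capped at 1,000,000
    first = min(total_records, 1_000_000)
    rest = max(total_records - 1_000_000, 0)
    return min(int(0.30 * first + 0.15 * rest), 1_000_000)

# At 25M opportunities both caps apply, so the custom text field's
# ~400k matches already exceed the 333,333 custom-index ceiling.
print(custom_index_threshold(25_000_000))    # 333333
print(standard_index_threshold(25_000_000))  # 1000000
```

This is why I expect the ~400k-record custom field filter to be treated as non-selective regardless of org size beyond a few million rows.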
We have looked at options for increasing the selectivity of the criteria but can’t see an obvious way to do so while still meeting the customer requirement.
I have two questions:
- is this scenario a candidate for a skinny table?
- will a skinny table use indexing in a way that standard Salesforce indexes won’t? Does the skinny table in effect become a multi-field index, so that a query whose indexed criteria match more than 333k records can still use the index?