Governor Grappling
Sooner or later (okay, sooner rather than later) when working in Apex we will need to grapple with Apex Governor Limits.
Because Apex runs in a multitenant environment, the Apex runtime engine strictly enforces limits to ensure that runaway Apex doesn’t monopolize shared resources.
This is a good thing – it means that a bad tenant (who thinks that bulkification is a dietary term and object orientation refers to feng shui for the desktop) doesn’t affect the good tenants – that’s you and me.
Making Light Work of Heavy Lifting
Sometimes though, the limits may be too restrictive – when we have some heavy lifting to do. Here Asynchronous Apex comes to the rescue, making light work of heavy jobs.
As the name implies, Asynchronous Apex means that the work is not done immediately / synchronously, as it is in the normal interactive context for your users. There’s a compromise to be made – you accept that you may have to wait a little for the work to be done, which makes it easier for Salesforce to manage demand on resources, and in return Salesforce gives you higher governor limits, as the resources are easier for the platform to allocate. Currently with Async Apex the governor benefits are:
- x6 CPU time
- x2 SOQL queries
- x2 heap size
Now, because the work isn’t done immediately, you need to put some more thought into what happens when the work is finished. With synchronous Apex you can provide immediate feedback to your user – for example if you’re working in a Visualforce page or a Lightning Component. But with async Apex the execution has split off from the user’s path and there’s no direct way of providing them with a response.
You can of course send user notifications by email or Chatter, or otherwise provide some means in your application to check back on the work being done in the async process.
Future Methods
The future method was the first means provided by Salesforce to do Asynchronous Apex. These methods are really simple to use – annotate a static method in Apex, and when you call it, the method returns immediately, but the work will be done at some point in the not too distant future:
public with sharing class FutureClass
{
    @future(callout=true)
    public static void myMethod(String s, Integer i)
    {
        System.debug('Processing primitive variables ' + s + ' and ' + i + ' asynchronously');
        // do your stuff
    }
}
// usage:
// FutureClass.myMethod('foo',1);
We get the higher async governor limits within the execution of the static method, but there are some limitations / considerations:
- we cannot directly monitor the status of future method execution once we’ve fired it off – we’d have to look for the effects of the method on our data (but what if it fails? – there would be no effects – cue tumbleweed…)
- we cannot be sure of the order that multiple future methods are executed
- parameters can only be primitive types (or collections of primitives)
- there is a limit of 50 future method calls in a single Apex execution
- recommended best practice for future methods is that methods should be designed to execute quickly; therefore if you need to make a (potentially slow) HTTP callout from a future method, you need to declare it in the annotation with “callout=true”
Future Method Documentation
Batch Apex
Batch Apex was provided to handle situations where a similar operation needs to be executed iteratively many times, most typically an operation to be performed on a large set of records (up to 50M records).
A batch job has three parts:
- start by defining the overall scope of the job – typically (but not exclusively) using a SOQL query to select your records
- execute a method repeatedly; each iteration handles a batch of the work that was scoped in the start (again, typically a batch will be a list of sObjects)
- finish by doing any work you need to be done after the last batch
public class MyBatch implements Database.Batchable<sObject>
{
    public Database.QueryLocator start(Database.BatchableContext BC)
    {
        return Database.getQueryLocator('select MyField__c from MySObject__c limit 50000000');
    }
    public void execute(Database.BatchableContext BC, List<sObject> scope)
    {
        for(sObject s : scope)
        {
            // do something to each record
        }
        update scope; // update the records
    }
    public void finish(Database.BatchableContext BC) {}
}
// Usage:
// Integer batchSize = 2000;
// ID batchprocessid = Database.executeBatch(new MyBatch(), batchSize);
You fill out the three methods required in a Batch job, and the platform will call them for you, once you’ve submitted the job with the executeBatch call.
Each batch can process between 1 and 2,000 items of work (you choose). The governor limits of course are applied to each batch – that is to a single execute method (rather than to each item in the batch).
You might therefore think that a batch size of 1 would be perfect, as you get the highest governor limits per item of work, but you should avoid really small batch sizes – as the overall job will take much longer to process owing to the “overhead” in starting each batch.
We can also programmatically monitor a batch job while it is running, and abort it if we need to:
Integer batchSize = 2000;
ID batchprocessid = Database.executeBatch(new MyBatch(), batchSize);
AsyncApexJob aaj = [SELECT Id, Status, JobItemsProcessed, TotalJobItems, NumberOfErrors FROM AsyncApexJob WHERE Id = :batchprocessid];
Boolean itMakesSense = false;
// make a decision
if(itMakesSense)
{
    System.abortJob(aaj.Id);
}
Whilst batch Apex is most often used to process many sObject records via a SOQL query, it can also be used with an Iterable class so that you can design how you want to define the job and divide it into batches. In this case there’s no batch size limit, but you will likely be restricted by what can be accomplished within the start method (in which you construct your Iterable).
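As a rough sketch of the Iterable flavour – reusing the placeholder MySObject__c and MyField__c names from above, and assuming MyField__c is a text field – the start method returns an Iterable<sObject> instead of a QueryLocator:
public class MyIterableBatch implements Database.Batchable<sObject>
{
    public Iterable<sObject> start(Database.BatchableContext BC)
    {
        // an in-memory List<sObject> is a valid Iterable<sObject>; because this is an
        // ordinary SOQL query (not a QueryLocator) it is subject to the normal row limits
        return [SELECT Id, MyField__c FROM MySObject__c WHERE MyField__c = null];
    }
    public void execute(Database.BatchableContext BC, List<sObject> scope)
    {
        for(sObject s : scope)
        {
            s.put('MyField__c', 'processed'); // do something to each record
        }
        update scope;
    }
    public void finish(Database.BatchableContext BC) {}
}
// Usage:
// ID batchprocessid = Database.executeBatch(new MyIterableBatch(), 200);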
Batch Chaining
But what if 50M records isn’t enough? Or, you have a composite process in mind where you need to first process Opportunities and then Accounts? We will want to create several batch jobs and chain them together, so that one starts on completion of another – by calling Database.executeBatch within the finish method of a batch job.
When batch Apex was first made available we had to cheat to achieve this – we couldn’t do it directly. We would use the Schedulable interface and call System.schedule to kick off the next batch process, say in 5 minutes time.
Today however, we don’t need this workaround; we can chain jobs together by calling Database.executeBatch for one job in the finish method of another.
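A minimal sketch of chaining – OpportunityBatch and AccountBatch here are hypothetical classes, each implementing Database.Batchable<sObject>:
public class OpportunityBatch implements Database.Batchable<sObject>
{
    public Database.QueryLocator start(Database.BatchableContext BC)
    {
        return Database.getQueryLocator('SELECT Id, Amount FROM Opportunity');
    }
    public void execute(Database.BatchableContext BC, List<sObject> scope)
    {
        // process the Opportunities in this batch
    }
    public void finish(Database.BatchableContext BC)
    {
        // chain the next job – AccountBatch only starts once this job has completed
        Database.executeBatch(new AccountBatch());
    }
}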
Whoah there…
The main limitation of batch Apex is that there can be only 5 concurrent batch jobs running in an organisation; this should be a major consideration when developing batch processes as this limit applies across all users and applications.
Given this, we might infer that batch Apex is provided primarily for administrative processes rather than end-user processes, but it is very common to use Batch Apex for routine rather than admin processes.
Batch Apex Examples
Batch Apex Documentation
Queueable Apex – Winter 15
Queueable is the future of future (future^2 ?) and was introduced in the Winter 15 release. Its implementation looks a lot more like batch Apex (or Schedulable) than future methods:
public class MyQueueable implements Queueable, Database.AllowsCallouts
{
    Account newAccount;

    public MyQueueable(Account a)
    {
        newAccount = a;
    }

    public void execute(QueueableContext context)
    {
        insert newAccount;
    }
}
// Usage:
// ID jobID = System.enqueueJob(new MyQueueable(new Account(Name='Foo')));
// system.debug([SELECT Status,NumberOfErrors FROM AsyncApexJob WHERE Id=:jobID]);
The key features of Queueable Apex are:
- unlike future methods, execution can be monitored: when a job is submitted, an ID is returned to you which you can use to query the AsyncApexJob table in the same way as batch Apex
- future methods were static and took only primitive arguments, but a queueable Apex implementation can effectively pass in complex types like sObjects and custom Apex types (an Account in the above example)
- in common with future methods, there is a limit of 50 executions enqueued within a single Apex execution
- as of Spring 15, queueable executions can be chained together (although in DE orgs the chain limit is 5 jobs) – see the sketch after this list
- although not documented at the time of writing, as with future methods and batch Apex we must declare when we intend to make HTTP callouts – with Database.AllowsCallouts (same syntax as batch Apex)
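As a minimal sketch of queueable chaining (FirstQueueable and SecondQueueable are hypothetical classes), the running job enqueues its successor from within its execute method:
public class FirstQueueable implements Queueable
{
    public void execute(QueueableContext context)
    {
        // ... do this job's work ...
        // then chain the next job in the sequence
        System.enqueueJob(new SecondQueueable());
    }
}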
Queueable Apex Documentation
The New Apex Queueable Interface (Josh Kaplan)
Winter 15 Release Notes
FlexQueue – Spring 15
Hot on the heels of queueable Apex, in the Spring 15 release we have the FlexQueue. At the time of writing, this feature can be seen purely as an enhancement to batch Apex – but at some point in the future we should see queueable Apex join the FlexQueue (see Josh Kaplan’s blog post, linked below).
The FlexQueue provides you with a backlog of up to 100 batch Apex processes, in addition to the 5 “live” concurrent batch Apex jobs.
Say you have 5 batch jobs preparing or processing and nothing in the FlexQueue: you can submit 100 new batch jobs and they will go straight into the FlexQueue – previously this would have caused an error.
When jobs are picked from the FlexQueue to be processed, this frees up space in the FlexQueue for further jobs to be submitted.
Jobs in the FlexQueue can be seen in the existing Apex Jobs admin page, and from here they can also be aborted. They can also be seen programmatically when querying AsyncApexJob, and aborted programmatically in the same way as any running batch job.
Note that we should take care when aborting a job, as the same method aborts both jobs waiting in the FlexQueue and currently running batch jobs – you may set out to abort a job waiting in the FlexQueue, but by the time you make the call the job has started processing, and so you end up cancelling a part-complete running batch job.
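One way to reduce (though not eliminate) that risk is to check the job’s status just before aborting – a job still waiting in the FlexQueue reports a Status of ‘Holding’ in AsyncApexJob. A sketch, assuming jobId holds the ID returned by Database.executeBatch:
AsyncApexJob job = [SELECT Id, Status FROM AsyncApexJob WHERE Id = :jobId];
if(job.Status == 'Holding')
{
    // still waiting in the FlexQueue, so we are not cancelling part-complete work
    System.abortJob(job.Id);
}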
There’s also a new FlexQueue admin page, which can be used to change the order of jobs that aren’t yet being processed.
Flex Your Batch Apex Muscles with FlexQueue (Josh Kaplan)
Spring 15 Release Notes
FlexQueue programmatic control – Winter 16
In the Summer 15 release, a new system method was piloted for controlling the order of jobs in the FlexQueue programmatically. This mirrored the functionality already provided in the FlexQueue UI – specifying a new position for a job in the FlexQueue as an integer:
Boolean isSuccess = System.moveFlexQueueJob(jobId, positionNumber);
The weakness of this is that position numbers change as jobs are picked off the queue – so the method call may not produce the expected results. As a result, this method was not made generally available.
Instead, in the Winter 16 release we have several methods on a new FlexQueue class which provide for control over the order of items by relative rather than absolute position, which is much better:
Boolean isSuccess = FlexQueue.moveBeforeJob(jobToMoveId, jobInQueueId);
Boolean isSuccess = FlexQueue.moveAfterJob(jobToMoveId, jobInQueueId);
Boolean isSuccess = FlexQueue.moveJobToEnd(jobId);
Boolean isSuccess = FlexQueue.moveJobToFront(jobId);
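For example, assuming urgentJobId and waitingJobId hold the IDs (as returned by Database.executeBatch) of two jobs currently sitting in the FlexQueue:
// promote the urgent job so it sits immediately ahead of the other waiting job
Boolean moved = FlexQueue.moveBeforeJob(urgentJobId, waitingJobId);
// or simply send it to the front of the whole queue
Boolean promoted = FlexQueue.moveJobToFront(urgentJobId);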
Unfortunately, there is something missing: the ability to determine programmatically the current order of the FlexQueue. This severely limits our ability to build functionality to manage the ordering of jobs. However, from what I understand, this gap is due to be plugged soon.
Winter 16 Release Notes – Reorder Your Batch Jobs in the Flex Queue Programmatically
Winter 16 Release Notes – New Classes and Methods
Enhanced Futures – in some future release
One final feature worth mentioning in passing is Enhanced Futures. This is not yet generally available, but it has been in pilot since the Summer 14 release. Enhanced futures allows a specific future method to increase one (and only one) governor limit by doubling or tripling. That’s doubling/tripling the already superior async governor limits. The enhanceable governors are:
- Heap size
- CPU time
- Number of SOQL queries
- Number of DML statements
- Number of DML records
There is a fuzzy warning linked to this feature: “Running future methods with higher limits might slow down the execution of all your future methods”. I imagine we will get more detail when the feature is released, but clearly the intention is that enhanced limits should be used sparingly, when you really really need them.
Bigger Apex Limits with Enhanced Futures (Josh Kaplan)
Summer 14 Release Notes
Global Async Limit
And finally… in addition to the limits which apply to each async type, there is an overarching limit on the number of asynchronous method executions which can be made in a 24 hour period. This limit is typically 250,000 calls / 24 hours, but higher in large organisations. See Apex Governor Limits for more details.
All async methods come under this limit: future methods, execute Queueable, execute Schedulable, start Batchable, execute Batchable and finish Batchable.
Note that this limit encourages us to obtain high value from each call – rather than firing off a higher number of lower value calls. For example – avoiding small / overly pessimistic batch sizes for Batch Apex jobs.
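As a rough illustration with hypothetical numbers: processing 10 million records with a batch size of 200 consumes 50,000 execute calls against this limit, whereas a batch size of 2,000 consumes only 5,000.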
Most of the content for this post came from the Dreamforce 15 DevZone session: Apex Liberation: The evolution of Flex Queue which I had the great pleasure of presenting with Carolina Ruiz Medina. The session slides are available on SlideShare.