Best Practices and Limitations

Consider these guidelines when using BatchJobService.

Improve throughput

  • Prefer fewer, larger jobs over many smaller jobs.

  • Order uploaded operations by operation type. For example, if your job contains operations to add campaigns, ad groups, and ad group criteria, order the operations in your upload so that all of the campaign operations come first, followed by all of the ad group operations, and finally all of the ad group criterion operations.

  • Within operations of the same type, it can improve performance to group them by parent resource. For example, if you have a series of AdGroupCriterionOperation objects, it can be more efficient to group operations by ad group, rather than intermixing operations that affect ad group criteria in different ad groups; see the ordering sketch after this list.
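
A minimal sketch of that ordering, assuming a Python list of MutateOperation objects named operations; the OP_ORDER ranking and sort_key helper are illustrative, not part of the client library:

    Python

    # Rank operation types so parent resources upload before their children.
    # Each MutateOperation sets exactly one field of its "operation" oneof.
    OP_ORDER = {
        "campaign_operation": 0,
        "ad_group_operation": 1,
        "ad_group_criterion_operation": 2,
    }

    def sort_key(mutate_operation):
        op_type = mutate_operation._pb.WhichOneof("operation")
        # Group criterion operations by parent ad group so operations that
        # touch the same ad group sit next to each other in the upload.
        parent = ""
        if op_type == "ad_group_criterion_operation":
            parent = mutate_operation.ad_group_criterion_operation.create.ad_group
        return (OP_ORDER.get(op_type, len(OP_ORDER)), parent)

    # sorted() is stable, so operations keep their relative order within
    # each group.
    operations = sorted(operations, key=sort_key)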

Avoid concurrency issues

  • When submitting multiple concurrent jobs for the same account, try to reduce the likelihood of jobs operating on the same objects at the same time, while maintaining large job sizes. If many unfinished jobs (those with status RUNNING) try to mutate the same set of objects, deadlock-like conditions can arise, leading to severe slowdowns and even job failures.

  • Don't submit multiple operations that mutate the same object in the same job, as the result can be unpredictable.

Retrieve results optimally

  • Don't poll the job status too frequently or you risk hitting rate limit errors; see the sketch after this list for a conservative polling loop.

  • Don't retrieve more than 1,000 results per page. The server could return fewer than that due to load or other factors.

  • Results are returned in the same order as the operations were uploaded.
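
As a rough illustration of these points, the following sketch (Python client library; the polling delays are arbitrary choices, not API requirements) waits on the long-running operation returned by run_batch_job with a growing delay, then pages through results 1,000 at a time. It assumes an initialized GoogleAdsClient as client and the resource name of a job whose operations have already been uploaded.

    Python

    import time

    batch_job_service = client.get_service("BatchJobService")
    operation = batch_job_service.run_batch_job(
        resource_name=batch_job_resource_name
    )

    # Poll sparingly, backing off up to five minutes, to avoid rate limit
    # errors; each done() call refreshes the operation's status.
    poll_seconds = 10
    while not operation.done():
        time.sleep(poll_seconds)
        poll_seconds = min(poll_seconds * 2, 300)

    # Request at most 1,000 results per page; the server may return fewer.
    request = client.get_type("ListBatchJobResultsRequest")
    request.resource_name = batch_job_resource_name
    request.page_size = 1000
    # Results come back in the same order the operations were uploaded.
    for result in batch_job_service.list_batch_job_results(request=request):
        print(result.operation_index, result.status.message)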

Additional usage guidance

  • You can set an upper bound for how long a batch job is allowed to run before being cancelled. When creating a new batch job, set the metadata.execution_limit_seconds field to your preferred time limit, in seconds. There is no default time limit if metadata.execution_limit_seconds is not set (see the first sketch after this list).

  • It is recommended to add no more than 1,000 operations per AddBatchJobOperationsRequest and to use the sequence_token to upload the rest of the operations to the same job. Depending on the content of the operations, too many operations in a single AddBatchJobOperationsRequest could cause a REQUEST_TOO_LARGE error. You can handle this error by reducing the number of operations and retrying the AddBatchJobOperationsRequest (see the second sketch after this list).
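
For instance, with the Python client library you might create a job capped at one hour; the 3,600-second limit is an arbitrary choice, and client and customer_id are assumed to be an initialized GoogleAdsClient and the target account ID:

    Python

    batch_job_operation = client.get_type("BatchJobOperation")
    batch_job = batch_job_operation.create
    # Have the server cancel the job if it runs for more than one hour.
    batch_job.metadata.execution_limit_seconds = 3600

    batch_job_service = client.get_service("BatchJobService")
    response = batch_job_service.mutate_batch_job(
        customer_id=customer_id, operation=batch_job_operation
    )
    batch_job_resource_name = response.result.resource_name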
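
And a sketch of chunked uploads, assuming operations is the full list of MutateOperation objects and batch_job_resource_name identifies the job created above; a production version would also catch a REQUEST_TOO_LARGE error and retry with a smaller chunk:

    Python

    CHUNK_SIZE = 1000  # stay at or below 1,000 operations per request

    sequence_token = None
    for start in range(0, len(operations), CHUNK_SIZE):
        response = batch_job_service.add_batch_job_operations(
            resource_name=batch_job_resource_name,
            sequence_token=sequence_token,
            mutate_operations=operations[start : start + CHUNK_SIZE],
        )
        # Each response returns the token that must accompany the next call.
        sequence_token = response.next_sequence_token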

Limitations

  • Each BatchJob supports up to one million operations.

  • Each account can have up to 100 active or pending jobs at the same time.

  • Pending jobs older than 7 days are automatically removed.

  • Each AddBatchJobOperationsRequest has a maximum size of 10,484,504 bytes. If you exceed this, you will receive an INTERNAL_ERROR. You can determine the size of the request before submitting and take appropriate action if it is too large.

    Java

    static final int MAX_REQUEST_BYTES = 10_484_504;
    // ... (code to get the request object)
    int sizeInBytes = request.getSerializedSize();

    Python

    from google.ads.googleads.client import GoogleAdsClient

    MAX_REQUEST_BYTES = 10484504
    # ... (code to get the request object)
    size_in_bytes = request._pb.ByteSize()

    Ruby

    require 'google/ads/google_ads'

    MAX_REQUEST_BYTES = 10484504
    # ... (code to get the request object)
    size_in_bytes = request.to_proto.bytesize

    PHP

    const MAX_REQUEST_BYTES = 10484504;
    // ... (code to get the request object)
    $size_in_bytes = $request->byteSize();

    .NET

    using Google.Protobuf;

    const int MAX_REQUEST_BYTES = 10484504;
    // ... (code to get the request object)
    int sizeInBytes = request.ToByteArray().Length;

    Perl

    use Devel::Size qw(total_size);

    use constant MAX_REQUEST_BYTES => 10484504;
    # ... (code to get the request object)
    my $size_in_bytes = total_size($request);
