
Bulk Operations

In this section, we offer code and testing recommendations for bulk operations. Because bulk operations can significantly affect system performance, we describe specific methods you can use in your code and recommend performance testing in your Sandbox environment. Testing in your Sandbox significantly decreases the chance of introducing performance issues in your Production tenant after implementation.

We also provide details on bulk operation objects and methods, on script execution constraints, and on how you can request changes to those constraints.

Bulk Operations Best Practices

Workflow provides these typical bulk operations:

  • Bulk Case Creation, Update, and Delete operations
  • Bulk File Uploads


When overused, these operations can cause problems, such as performance issues, timeouts, and out-of-memory errors. 

To address these issues and protect performance, we impose the following constraints. We limit the:

  • Script execution timeout to 20 seconds by default
  • Number of instructions that can be executed in a script 
  • Number of asynchronous tasks that can be executed in loops 
  • Number of asynchronous tasks that the same script can execute

For detailed information on the above constraints, you need to open a NetSuite ticket with Customer Support.


In general, you should implement your scripts to ensure that:

  • The script executes in less than 20 seconds to avoid a timeout
  • Script execution does not create performance issues
  • The script does not execute more than the allowed number of instructions
  • The same script does not create more than five cases

For complex bulk operations, you should follow these best practices:

  • Use the bulk operation methods listed below
  • Execute asynchronous tasks, if needed
  • Do not open multiple database connections; instead, open one database connection and reuse it throughout your script
  • Do not run SQL operations in loops
  • Use a cache mechanism. For example, if you need to retrieve data from an external database for later use in your script, run one SQL query to retrieve all the needed data and store it in a variable in your script, as shown in the sketch after this list.
  • Minimize the use of logging in long scripts
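
As an illustration of the caching practice above, the following sketch runs one SQL query before the loop and keeps the results in a map, so the loop itself never touches the database. The extDb connection object, its executeQuery call, and the accounts table are placeholders for illustration only; substitute your own data source and query.

// Run a single query up front and cache the results in a map
// (extDb, executeQuery, and the accounts table are placeholders).
def accountNames = new java.util.HashMap();
def rows = extDb.executeQuery('SELECT account_number, account_name FROM accounts');

for (def row in rows) {
    accountNames.put(row.account_number, row.account_name);
}

// Later in the script, read from the in-memory cache instead of running SQL in a loop.
accountNames.each { number, name ->
    def caze = resp.cases.newCase('dispute');
    caze.name = 'Dispute for ' + name;
}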


For more information about requesting modifications to the above constraints, see Request Changes to Scripting Constraints.

Bulk Operation Methods

Workflow provides the following methods, which we highly recommend you use in scripts that perform bulk operations. You can access these methods from the resp group in the Context Info section of the Reference Tools in the Script Editor.


Bulk Operations with External Database 

int[] batchUpdate (String[])
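
The following is a minimal usage sketch, assuming a single reusable connection object (shown here as extDb) exposes the batchUpdate method above; the connection variable, the disputes table, and the UPDATE statements are placeholders for illustration only.

// Collect every statement first, then send them in one batchUpdate call
// instead of executing SQL inside the loop (extDb, disputes, and the SQL are placeholders).
def statements = new java.util.ArrayList();

for (def i = 0; i < 1000; i++) {
    statements.add("UPDATE disputes SET status = 'CLOSED' WHERE dispute_id = " + i);
}

// batchUpdate accepts a String[] and returns one update count per statement.
def counts = extDb.batchUpdate(statements as String[]);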



Bulk Operations with Custom Tables

boolean insertBulk (String, List)

boolean updateBulk (String, List)
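
The sketch below shows one way these methods might be called to write a set of rows to a custom table in a single operation rather than row by row. The table name my_custom_table, the row fields, and the call through the resp object are assumptions for illustration; confirm the exact call path and table definition in the Reference Tools of your Script Editor.

// Build all rows in memory, then write them with a single bulk call
// ('my_custom_table' and the row fields are placeholders).
def rows = new java.util.ArrayList();

for (def i = 0; i < 1000; i++) {
    def row = new java.util.HashMap();
    row.put('dispute_id', i);
    row.put('status', 'OPEN');
    rows.add(row);
}

// insertBulk and updateBulk return a boolean indicating success.
def inserted = resp.insertBulk('my_custom_table', rows);

// updateBulk follows the same pattern for rows that already exist.
def updated = resp.updateBulk('my_custom_table', rows);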



Bulk Create Cases

resp.cases.save(List of cases)

See the following code sample, in which the method is called on the last line:

// Create the document template used to generate a report for each case.
def docTemplate = resp.createDocumentTemplate('doc_template_1');
def cases = new java.util.ArrayList();

// Build the cases in memory; do not save them one at a time inside the loop.
for (def i = 0; i < 1000; i++) {
    def caze = resp.cases.newCase("dispute");
    caze.name = 'Dispute ' + i;

    // Generate a report from the template and attach it to the case.
    docTemplate.addToContext('case', caze);
    def report = docTemplate.generate();
    caze.addAttachment(report, false);

    cases.add(caze);
}

// Save all of the cases in a single bulk operation.
resp.cases.save(cases);


When new cases are created programmatically from an asynchronous task that uses the resp.cases.save() method, it is important to understand that every script configured on the Create Case action’s After Workflow Action event also executes after each case is created. If any of those scripts connects to an external source, this can become an issue for your project.

The image below shows the configuration settings for a Create Case action as described above; the scripts that will execute appear in the Events configuration panel.


Testing Bulk Operations

It is important that you run performance testing for bulk operations in your Sandbox environment during the implementation phase so you can resolve any issues prior to UAT and before you go live in Production. It is also important to test with large amounts of data, comparable to the volume that will exist in Production. Testing with a sufficient amount of data helps identify errors in your scripts and gives you a more accurate execution time. With minimal data, you may not detect certain errors or performance issues.

We recommend you conduct thorough testing on your bulk operations. Work with the Project Manager and other teams associated with your project to obtain additional data to test with if necessary.