JavaScript (ECMAScript 6) Asynchronous Calls
Here’s a basic example of how you can execute stored procedures asynchronously to insert customer test data into a database using JavaScript (ES6) features such as async/await and Promise objects.
1. Create Customer Data in an Array of Objects:
The MakeCustomers function builds an array of customer records and passes it to the runProceduresForCustomer function, which inserts the data into the database through stored procedures.
function MakeCustomers()
{
    try
    {
        // --- Build an array of static customer data objects ---
        Log.Message("Create customer data.");
        const customerDetailsArray = [];
        let numberOfCustomers = 5;
        for (let i = 1; i <= numberOfCustomers; i++) {
            customerDetailsArray.push({
                Name: i + " Scott Holmes Test " + generateRandomString(10), // call a simple helper that returns a random text string
                Address: "1 Main St",
                City: "City1",
                State: "MO",
                ZipCode: "11111",
                PhoneNumber: 1234567890,
                Email: `${Sys.UserName}@wwinc.com`,
                TaxCode: 1,
                TermsID: 2,
                CreateBy: Sys.UserName,
                CreateWhere: Sys.HostName,
                UpdateBy: Sys.UserName,
                UpdateWhere: Sys.HostName,
                ShippingZoneID: 1,
                RunNumber: 1,
                ShipViaTypeID: 1
            });
        }
        // --- The AI Data Generator could be used here with a few mods ---
        // --- Requires the IQ add-on and the Data Generator AI extension to be enabled ---
        runProceduresForCustomer(customerDetailsArray);
    }
    catch (err)
    {
        Log.Error('function MakeCustomers: ', err);
    }
}
function generateRandomString(length = 10) {
    const characters = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789 ';
    let result = '';
    for (let i = 0; i < length; i++) {
        const randomIndex = Math.floor(Math.random() * characters.length);
        result += characters[randomIndex];
    }
    return result;
}
2. Running Stored Procedures Asynchronously:
The function runProceduresForCustomer performs two main tasks:
- Creates a stored procedure connection for each customer.
- Calls execTestCustomerInsertAsync(sProc, customerDetails) to insert the data through those connections.
It is structured to handle multiple customer operations in parallel, leveraging async/await and Promise.all for efficient asynchronous execution.
// Async function to create and execute procedures concurrently
async function runProceduresForCustomer(customerDetailsArray) {
    try {
        Log.Message("Create an array of promises containing sProc connections and db insertResults.");
        // Array to hold promises
        const promises = customerDetailsArray.map(async (customerDetails) => {
            // Create a new stored procedure object for each customer
            let sProc = ADO.CreateADOStoredProc();
            sProc.ConnectionString = "Provider=MSOLEDBSQL;Data Source=ProdSQL;Initial Catalog=TestingTeam;Integrated Security=SSPI";
            sProc.CommandTimeout = 120; // Set timeout for the connection
            // Execute the TestCustomer_Insert stored procedure
            const insertResult = await execTestCustomerInsertAsync(sProc, customerDetails);
            Log.Message(insertResult);
            // Close the db connection for this sProc object
            sProc.Close();
        });
        // Wait for all customer operations to complete
        await Promise.all(promises);
        Log.Checkpoint('All customer creation and synchronization procedures completed successfully.');
    } catch (err) {
        Log.Warning('Error occurred during procedure execution:', err);
    }
}
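One caveat with Promise.all: it rejects as soon as any single insert fails, so results from the remaining promises may never be logged. If the JavaScript engine in your TestComplete version supports it, Promise.allSettled can be swapped in for the await Promise.all(promises) line above so every insert is allowed to finish before failures are reported. A minimal sketch, under that assumption:
// Wait for every insert to settle (succeed or fail), then log the failures individually.
const results = await Promise.allSettled(promises);
results
    .filter(r => r.status === "rejected")
    .forEach(r => Log.Warning("Insert failed: " + r.reason));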
The execTestCustomerInsertAsync function executes a stored procedure ([Cabinet].[TestCustomer_Insert]) for inserting customer details into a database. It wraps this logic in a Promise to allow asynchronous handling.
This function:
- Prepares a stored procedure object (sProc) with the necessary parameters.
- Executes the stored procedure.
- Resolves or rejects a Promise based on the success or failure of the execution.
function execTestCustomerInsertAsync(sProc, arrCustomerDetails)
{
    return new Promise((resolve, reject) => {
        try {
            sProc.ProcedureName = "[Cabinet].[TestCustomer_Insert]";
            // Define and add the parameters: [name, data type, size, value]
            let parameterArr = [
                ["Name", ftString, 100, arrCustomerDetails.Name],
                ["Address", ftString, 500, arrCustomerDetails.Address],
                ["City", ftString, 50, arrCustomerDetails.City],
                ["State", ftString, 50, arrCustomerDetails.State],
                ["ZipCode", ftString, 50, arrCustomerDetails.ZipCode],
                ["PhoneNumber", ftLargeInt, 50, arrCustomerDetails.PhoneNumber],
                ["EmailAddress", ftString, 100, arrCustomerDetails.Email],
                ["TaxCodeID", ftInteger, 50, arrCustomerDetails.TaxCode],
                ["TermsID", ftInteger, 50, arrCustomerDetails.TermsID],
                ["CreateBy", ftString, 50, arrCustomerDetails.CreateBy],
                ["CreateWhere", ftString, 100, arrCustomerDetails.CreateWhere],
                ["UpdateBy", ftString, 100, arrCustomerDetails.UpdateBy],
                ["UpdateWhere", ftString, 100, arrCustomerDetails.UpdateWhere],
                ["ShippingZoneID", ftInteger, 50, arrCustomerDetails.ShippingZoneID],
                ["RunNumber", ftInteger, 50, arrCustomerDetails.RunNumber],
                ["ShipViaTypeID", ftInteger, 50, arrCustomerDetails.ShipViaTypeID]
            ];
            parameterArr.forEach((item, i) => {
                sProc.Parameters.AddParameter();
                sProc.Parameters.Items(i).name = item[0];
                sProc.Parameters.Items(i).DataType = item[1];
                sProc.Parameters.Items(i).Size = item[2];
                sProc.Parameters.Items(i).Value = item[3];
            });
            // Execute the procedure
            sProc.ExecProc();
            // Resolve the promise upon success
            resolve(`Customer ${arrCustomerDetails.Name} inserted successfully`);
        }
        catch (err) {
            // Reject the promise if there is an error
            Log.Warning(`Error inserting customer ${arrCustomerDetails.Name}: ${err.message}`);
            reject(`Error inserting customer ${arrCustomerDetails.Name}: ${err.message}`);
        }
    });
}
3. Preventing Locking Issues in ADO
When dealing with concurrent operations in a multi-user environment, locking can become a major issue. ADO provides several mechanisms to handle and prevent locking:
Isolation Levels
Isolation levels control how transactions are isolated from each other. Using the appropriate isolation level helps avoid unnecessary locks.
For example, Read Committed isolation prevents dirty reads while still allowing shared locks. In ADO the isolation level is a property of the connection and must be set before the transaction is started (BeginTrans itself takes no isolation argument):
conn.IsolationLevel = adXactReadCommitted; // applies to the next transaction
conn.BeginTrans();                         // Start the transaction
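Here is a minimal sketch of wrapping an insert in an explicit Read Committed transaction, assuming TestComplete's getActiveXObject helper is available for creating a raw ADO Connection; the connection string matches the one used above, and the SQL text you pass in is up to you:
// Sketch: explicit transaction with Read Committed isolation (adXactReadCommitted = 4096).
function insertWithReadCommitted(sqlText)
{
    const conn = getActiveXObject("ADODB.Connection");
    conn.ConnectionString = "Provider=MSOLEDBSQL;Data Source=ProdSQL;Initial Catalog=TestingTeam;Integrated Security=SSPI";
    conn.Open();
    conn.IsolationLevel = 4096; // adXactReadCommitted
    conn.BeginTrans();
    try {
        conn.Execute(sqlText);  // e.g. an INSERT statement
        conn.CommitTrans();     // commit promptly so locks are released
    }
    catch (err) {
        conn.RollbackTrans();   // undo on failure so no locks linger
        throw err;
    }
    finally {
        conn.Close();
    }
}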
Locking Modes
Locking modes define how and when records are locked:
- Optimistic Locking (adLockOptimistic): Locks are applied only when updates are made, allowing for concurrent reads.
- Pessimistic Locking (adLockPessimistic): Locks are applied as soon as the record is read.
sProc.LockType = adLockOptimistic; // Use optimistic locking to minimize contention
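The same idea applies when you open a recordset rather than a stored procedure. A small illustrative sketch using a raw ADO Recordset (again assuming getActiveXObject; the table and column names here are hypothetical):
// Sketch: open a recordset with optimistic locking (adLockOptimistic = 3).
// Rows are only locked at Update time, so concurrent readers are not blocked.
const rs = getActiveXObject("ADODB.Recordset");
rs.Open("SELECT TOP 1 * FROM Cabinet.TestCustomer",   // hypothetical table
        "Provider=MSOLEDBSQL;Data Source=ProdSQL;Initial Catalog=TestingTeam;Integrated Security=SSPI",
        1,   // adOpenKeyset cursor
        3);  // adLockOptimistic - lock only when Update() is called
if (!rs.EOF) {
    rs.Fields.Item("Name").Value = rs.Fields.Item("Name").Value + " (updated)";
    rs.Update();  // the row lock is taken and released here
}
rs.Close();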
Connection Pooling
Connection pooling reduces the overhead of repeatedly opening and closing connections. With OLE DB providers such as MSOLEDBSQL, session pooling is enabled by default: as long as every caller uses an identical connection string, closed connections are returned to a pool and reused rather than re-established, which limits the number of simultaneous open connections and helps reduce locking issues. Pooling can also be controlled explicitly through the OLE DB Services keyword:
Provider=MSOLEDBSQL;Data Source=ProdSQL;Initial Catalog=TestingTeam;Integrated Security=SSPI;OLE DB Services=-1;
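An easy way to make sure pooling actually kicks in is to keep the connection string in a single constant so every stored procedure object uses an identical string. A short sketch along the lines of the code above (createCustomerProc is just an illustrative helper name):
// Keep one canonical connection string so the OLE DB provider can serve
// each new sProc from the session pool instead of opening a fresh connection.
const CONNECTION_STRING =
    "Provider=MSOLEDBSQL;Data Source=ProdSQL;Initial Catalog=TestingTeam;" +
    "Integrated Security=SSPI;OLE DB Services=-1;";

function createCustomerProc() {
    let sProc = ADO.CreateADOStoredProc();
    sProc.ConnectionString = CONNECTION_STRING; // identical string -> pooled session reuse
    sProc.CommandTimeout = 120;
    return sProc;
}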
4. Batch Processing for Large Data Sets:
Processing too many connections at once can overwhelm your database. To avoid this, you can limit the number of concurrent connections by processing customers in batches.
async function runProceduresForCustomerInBatches(customerDetailsArray, batchSize = 10) {
    try {
        for (let i = 0; i < customerDetailsArray.length; i += batchSize) {
            const batch = customerDetailsArray.slice(i, i + batchSize);
            const promises = batch.map(async (customerDetails) => {
                let sProc = ADO.CreateADOStoredProc();
                sProc.ConnectionString = "Provider=MSOLEDBSQL;Data Source=ProdSQL;Initial Catalog=TestingTeam;Integrated Security=SSPI";
                sProc.CommandTimeout = 120;
                const insertResult = await execTestCustomerInsertAsync(sProc, customerDetails);
                Log.Message(insertResult);
                sProc.Close();
            });
            await Promise.all(promises); // Finish each batch of customers before starting the next
        }
        Log.Checkpoint('All procedures completed successfully.');
    } catch (err) {
        Log.Warning('Error occurred during procedure execution:', err);
    }
}
This ensures that you control the number of concurrent operations, reducing the load on your database while still improving performance over sequential execution.
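To use the batched version, MakeCustomers only needs its final call swapped out; for example (the batch size of 10 is arbitrary):
// In MakeCustomers, replace the direct call with the batched version:
// runProceduresForCustomer(customerDetailsArray);
runProceduresForCustomerInBatches(customerDetailsArray, 10); // insert 10 customers at a time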
Test Log Screenshots...
Conclusion:
By leveraging TestComplete’s ADO support with asynchronous JavaScript, connection pooling, and batch processing, you can efficiently handle large-scale database operations. These techniques not only help prevent locking issues but also improve the performance and scalability of your database interactions.
With these best practices in place, you can confidently manage high-volume data insertions and concurrent operations in a multi-user environment.
Christopher S. Holmes
Dev Support Analyst
WW Wood Products Inc.