Bulk/Collection Processing in Workflows
Last Updated: January 9, 2026
Learn how to process large collections (hundreds to thousands of items) efficiently in BPM workflows using batch processing patterns.
Overview
When building workflows that need to process collections of items (e.g., 2000 loan accounts for a collections platform), you have two architectural choices:
| Approach | Description | Recommendation |
|---|---|---|
| Multiple Workflows | Create separate workflow instance for each item | ❌ NOT RECOMMENDED |
| Single Workflow with Iteration | One workflow that iterates through collection | ✅ RECOMMENDED |
❌ Anti-Pattern: Multiple Workflow Instances
Don't Do This
Implementation details removed for security.
Contact support for implementation guidance.
Why This Is Bad
- ❌ Database Overhead: 2000 records in the `ProcessInstance` table
- ❌ Memory Issues: Massive memory consumption
- ❌ No Progress Tracking: Can't see overall completion status
- ❌ No Atomicity: Can't roll back the entire operation
- ❌ Complex Error Handling: Must track failures across 2000 instances
- ❌ Performance: Slow startup and coordination overhead
✅ Recommended Pattern: Single Workflow with Batching
Workflow Architecture
┌──────────────────────────────────────────────┐
│     Bulk Collection Processing Workflow      │
│              (Single Instance)               │
└──────────────────────────────────────────────┘
                      │
                      ▼
               ┌─────────────┐
               │ Start Event │
               └──────┬──────┘
                      │
                      ▼
        ┌────────────────────────────┐
        │     Initialize Batches     │
        │        (ScriptTask)        │
        │  • Split into batches      │
        │  • Set counters to 0       │
        │  • Calculate total         │
        └─────────────┬──────────────┘
                      │
                      ▼
        ┌────────────────────────────┐
        │     Has More Batches?      │◄───────────────┐
        │    (Exclusive Gateway)     │                │
        └──────┬──────────────┬──────┘                │
           YES │              │ NO                    │
               ▼              ▼                       │
  ┌─────────────────────┐  ┌─────────────────────┐    │
  │    Process Batch    │  │   Generate Report   │    │
  │     (ScriptTask)    │  │     (ScriptTask)    │    │
  │ • Loop through 100  │  └──────────┬──────────┘    │
  │ • Execute commands  │             │               │
  │ • Track results     │             ▼               │
  └──────────┬──────────┘  ┌─────────────────────┐    │
             │             │      Send Email     │    │
             ▼             │      (SendTask)     │    │
  ┌─────────────────────┐  └──────────┬──────────┘    │
  │   Update Progress   │             │               │
  │     (ScriptTask)    │             ▼               │
  │ • Log progress      │  ┌─────────────────────┐    │
  │ • Update database   │  │      End Event      │    │
  └──────────┬──────────┘  └─────────────────────┘    │
             │                                        │
             └──────── (Loop back to Gateway) ────────┘
Implementation Steps
Step 1: Initialize Batches (ScriptTask)
Task Name: Initialize Batches
Task Type: ScriptTask
// Split collection into manageable batches
var BATCH_SIZE = 100; // Process 100 items at a time
var items = context.items; // Input: collection of 2000 items
return {
  allItems: items,
  totalItems: items.length,
  batchSize: BATCH_SIZE,
  currentBatchIndex: 0,
  totalBatches: Math.ceil(items.length / BATCH_SIZE),
  processedCount: 0,
  successCount: 0,
  failureCount: 0,
  results: [],
  startTime: Date.now() // captured here so the final report can calculate duration
};
What It Does:
- Takes input collection (e.g., 2000 loan accounts)
- Calculates number of batches (2000 ÷ 100 = 20 batches)
- Initializes tracking counters
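For the 2000-item example, the initializer yields 20 batches. Here is a minimal, standalone sketch of the same batch math (the `splitIntoBatchMetadata` helper name is illustrative, not a platform API):

```javascript
// Illustrative only: mirrors the batch math used in the Initialize Batches task.
function splitIntoBatchMetadata(items, batchSize) {
  return {
    totalItems: items.length,
    batchSize: batchSize,
    totalBatches: Math.ceil(items.length / batchSize), // 2000 / 100 -> 20
    currentBatchIndex: 0
  };
}

var meta = splitIntoBatchMetadata(new Array(2000), 100);
console.log(meta.totalBatches); // 20
```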
Step 2: Check If More Batches Exist (Exclusive Gateway)
Gateway Name: Has More Batches?
Gateway Type: Exclusive (XOR)
Outgoing Flow 1 (YES):
- Condition: `context.currentBatchIndex < context.totalBatches`
- Target: Process Batch task
Outgoing Flow 2 (NO):
- Condition: `context.currentBatchIndex >= context.totalBatches`
- Target: Generate Report task
// Gateway expression
context.currentBatchIndex < context.totalBatches
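The gateway simply compares the batch counter to the batch total. Expressed as a plain JavaScript predicate for illustration (the gateway itself evaluates the expression above, not a function call):

```javascript
// Illustrative predicate mirroring the gateway condition.
function hasMoreBatches(context) {
  return context.currentBatchIndex < context.totalBatches;
}

// With 2000 items and a batch size of 100 (20 batches), the loop takes the
// YES branch for indexes 0..19 and exits to Generate Report at index 20.
console.log(hasMoreBatches({ currentBatchIndex: 19, totalBatches: 20 })); // true
console.log(hasMoreBatches({ currentBatchIndex: 20, totalBatches: 20 })); // false
```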
Step 3: Process Batch (ScriptTask)
Task Name: Process Batch
Task Type: ScriptTask
// Get current batch of items
var startIdx = context.currentBatchIndex * context.batchSize;
var endIdx = Math.min(startIdx + context.batchSize, context.totalItems);
var batch = context.allItems.slice(startIdx, endIdx);
console.log('Processing batch ' + (context.currentBatchIndex + 1) +
' of ' + context.totalBatches +
' (items ' + startIdx + ' to ' + endIdx + ')');
// Process each item in this batch
batch.forEach(function(item) {
try {
// Execute business logic command
var result = doCmd('ProcessLoanDefault', {
loanId: item.loanId,
customerId: item.customerId,
amountDue: item.amountDue
});
if (result.isSuccessful) {
context.successCount++;
context.results.push({
itemId: item.id,
status: 'SUCCESS',
message: result.message
});
} else {
context.failureCount++;
context.results.push({
itemId: item.id,
status: 'FAILED',
error: result.message
});
}
} catch (error) {
context.failureCount++;
context.results.push({
itemId: item.id,
status: 'ERROR',
error: error.message
});
}
context.processedCount++;
});
// Move to next batch
return {
currentBatchIndex: context.currentBatchIndex + 1,
processedCount: context.processedCount,
successCount: context.successCount,
failureCount: context.failureCount,
results: context.results
};
What It Does:
- Extracts current batch (e.g., items 0-99, then 100-199, etc.)
- Loops through each item in the batch using `forEach`
- Executes a business command for each item using `doCmd()`
- Tracks success/failure for each item
- Updates counters and moves to the next batch
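The start/end index arithmetic is the part that most often goes wrong, so here is the slicing logic as a standalone sketch (the `getBatch` helper name is illustrative):

```javascript
// Illustrative only: same start/end index math as the Process Batch task.
function getBatch(allItems, batchIndex, batchSize) {
  var startIdx = batchIndex * batchSize;
  var endIdx = Math.min(startIdx + batchSize, allItems.length); // exclusive
  return allItems.slice(startIdx, endIdx);
}

var items = [];
for (var i = 0; i < 2000; i++) { items.push({ id: i }); }

console.log(getBatch(items, 0, 100).length); // 100 -> items 0..99
console.log(getBatch(items, 19, 100)[0].id); // 1900 -> final batch covers 1900..1999
```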
Step 4: Update Progress (ScriptTask)
Task Name: Update Progress
Task Type: ScriptTask
// Calculate progress percentage
var percentComplete = (context.processedCount / context.totalItems) * 100;
console.log('══════════════════════════════════════');
console.log('Progress: ' + percentComplete.toFixed(2) + '% complete');
console.log('Processed: ' + context.processedCount + ' of ' + context.totalItems);
console.log('Success: ' + context.successCount);
console.log('Failed: ' + context.failureCount);
console.log('══════════════════════════════════════');
// Optional: Update progress tracking table
doCmd('UpdateProcessingProgress', {
processId: context.processInstanceId,
percentComplete: percentComplete,
itemsProcessed: context.processedCount,
itemsSucceeded: context.successCount,
itemsFailed: context.failureCount,
currentBatch: context.currentBatchIndex,
totalBatches: context.totalBatches
});
return context;
What It Does:
- Calculates completion percentage
- Logs progress to console
- Optionally updates database for UI tracking
- Passes context to next iteration
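As a worked example of the calculation above: after five batches of 100 items out of 2000, progress is 500 / 2000 = 25%. A minimal sketch (the helper name is illustrative):

```javascript
// Illustrative only: same formula as the Update Progress task.
function percentComplete(processedCount, totalItems) {
  return (processedCount / totalItems) * 100;
}

console.log(percentComplete(500, 2000).toFixed(2) + '%'); // "25.00%"
```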
Step 5: Generate Report (ScriptTask)
Task Name: Generate Report
Task Type: ScriptTask
// Create summary report
var report = {
processId: context.processInstanceId,
processDate: context.processDate,
totalItems: context.totalItems,
processedItems: context.processedCount,
successCount: context.successCount,
failureCount: context.failureCount,
successRate: ((context.successCount / context.totalItems) * 100).toFixed(2) + '%',
failureRate: ((context.failureCount / context.totalItems) * 100).toFixed(2) + '%',
duration: calculateDuration(context.startTime, Date.now()),
results: context.results
};
// Store report in database
doCmd('SaveBulkProcessingReport', {
report: report
});
// Set context variable for email task
return {
reportSummary: report,
reportHtml: generateReportHtml(report)
};
function calculateDuration(start, end) {
var durationMs = end - start;
var seconds = Math.floor(durationMs / 1000);
var minutes = Math.floor(seconds / 60);
return minutes + ' minutes ' + (seconds % 60) + ' seconds';
}
function generateReportHtml(report) {
return '<h2>Bulk Processing Report</h2>' +
'<p><strong>Total Items:</strong> ' + report.totalItems + '</p>' +
'<p><strong>Success:</strong> ' + report.successCount + ' (' + report.successRate + ')</p>' +
'<p><strong>Failed:</strong> ' + report.failureCount + ' (' + report.failureRate + ')</p>' +
'<p><strong>Duration:</strong> ' + report.duration + '</p>';
}
What It Does:
- Creates comprehensive summary report
- Calculates success/failure rates
- Saves report to database
- Prepares HTML for email notification
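Because every item outcome is recorded in `context.results` with a `status` field, the report stage can also pull out just the failed items for follow-up. A short sketch, assuming the result shape shown in Step 3:

```javascript
// Illustrative only: extract failed/errored items from the accumulated results.
var failures = context.results.filter(function (r) {
  return r.status === 'FAILED' || r.status === 'ERROR';
});

console.log('Items needing review: ' + failures.length);
// e.g. attach `failures` to the report or feed them into a retry run
```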
Step 6: Send Email Notification (SendTask)
Task Name: Send Report Email
Task Type: SendTask (or ScriptTask using SendMailCommand)
// Send email with report
doCmd('SendMail', {
to: ['admin@bank.com', 'operations@bank.com'],
subject: 'Bulk Processing Completed - ' + context.processDate,
body: context.reportHtml,
attachments: [
{
filename: 'bulk-processing-report.json',
content: JSON.stringify(context.reportSummary, null, 2)
}
]
});
return {
emailSent: true,
emailSentAt: new Date().toISOString()
};
What It Does:
- Sends email notification to administrators
- Includes HTML summary in body
- Attaches detailed JSON report
Real-World Example: Collections Platform
Scenario
Process 2000 overdue loan accounts and apply penalties.
Process Definition
Process ID: BulkLoanDefaultProcessing
Process Name: Bulk Loan Default Processing
Input Variables
{
"loans": [
{ "loanId": 1, "accountNumber": "LA001", "customerId": 100, "daysOverdue": 35 },
{ "loanId": 2, "accountNumber": "LA002", "customerId": 101, "daysOverdue": 42 },
// ... 1998 more items
],
"processDate": "2026-01-09T10:00:00Z",
"initiatedBy": "admin@bank.com",
"penaltyRate": 0.05, // 5% penalty
"notificationEmails": ["collections@bank.com", "admin@bank.com"]
}
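With `penaltyRate` set to 0.05, the penalty applied later in the batch script is simply 5% of a loan's overdue amount; for example, an overdue balance of 1,200 incurs a penalty of 60. A quick sketch of that arithmetic (the overdue figure is hypothetical):

```javascript
// Illustrative only: same penalty calculation used in the Process Batch script below.
var penaltyRate = 0.05;   // from the input variables
var totalOverdue = 1200;  // hypothetical overdue amount for one loan
var penaltyAmount = totalOverdue * penaltyRate;

console.log(penaltyAmount); // 60
```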
Complete Script for Process Batch Task
// ============================================
// Process Batch - Collections Platform
// ============================================
var startIdx = context.currentBatchIndex * context.batchSize;
var endIdx = Math.min(startIdx + context.batchSize, context.loans.length);
var currentBatch = context.loans.slice(startIdx, endIdx);
console.log('▶ Processing batch ' + (context.currentBatchIndex + 1) +
' of ' + context.totalBatches);
// Process each loan in batch
currentBatch.forEach(function(loan) {
try {
// Step 1: Check if loan is still active
var loanDetails = doCmd('GetLoanDetails', {
loanId: loan.loanId
});
if (!loanDetails.isSuccessful) {
throw new Error('Failed to retrieve loan details');
}
if (loanDetails.data.status === 'CLOSED') {
context.skippedCount = (context.skippedCount || 0) + 1;
context.results.push({
loanId: loan.loanId,
accountNumber: loan.accountNumber,
status: 'SKIPPED',
reason: 'Loan already closed'
});
context.processedCount++;
return; // Skip this loan
}
// Step 2: Get loan schedules to calculate overdue amount
var schedules = doCmd('GetExistingLoanSchedules', {
accountNumber: loan.accountNumber,
isExport: true // Get all schedules
});
if (!schedules.isSuccessful) {
throw new Error('Failed to retrieve loan schedules');
}
// Step 3: Calculate total overdue amount
var totalOverdue = 0;
var overdueSchedules = [];
schedules.schedules.forEach(function(schedule) {
if (schedule.scheduleState === 'Late') {
totalOverdue += schedule.totalDue;
overdueSchedules.push({
dueDate: schedule.dueDate,
amount: schedule.totalDue
});
}
});
// Step 4: Apply penalty if overdue amount exists
if (totalOverdue > 0) {
var penaltyAmount = totalOverdue * context.penaltyRate;
var penaltyResult = doCmd('ApplyLoanPenalty', {
loanId: loan.loanId,
accountNumber: loan.accountNumber,
amount: penaltyAmount,
reason: 'Bulk default processing - ' + context.processDate,
overdueAmount: totalOverdue,
daysOverdue: loan.daysOverdue
});
if (penaltyResult.isSuccessful) {
context.successCount++;
context.results.push({
loanId: loan.loanId,
accountNumber: loan.accountNumber,
customerId: loan.customerId,
status: 'SUCCESS',
overdueAmount: totalOverdue,
penaltyApplied: penaltyAmount,
daysOverdue: loan.daysOverdue,
overdueSchedulesCount: overdueSchedules.length
});
// Optional: Send SMS notification to customer
doCmd('SendSMS', {
customerId: loan.customerId,
message: 'Your loan ' + loan.accountNumber + ' has a penalty of ' +
penaltyAmount + ' applied due to ' + loan.daysOverdue +
' days overdue.'
});
} else {
context.failureCount++;
context.results.push({
loanId: loan.loanId,
accountNumber: loan.accountNumber,
status: 'FAILED',
error: penaltyResult.message,
overdueAmount: totalOverdue
});
}
} else {
// No overdue amount - skip
context.skippedCount = (context.skippedCount || 0) + 1;
context.results.push({
loanId: loan.loanId,
accountNumber: loan.accountNumber,
status: 'SKIPPED',
reason: 'No overdue amount found'
});
}
context.processedCount++;
} catch (error) {
context.failureCount++;
context.results.push({
loanId: loan.loanId,
accountNumber: loan.accountNumber,
status: 'ERROR',
error: error.message
});
context.processedCount++;
}
});
// Return updated context
return {
currentBatchIndex: context.currentBatchIndex + 1,
processedCount: context.processedCount,
successCount: context.successCount,
failureCount: context.failureCount,
skippedCount: context.skippedCount || 0,
results: context.results
};
Starting the Bulk Process
From Backend Service
Implementation details removed for security.
Contact support for implementation guidance.
From API Controller
Implementation details removed for security.
Contact support for implementation guidance.
Monitoring Progress
Check Progress from API
Implementation details removed for security.
Contact support for implementation guidance.
Progress Tracking UI (Example)
<!-- Progress Bar Component -->
<div class="bulk-process-progress">
<h3>Bulk Processing: Loan Default Penalties</h3>
<div class="progress-bar">
<div class="progress-fill" style="width: {{ percentComplete }}%">
{{ percentComplete }}%
</div>
</div>
<div class="progress-stats">
<div class="stat">
<span class="label">Total Items:</span>
<span class="value">{{ totalItems }}</span>
</div>
<div class="stat">
<span class="label">Processed:</span>
<span class="value">{{ processedItems }}</span>
</div>
<div class="stat success">
<span class="label">Success:</span>
<span class="value">{{ successCount }}</span>
</div>
<div class="stat failure">
<span class="label">Failed:</span>
<span class="value">{{ failureCount }}</span>
</div>
<div class="stat">
<span class="label">Batch:</span>
<span class="value">{{ currentBatch }} / {{ totalBatches }}</span>
</div>
</div>
</div>
<script>
// Poll for progress updates; keep the interval handle so it can be cleared
const pollInterval = setInterval(async () => {
  const response = await fetch(`/api/collections/bulk-process/${processId}/status`);
  const status = await response.json();
  // Update UI
  updateProgressUI(status);
  // Stop polling when complete (clearInterval needs the handle, not `this`)
  if (status.isComplete) {
    clearInterval(pollInterval);
    showCompletionNotification(status);
  }
}, 5000); // Check every 5 seconds
</script>
Error Handling Strategies
Strategy 1: Continue on Error (Recommended)
// Process all items, collect failures for review
currentBatch.forEach(function(item) {
try {
var result = doCmd('ProcessItem', { itemId: item.id });
if (result.isSuccessful) {
context.successCount++;
} else {
context.failureCount++;
context.results.push({
itemId: item.id,
status: 'FAILED',
error: result.message
});
}
} catch (error) {
context.failureCount++;
context.results.push({
itemId: item.id,
status: 'ERROR',
error: error.message
});
}
context.processedCount++;
});
Use When: You want to process as many items as possible and review failures later.
Strategy 2: Stop on First Error
// Stop entire process if any item fails
currentBatch.forEach(function(item) {
var result = doCmd('ProcessItem', { itemId: item.id });
if (!result.isSuccessful) {
throw new Error('Processing failed at item ' + item.id + ': ' + result.message);
}
context.successCount++;
context.processedCount++;
});
Use When: Operations must be atomic (all or nothing).
Strategy 3: Stop if Error Rate Exceeds Threshold
// Monitor error rate and stop if too high
var errorRate = context.failureCount / context.processedCount;
if (errorRate > 0.10) { // Stop if >10% failure rate
throw new Error(
'Error rate exceeded 10% threshold. ' +
'Failed: ' + context.failureCount + ' of ' + context.processedCount + '. ' +
'Stopping bulk process for investigation.'
);
}
// Continue processing...
Use When: High failure rate indicates systemic issue that needs investigation.
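One refinement worth considering (not part of the snippet above): guard against dividing by zero and wait for a minimum sample before judging the error rate, so a single early failure cannot trip the threshold. A hedged sketch, with an assumed minimum sample size:

```javascript
// Illustrative variant: only evaluate the error rate after a minimum sample size.
var MIN_SAMPLE = 100;      // assumed: don't judge before 100 items are processed
var MAX_ERROR_RATE = 0.10; // 10% threshold, as in the snippet above

if (context.processedCount >= MIN_SAMPLE) {
  var errorRate = context.failureCount / context.processedCount;
  if (errorRate > MAX_ERROR_RATE) {
    throw new Error('Error rate ' + (errorRate * 100).toFixed(1) +
      '% exceeded threshold after ' + context.processedCount + ' items.');
  }
}
// Continue processing...
```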
Performance Optimization
1. Batch Size Selection
| Collection Size | Recommended Batch Size | Rationale |
|---|---|---|
| < 100 items | Process all at once | Minimal overhead |
| 100 - 1,000 | 50 - 100 items | Balance memory and progress |
| 1,000 - 10,000 | 100 - 200 items | Frequent progress updates |
| > 10,000 | 200 - 500 items | Reduce loop overhead |
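If you prefer to pick the batch size programmatically, the table above translates directly into a small helper (illustrative; tune the boundaries to your own workload):

```javascript
// Illustrative only: batch size chosen from the guidance table above.
function recommendedBatchSize(totalItems) {
  if (totalItems < 100) return totalItems; // process all at once
  if (totalItems <= 1000) return 100;      // 50 - 100 range
  if (totalItems <= 10000) return 200;     // 100 - 200 range
  return 500;                              // 200 - 500 range
}

console.log(recommendedBatchSize(2000)); // 200
```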
2. Avoid N+1 Query Problem
// ❌ BAD: Query inside loop (2000 queries)
items.forEach(function(item) {
var details = doCmd('GetItemDetails', { id: item.id });
processItem(details);
});
// ✅ GOOD: Bulk fetch before loop (1 query)
var allIds = items.map(function(i) { return i.id; });
var detailsMap = doCmd('GetItemDetailsBulk', { ids: allIds });
items.forEach(function(item) {
var details = detailsMap[item.id];
processItem(details);
});
3. Use Caching for Lookup Data
// Cache reference data that doesn't change
$.cache.set('penaltyRate', 0.05, { ttl: 3600 });
$.cache.set('currencyRates', getCurrencyRates(), { ttl: 3600 });
// In processing loop
currentBatch.forEach(function(item) {
var penaltyRate = $.cache.get('penaltyRate');
var rate = $.cache.get('currencyRates')[item.currency];
// Process with cached data
var penalty = item.amount * penaltyRate * rate;
});
4. Store Results Incrementally
// ❌ BAD: Keep all results in memory
context.results.push(result); // Can cause out-of-memory for 10,000+ items
// ✅ GOOD: Store results in database immediately
doCmd('LogBulkProcessingResult', {
processId: context.processInstanceId,
itemId: item.id,
result: result,
timestamp: new Date().toISOString()
});
Best Practices Summary
✅ DO
- Use single workflow for bulk operations
- Process in batches (50-200 items per batch)
- Log progress after each batch
- Continue on error and collect failures
- Pre-fetch data to avoid N+1 queries
- Store results incrementally for large collections
- Use caching for lookup data
- Set realistic timeouts for long-running processes (see the sketch after this list)
- Send notifications when complete
- Track metrics (duration, success rate, error patterns)
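For the timeout point above, one hedged approach (assuming `startTime` is captured at initialization, as in Step 1) is to check the elapsed time at the top of each batch and stop cleanly instead of letting the engine kill the process:

```javascript
// Illustrative only: abort cleanly if the run exceeds a maximum duration.
var MAX_DURATION_MS = 2 * 60 * 60 * 1000; // assumed 2-hour budget

if (Date.now() - context.startTime > MAX_DURATION_MS) {
  throw new Error('Bulk process exceeded ' + (MAX_DURATION_MS / 60000) +
    ' minute budget after ' + context.processedCount + ' of ' +
    context.totalItems + ' items.');
}
// ...otherwise continue with the next batch
```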
❌ DON'T
- Create separate workflow per item
- Process all items without batching
- Keep all results in memory
- Abort on first error (unless atomicity required)
- Make database calls inside loops
- Ignore error patterns
- Run without progress tracking
- Process without timeout limits
Testing Your Bulk Process
Unit Test Example
Implementation details removed for security.
Contact support for implementation guidance.
Integration Test with Smaller Dataset
Implementation details removed for security.
Contact support for implementation guidance.
Conclusion
For processing large collections (hundreds to thousands of items) in BPM workflows:
- ✅ Create ONE workflow instance - Not one per item
- ✅ Use batching - Process 50-200 items at a time
- ✅ Implement progress tracking - Update after each batch
- ✅ Use Exclusive Gateway loop - Continue until all batches processed
- ✅ Handle errors gracefully - Collect failures, don't stop the process
- ✅ Optimize database access - Avoid N+1 queries
- ✅ Send notifications - Email report when complete
This pattern provides:
- Scalability: Handles thousands of items efficiently
- Reliability: Continues processing despite individual failures
- Visibility: Real-time progress tracking
- Maintainability: Single workflow definition, easy to modify
- Performance: Optimized database access and memory usage
Happy bulk processing! 🚀