Commit b35860e
feature: Added Safe Bulk Record Update with Progress Tracking snippet
1 parent 7c55c08 commit b35860e

File tree: 2 files changed (+154, -0 lines)

Lines changed: 94 additions & 0 deletions
@@ -0,0 +1,94 @@
# Safe Bulk Record Update with Logging

## Overview
Efficiently update multiple records in batches with error handling, progress tracking, and logging to prevent timeouts and data loss.

## What It Does
- Updates records in configurable batch sizes
- Logs progress for monitoring
- Handles individual record errors without stopping the batch
- Prevents script timeouts through batch processing
- Tracks success/failure counts
- Logs detailed error information

## Use Cases
- Bulk data migrations
- Mass field updates after deployment
- Scheduled bulk corrections
- Data cleanup operations
- Batch status updates across records

## Files
- `bulk_update_with_progress.js` - Background Script for safe bulk updates

## How to Use

### Option 1: Run as Background Script
1. Go to **System Diagnostics > Script Background**
2. Copy code from `bulk_update_with_progress.js`
3. Modify the table name and query filter
4. Execute and monitor logs

### Option 2: Create as Scheduled Job
1. Go to **System Scheduler > Scheduled Jobs**
2. Create a new job with the script code (see the sketch after this list)
3. Schedule it for off-peak hours
4. Logs will be available in System Logs
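
For reference, the same scheduled job can also be created from a script. This is a minimal sketch, not part of the repository, and it assumes the standard Scheduled Script Execution table (`sysauto_script`) and its `name`, `script`, `run_type`, and `run_time` fields; verify the field names on your instance.

```javascript
// Hypothetical sketch only: create a daily Scheduled Script Execution that
// runs the bulk update at 02:00 instance time. Field names assume the
// out-of-box sysauto_script table; verify them before relying on this.
var job = new GlideRecord('sysauto_script');
job.initialize();
job.setValue('name', 'Bulk Update - Off-Peak');                  // example job name
job.setValue('script', '// paste bulk_update_with_progress.js here');
job.setValue('run_type', 'daily');                                // run once per day
job.setValue('run_time', '02:00:00');                             // off-peak start time
job.setValue('active', true);
job.insert();
```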

## Example Usage
```javascript
// Customize these variables:
var TABLE = 'incident';
var FILTER = "priority=1^state=2"; // Your query condition
var BATCH_SIZE = 100;
var FIELD_TO_UPDATE = 'assignment_group'; // Field to update
var NEW_VALUE = '123456789abc'; // New value

// Run the script - it handles everything else
```
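
Before committing to the update, it can help to preview what the filter will touch. A minimal dry-run sketch (not part of the repository) using the same `TABLE` and `FILTER` variables:

```javascript
// Hypothetical dry run: list a few records the filter would match,
// without updating anything.
var preview = new GlideRecord(TABLE);
preview.addEncodedQuery(FILTER);
preview.setLimit(5);
preview.query();
while (preview.next()) {
    gs.print('[Dry Run] Would update ' + preview.getUniqueValue() +
             ' (' + preview.getDisplayValue() + ')');
}
```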

## Key Features
- **Batch Processing**: Prevents timeouts by processing records in chunks
- **Error Resilience**: Continues on error and logs details
- **Progress Tracking**: Logs progress after each batch of records
- **Flexible**: Works with any table and field
- **Safe**: Won't crash on individual record failures
- **Auditable**: Detailed logging of all operations

## Output Examples
```
[Bulk Update Started] Table: incident | Filter: priority=1
[Progress] Updated 100 records | Success: 95 | Errors: 5
[Progress] Updated 200 records | Success: 192 | Errors: 8
[Bulk Update Complete] Total: 250 | Success: 242 | Errors: 8
[Failed Record] 7af24b9c: User already has assignment
[Failed Record] 8bd35c8d: Invalid assignment group
```
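
To review these entries after a run, they can be pulled back out of the System Log. A minimal sketch (not part of the repository), assuming the script logged with source `BulkUpdate` to the standard `syslog` table:

```javascript
// Hypothetical sketch: read recent bulk-update log entries for review.
var logGr = new GlideRecord('syslog');
logGr.addQuery('source', 'BulkUpdate');
logGr.orderByDesc('sys_created_on');
logGr.setLimit(20);
logGr.query();
while (logGr.next()) {
    gs.print(logGr.getValue('sys_created_on') + '  ' + logGr.getValue('message'));
}
```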

## Performance Notes
- A batch size of 100 is a reasonable default for most tables
- Adjust the batch size based on available resources
- Run large updates during maintenance windows (see the record-count sketch below)
- Monitor the system logs during execution
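
To gauge how large an update will be before scheduling it, count the matching records first. A minimal sketch (not part of the repository) using `GlideAggregate` with the same `TABLE` and `FILTER` variables:

```javascript
// Hypothetical sketch: count how many records the filter matches.
var agg = new GlideAggregate(TABLE);
agg.addEncodedQuery(FILTER);
agg.addAggregate('COUNT');
agg.query();
if (agg.next()) {
    gs.print('[Size Check] ' + agg.getAggregate('COUNT') + ' records match filter: ' + FILTER);
}
```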

## Customization
```javascript
// Change the batch size for your table size
var BATCH_SIZE = 50;  // For smaller batches
var BATCH_SIZE = 200; // For larger tables

// Different field update logic
record.setValue(FIELD_TO_UPDATE, NEW_VALUE);
// Or read the configuration from system properties via gs.getProperty() (see the sketch below)
```
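
As hinted above, the hard-coded values can come from system properties instead. A minimal sketch (not part of the repository); the property names `bulk_update.batch_size` and `bulk_update.new_value` are placeholders only:

```javascript
// Hypothetical sketch: read batch size and target value from system properties,
// falling back to defaults if the properties are not defined.
var BATCH_SIZE = parseInt(gs.getProperty('bulk_update.batch_size', '100'), 10);
var NEW_VALUE = gs.getProperty('bulk_update.new_value', '1');
```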

## Requirements
- ServiceNow instance
- Access to Background Scripts or Scheduled Jobs
- Write access to the target table
- Appropriate table and field permissions

## Related APIs
- [GlideRecord Query API](https://docs.servicenow.com/bundle/sandiego-application-development/page/app-store/dev_apps/concept/c_UsingGlideRecord.html)
- [GlideSystem Logging](https://docs.servicenow.com/bundle/sandiego-application-development/page/app-store/dev_apps/concept/c_SystemLog.html)
- [Best Practices for Bulk Operations](https://docs.servicenow.com/bundle/sandiego-application-development/page/app-store/dev_apps/concept/c_BulkOperations.html)
Lines changed: 60 additions & 0 deletions
@@ -0,0 +1,60 @@
// Background Script: Safe Bulk Record Update with Progress Tracking
// Purpose: Update multiple records safely with batch processing and error handling

var TABLE = 'incident'; // Change to your table
var FILTER = "priority=1"; // Add your filter conditions
var BATCH_SIZE = 100;
var FIELD_TO_UPDATE = 'state'; // Field to update
var NEW_VALUE = '1'; // Value to set

var successCount = 0;
var errorCount = 0;
var totalProcessed = 0;

gs.log('[Bulk Update Started] Table: ' + TABLE + ' | Filter: ' + FILTER, 'BulkUpdate');

try {
    var gr = new GlideRecord(TABLE);
    gr.addEncodedQuery(FILTER);
    gr.query();

    var recordsToProcess = [];
    while (gr.next()) {
        recordsToProcess.push(gr.getUniqueValue());

        // Process in batches to prevent timeout
        if (recordsToProcess.length === BATCH_SIZE) {
            processBatch(recordsToProcess);
            recordsToProcess = [];
        }
    }

    // Process remaining records
    if (recordsToProcess.length > 0) {
        processBatch(recordsToProcess);
    }

    gs.log('[Bulk Update Complete] Total: ' + totalProcessed + ' | Success: ' + successCount + ' | Errors: ' + errorCount, 'BulkUpdate');

} catch (e) {
    gs.logError('[Bulk Update Error] ' + e.toString(), 'BulkUpdate');
}

function processBatch(recordIds) {
    for (var i = 0; i < recordIds.length; i++) {
        try {
            var record = new GlideRecord(TABLE);
            // Guard against records deleted since the initial query;
            // updating an unloaded GlideRecord would insert a new row
            if (record.get(recordIds[i])) {
                record.setValue(FIELD_TO_UPDATE, NEW_VALUE);
                record.update();
                successCount++;
            } else {
                errorCount++;
                gs.log('[Failed Record] ' + recordIds[i] + ': record no longer exists', 'BulkUpdate');
            }
        } catch (error) {
            errorCount++;
            gs.log('[Failed Record] ' + recordIds[i] + ': ' + error.toString(), 'BulkUpdate');
        }
        totalProcessed++;
    }

    // Log progress after each batch
    gs.log('[Progress] Updated ' + totalProcessed + ' records | Success: ' + successCount + ' | Errors: ' + errorCount, 'BulkUpdate');
}
