The manual action is this:
POST /rest/v1/employees/{id}/actions/move-to-region
{
"region": 123
}
{id} = the unique identifier (employee.id) of the employee to move
region = the unique identifier (region.id) of the destination region
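If you want to exercise that single-record action from a script first, here is a minimal sketch; the base URL, token, and bearer-token auth scheme are placeholders you would swap for your own instance and credentials:

import requests

BASE_URL = "https://your-instance.example.com"  # placeholder instance URL
TOKEN = "YOUR_API_TOKEN"                        # placeholder credential

def move_employee_to_region(employee_id: int, region_id: int) -> None:
    # POST /rest/v1/employees/{id}/actions/move-to-region with the region in the body
    resp = requests.post(
        f"{BASE_URL}/rest/v1/employees/{employee_id}/actions/move-to-region",
        json={"region": region_id},
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()

move_employee_to_region(100, 12)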
But you can use a /batch/file call to do this in bulk, up to 50 operations at a time. For, say, 18,000 records, that works out to 360 calls.
One of the tricks we use at TrackTik is to build a CSV where each row is an employee to move, with columns for the action, resource, actionName, lookup/id, and data/region values. You can then run it through a CSV-to-JSON converter to build the array of operations.
All you need is the destination region ID; use whatever API calls you like to gather the list of employee IDs, then lay the rows out like this in a spreadsheet (notice that column D, the list of employee.id values, is the only variable data column):
Saved as a CSV, that produces:
action,resource,actionName,lookup/id,data/region
EXECUTE,employees,move-to-region,100,12
EXECUTE,employees,move-to-region,101,12
EXECUTE,employees,move-to-region,102,12
EXECUTE,employees,move-to-region,103,12
EXECUTE,employees,move-to-region,104,12
You can convert this into a JSON array with your framework of choice, or with a third-party tool like https://www.convertcsv.com/csv-to-json.htm (a scripted version follows the array below):
[
  {
    "action": "EXECUTE",
    "resource": "employees",
    "actionName": "move-to-region",
    "lookup": {
      "id": 100
    },
    "data": {
      "region": 12
    }
  },
  {
    "action": "EXECUTE",
    "resource": "employees",
    "actionName": "move-to-region",
    "lookup": {
      "id": 101
    },
    "data": {
      "region": 12
    }
  },
  {
    "action": "EXECUTE",
    "resource": "employees",
    "actionName": "move-to-region",
    "lookup": {
      "id": 102
    },
    "data": {
      "region": 12
    }
  },
  {
    "action": "EXECUTE",
    "resource": "employees",
    "actionName": "move-to-region",
    "lookup": {
      "id": 103
    },
    "data": {
      "region": 12
    }
  },
  {
    "action": "EXECUTE",
    "resource": "employees",
    "actionName": "move-to-region",
    "lookup": {
      "id": 104
    },
    "data": {
      "region": 12
    }
  }
]
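If you'd rather script that conversion than rely on an online tool, here is a minimal sketch in Python; the employees.csv filename is a placeholder, and the column names match the CSV header shown above:

import csv
import json

operations = []
with open("employees.csv", newline="") as f:
    for row in csv.DictReader(f):
        operations.append({
            "action": row["action"],                      # EXECUTE
            "resource": row["resource"],                  # employees
            "actionName": row["actionName"],              # move-to-region
            "lookup": {"id": int(row["lookup/id"])},      # the employee to move
            "data": {"region": int(row["data/region"])},  # the destination region
        })

print(json.dumps(operations, indent=2))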
Prepend the following just to the left of the opening [ of that array, and add a matching closing } at the very bottom:
{
"onFailure": "CONTINUE",
"operations":
And that’s your full payload. Keep it to at most 50 employees per payload and repeat until you’ve worked through all 18,000 (a scripted version follows the sample payload below).
Putting it all together, a sample /batch/file payload looks like this:
{
  "onFailure": "CONTINUE",
  "operations": [
    {
      "action": "EXECUTE",
      "resource": "employees",
      "actionName": "move-to-region",
      "data": {
        "region": 12
      },
      "lookup": {
        "id": 101
      }
    },
    {
      "action": "EXECUTE",
      "resource": "employees",
      "actionName": "move-to-region",
      "data": {
        "region": 12
      },
      "lookup": {
        "id": 102
      }
    },
    etc.
  ]
}
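If you are scripting the whole run, a sketch like this wraps the operations into payloads of 50 and posts each one. It carries over BASE_URL, TOKEN, and the operations list from the sketches above, and it assumes the batch endpoint lives at /rest/v1/batch/file, i.e. under the same prefix as the single-record action:

import math
import requests

BATCH_SIZE = 50
# For 18,000 operations this is ceil(18000 / 50) = 360 requests.
print(math.ceil(len(operations) / BATCH_SIZE), "batches to send")

for start in range(0, len(operations), BATCH_SIZE):
    payload = {
        "onFailure": "CONTINUE",
        "operations": operations[start:start + BATCH_SIZE],
    }
    resp = requests.post(
        f"{BASE_URL}/rest/v1/batch/file",
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()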
A happy-path response for each POST to /batch/file looks like this:
{
  "data": {
    "hasErrors": false,
    "success": [
      {
        "resource": "employees",
        "resourceId": "100",
        "data": null
      },
      {
        "resource": "employees",
        "resourceId": "101",
        "data": null
      },
      etc.
    ],
    "processErrors": [],
    "validationErrors": []
  }
}
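Since onFailure is set to CONTINUE, a batch can partially succeed, so it is worth inspecting each response body rather than only the HTTP status. Here is a minimal sketch based on the happy-path shape above; the exact shape of the error entries is not shown, so this just prints them raw:

body = resp.json()["data"]
if body["hasErrors"]:
    # With onFailure=CONTINUE, failures land here instead of aborting the batch.
    for err in body["processErrors"] + body["validationErrors"]:
        print("failed:", err)
else:
    print(f"moved {len(body['success'])} employees in this batch")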