Standard batch import API
The standard batch import API is useful for automated uploads, and for developing scripts which automatically upload batches of files after extracting data from an external system.
There are three steps:
- Create a new batch job
- Upload the files
- Schedule the import job to run
See the example for a script which uses curl to run a standard data import batch.
You’ll need an API key, which you can generate in the data import UI. After creation, API keys are managed in System management; scroll down to view the current API keys.
Create a new batch job
Once the control file is uploaded, create a new import batch.
/api/haplo-data-import-standard-api/batch
(POST only) This expects a normal application/x-www-form-urlencoded request body, with parameters:
comment | A short comment describing the batch import.
kind | The kind identifier of the API, as shown on the relevant API page within the application.
On success, a 200 status code is returned with the identifier of the batch as the response body.
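For example, a batch could be created with curl along these lines. This is a minimal sketch: the hostname, username, comment text, kind identifier and EXAMPLE_KEY are placeholders, and it assumes the API key is supplied as the password for HTTP Basic authentication (check how API keys are authenticated in your application).

curl -s -u haplo:EXAMPLE_KEY \
  -d "comment=Nightly import from HR system" \
  -d "kind=example-kind" \
  https://yourapp.example.com/api/haplo-data-import-standard-api/batch

The identifier printed by this command is the batch value used in the requests below.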
Upload one or more files
Using the batch identifier, upload one or more files to the batch by repeated calls to this endpoint.
/api/haplo-data-import-standard-api/file
(POST only) This expects a multipart/form-data request body, with parameters:
batch | The batch identifier, as returned by the batch creation endpoint.
name | The name of the file, matching a name specified in the control file, so the data import framework knows how to read it.
file | The data file.
On success, a 200 status code is returned with the digest of the data file as the response body.
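Continuing the sketch above, a file could be uploaded like this, where BATCH_ID is the identifier returned by the batch creation call and people is a hypothetical name declared in the control file:

curl -s -u haplo:EXAMPLE_KEY \
  -F "batch=BATCH_ID" \
  -F "name=people" \
  -F "file=@people.json" \
  https://yourapp.example.com/api/haplo-data-import-standard-api/file

curl's -F option sends the request as multipart/form-data, and the @ prefix uploads the contents of the local file people.json.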
Schedule the import job
After all the files are uploaded, schedule the job to run:
/api/haplo-data-import-standard-api/schedule
(POST only) This expects a normal application/x-www-form-urlencoded request body, with parameters:
batch | The batch identifier, as returned by the batch creation endpoint.
mode | (optional) If set to ‘dry-run’, run the import in dry run mode.
On success, a 200 status code is returned with SCHEDULED as the response body.
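Following the same sketch and placeholders as above, the job could then be scheduled with:

curl -s -u haplo:EXAMPLE_KEY \
  -d "batch=BATCH_ID" \
  -d "mode=dry-run" \
  https://yourapp.example.com/api/haplo-data-import-standard-api/schedule

Omit the mode parameter to run a real import rather than a dry run.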
The batch will be run as soon as possible, and the log will be visible in the admin UI.