Example script

This example script uses the batch import API to create and run a batch data import job. It uses curl on a UNIX-like operating system.

Create an API key with:

Your Name » Data import » Batch import » Create API key…

then replace the values of API_KEY and SERVER in the script below with the generated key and the hostname of your application.

#!/bin/sh
set -e

API_KEY=API_KEY_GOES_HERE
FILE_DIRECTORY=.
SERVER=research.example.ac.uk

# Upload the control file
CONTROL_DIGEST=$(curl -s -X POST -F "comment=Example control" -F "file=@${FILE_DIRECTORY}/control.json" --user "haplo:${API_KEY}" "https://${SERVER}/api/haplo-data-import-batch/control")
echo "CONTROL DIGEST: $CONTROL_DIGEST"

# Create a new batch
BATCH_ID=$(curl -s -X POST -d "comment=Example import" -d "control=$CONTROL_DIGEST" --user "haplo:${API_KEY}" "https://${SERVER}/api/haplo-data-import-batch/batch")
echo "BATCH ID: $BATCH_ID"

# Upload files to the batch
curl -X POST -F "batch=$BATCH_ID" -F "name=data0" -F "file=@${FILE_DIRECTORY}/data.json" --user "haplo:${API_KEY}" "https://${SERVER}/api/haplo-data-import-batch/file"
echo
# Repeat for each file with different name= parameters

# Schedule the batch. This does a dry run only; remove '-d mode=dry-run' to import the data for real
curl -X POST -d "batch=$BATCH_ID" -d "mode=dry-run" --user "haplo:${API_KEY}" "https://${SERVER}/api/haplo-data-import-batch/schedule"
echo
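For anything beyond a quick test, it is worth adding some error handling: curl exits with status 0 even when the server returns an HTTP error unless -f/--fail is given, so `set -e` alone will not stop the script on a rejected request. The sketch below shows one way to harden the upload steps; the `require_value` and `upload_file` helper names, and the data0/data1 file names in the commented loop, are illustrative and not part of the API.

```shell
#!/bin/sh
set -e

API_KEY=API_KEY_GOES_HERE
FILE_DIRECTORY=.
SERVER=research.example.ac.uk

# Abort with a message if a required value is empty, for example
# because a request returned no body. ($1 = label, $2 = value)
require_value() {
  if [ -z "$2" ]; then
    echo "ERROR: $1 was empty - check API_KEY and SERVER" >&2
    exit 1
  fi
}

# Upload one data file to a batch. -f makes curl exit non-zero on an
# HTTP error status, so 'set -e' stops the script; -sS keeps output
# quiet but still prints error messages.
# ($1 = batch ID, $2 = name= parameter, $3 = path to file)
upload_file() {
  curl -fsS -X POST -F "batch=$1" -F "name=$2" -F "file=@$3" \
    --user "haplo:${API_KEY}" "https://${SERVER}/api/haplo-data-import-batch/file"
  echo
}

# Example usage after creating the batch, with one name= parameter per
# file (hypothetical data0.json, data1.json - adjust to your files):
# require_value "batch ID" "$BATCH_ID"
# for NAME in data0 data1; do
#   upload_file "$BATCH_ID" "$NAME" "${FILE_DIRECTORY}/${NAME}.json"
# done
```

The same `require_value` check can be applied to CONTROL_DIGEST immediately after uploading the control file, so a failed upload is reported before the script tries to create a batch with an empty control= parameter.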