# v1 → v2 Migration Guide
## Why the change
MapPrism v1 sent job_type webhooks directly to n8n — one workflow per job type. That became rigid fast. v2 adds Ordo as an orchestration layer between MapPrism and n8n.
Ordo owns job state. n8n workers read pending steps from PostgreSQL and run them. The frontend doesn't talk to n8n at all.
From a product perspective, nothing changes:
1. Upload to MinIO
2. POST to start processing
3. Poll for status
4. Read output from MinIO
The change is in steps 2 and 3.
## What changed: API

### v1 API

```
POST /webhook/n8n
GET /progress/{job_id}
```

Submit a job (v1):

```json
POST /webhook/n8n
{
"job_type": "POTREE_CONVERSION",
"payload": {
"input": "development/pineydam/mbes.las",
"output": "development/pineydam/mbes"
}
}
```

Poll status (v1):

```json
GET /progress/{job_id}
{
"job_type": "POTREE_CONVERSION",
"status": "completed",
"progress": 100,
"output": "development/pineydam/mbes"
}
```

### v2 API

```
POST /ordo/jobs
GET /ordo/jobs/:id
GET /ordo/jobs
GET /health
```

Submit a job (v2):

```json
POST /ordo/jobs
{
"recipe": { ... },
"inputs": {
"job:input_las": {
"type": "las",
"uri": "development/pineydam/mbes.las",
"hash": "abc123"
}
},
"params": { ... },
"outputs": { ... }
}
```

Response: `{ "id": 4 }`

Poll status (v2):

```json
GET /ordo/jobs/4
{
"job": { "id": 4, "status": "success", ... },
"progress": { "percentage": 1, "completed_steps": 2, "total_steps": 2 },
"steps": [ ... ],
"artifacts": [ ... ],
"outputs": [ ... ]
}
```

## v1 job_type → v2 recipe mapping
| v1 job_type | v2 recipe name | Notes |
|---|---|---|
| POTREE_CONVERSION | potree-conversion | See build-potree |
| ENTWINE_CONVERSION | ept-conversion | See build-ept |
| COPC_CONVERSION | copc-conversion | See build-copc |
| POTREE_CONVERSION_AND_REPROJECTION | reproject-potree | See reproject-and-build-potree |
| ENTWINE_CONVERSION_AND_REPROJECTION | reproject-ept | See reproject-and-build-ept |
| DATASET_INFO | (not a standalone recipe) | Dataset info now runs in parallel within all recipes automatically |
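For programmatic migration, the table above can be transcribed into a small lookup. This is a sketch: the `v2_recipe_for` helper is ours, not part of either API.

```python
# v1 job_type -> v2 recipe name, transcribed from the mapping table above.
V1_TO_V2_RECIPE = {
    "POTREE_CONVERSION": "potree-conversion",
    "ENTWINE_CONVERSION": "ept-conversion",
    "COPC_CONVERSION": "copc-conversion",
    "POTREE_CONVERSION_AND_REPROJECTION": "reproject-potree",
    "ENTWINE_CONVERSION_AND_REPROJECTION": "reproject-ept",
}

def v2_recipe_for(job_type: str) -> str:
    """Return the v2 recipe name for a v1 job_type.

    DATASET_INFO is rejected: in v2 it is not a standalone recipe and
    runs automatically inside every preset recipe.
    """
    if job_type == "DATASET_INFO":
        raise ValueError("DATASET_INFO runs automatically inside every v2 recipe")
    try:
        return V1_TO_V2_RECIPE[job_type]
    except KeyError:
        raise ValueError(f"unknown v1 job_type: {job_type!r}") from None
```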
## Submitting the equivalent of a v1 POTREE_CONVERSION
v1:
```json
POST /webhook/n8n
{
"job_type": "POTREE_CONVERSION",
"payload": {
"input": "development/pineydam/mbes.las",
"output": "development/pineydam/mbes"
}
}
```

v2:

```bash
curl -X POST https://dev.mapprism.com/ordo/jobs \
-H "Authorization: Bearer <API_TOKEN>" \
-H "Content-Type: application/json" \
-d '{
"recipe": {
"name": "potree-conversion",
"version": "3.0.0",
"definition": {
"recipe": [
{
"id": "build_potree",
"type": "BUILD_POTREE",
"inputs": { "input_las": "job:input_las" },
"outputs": { "output_potree": "step:build_potree.output_potree" }
},
{
"id": "dataset_info",
"type": "DATASET_INFO",
"inputs": { "input_las": "job:input_las" },
"outputs": { "metadata": "step:dataset_info.metadata" }
}
],
"on_exit": {
"id": "call_webhook",
"type": "CALL_WEBHOOK",
"inputs": { "waits_for": "step:dataset_info.metadata" },
"param_keys": ["webhook_url"]
}
}
},
"inputs": {
"job:input_las": {
"type": "las",
"uri": "development/pineydam/mbes.las",
"hash": "abc123"
}
},
"params": {
"call_webhook": {
"webhook_url": "https://myapp.example.com/hooks/done"
}
},
"outputs": {
"step:build_potree.output_potree": {
"path": "development/pineydam/mbes"
}
}
}'
```

## Submitting the equivalent of a v1 POTREE_CONVERSION_AND_REPROJECTION
v1:
```json
POST /webhook/n8n
{
"job_type": "POTREE_CONVERSION_AND_REPROJECTION",
"payload": {
"input": "development/pineydam/mbes.las",
"output": "development/pineydam/mbes",
"source_epsg": "EPSG:2271",
"target_epsg": "EPSG:3857"
}
}
```

v2: Use the `reproject-potree` recipe and add `source_epsg`/`target_epsg` under `params.reproject`. See reproject-and-build-potree.
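As a sketch, the EPSG fields of the v1 payload map into the v2 `params` block like this (the `reproject` param key follows the description above; the helper name is ours, and the rest of the request body follows the potree-conversion example):

```python
def reproject_params(v1_payload: dict) -> dict:
    """Lift the v1 source_epsg/target_epsg fields into the v2
    params.reproject block."""
    return {
        "reproject": {
            "source_epsg": v1_payload["source_epsg"],
            "target_epsg": v1_payload["target_epsg"],
        }
    }
```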
## What's new in v2
**Dataset info is automatic.** All preset recipes now run DATASET_INFO in parallel with the main conversion step. You no longer need to submit a separate job to get metadata.

**Webhook is built in.** All preset recipes include CALL_WEBHOOK in on_exit. Pass `webhook_url` under `params.call_webhook` to receive a notification when the job completes.

**Full job state on poll.** The `GET /ordo/jobs/:id` response includes step-level status, per-step timing, all artifacts, and output destinations — much richer than the v1 progress endpoint.
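For example, a client can collapse that response into a one-line status summary. This is a sketch against the sample poll response shown earlier; the `job_summary` helper is ours:

```python
def job_summary(resp: dict) -> str:
    """One-line summary from a GET /ordo/jobs/:id response body."""
    job, progress = resp["job"], resp["progress"]
    return (
        f"job {job['id']}: {job['status']} "
        f"({progress['completed_steps']}/{progress['total_steps']} steps)"
    )
```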
**Composable pipelines.** You can write custom recipes that combine any registered executors in any order. See Writing a Recipe.
## Base URL change
| Environment | v1 | v2 |
|---|---|---|
| Dev | https://dev.mapprism.com/webhook/n8n | https://dev.mapprism.com/ordo/jobs |
## Auth change
v1: auth was embedded in the webhook endpoint configuration.

v2: all requests require an `Authorization: Bearer <API_TOKEN>` header (except `GET /health`).
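A minimal sketch of building authenticated v2 requests with only the standard library (the base URL is the dev value from the table above; `ordo_request` is our name, not part of the API):

```python
import urllib.request

BASE_URL = "https://dev.mapprism.com"  # dev base URL from the table above

def ordo_request(path, token=None):
    """Build a v2 API request.

    Every endpoint except GET /health requires the Bearer token header.
    """
    req = urllib.request.Request(BASE_URL + path)
    if path != "/health":
        if token is None:
            raise ValueError(f"{path} requires an API token")
        req.add_header("Authorization", f"Bearer {token}")
    return req
```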