Query job processing status by pipeline token or pipeline name.

Get Status by Pipeline Token

GET /api/v1/pipeline/status?pipelinetoken={token}
Parameters:
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| pipelinetoken | query | Yes* | Pipeline token from upload response |
Example:
```shell
curl "http://localhost:8080/api/v1/pipeline/status?pipelinetoken=pt-abc12345-..."
```
Response: 200 OK - an array of status entries, one per processing stage:
```json
[
  {
    "id": 1,
    "dateTime": "2026-03-15T10:00:00Z",
    "pipeline": "sales_data",
    "processName": "StreamNotifier",
    "publisherToken": null,
    "pipelineToken": "pt-abc12345-...",
    "filename": "sales_data",
    "state": "begin",
    "code": "begin",
    "description": "Process started",
    "epoch": 1773568800000
  },
  {
    "id": 2,
    "dateTime": "2026-03-15T10:00:03Z",
    "pipeline": "sales_data",
    "processName": "PostgresLoader",
    "publisherToken": null,
    "pipelineToken": "pt-abc12345-...",
    "filename": "sales_data",
    "state": "end",
    "code": "end",
    "description": "Process completed",
    "epoch": 1773568803000
  }
]
```
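Clients typically scan this array for failures before reading results. A minimal sketch in Python (the helper name and the trimmed sample entries, including the error description, are illustrative, not part of the API):

```python
def failed_stages(entries):
    """Return (processName, description) for every stage that reported an error."""
    return [(e["processName"], e["description"])
            for e in entries if e["state"] == "error"]

# Trimmed sample status array; the error description is made up for illustration.
sample = [
    {"processName": "StreamNotifier", "state": "begin", "description": "Process started"},
    {"processName": "PostgresLoader", "state": "error", "description": "Connection refused"},
]
print(failed_stages(sample))  # [('PostgresLoader', 'Connection refused')]
```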

Get Status by Pipeline Name

GET /api/v1/pipeline/status?pipelinename={name}&page={page}
Parameters:
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| pipelinename | query | Yes* | Pipeline name |
| page | query | No | Page number (default: 1) |
Example:
```shell
curl "http://localhost:8080/api/v1/pipeline/status?pipelinename=sales_data&page=1"
```
Response: 200 OK - an array of job summaries:
```json
[
  {
    "createdAtTimestamp": "2026-03-15T10:00:00Z",
    "createdAt": 1773568800000,
    "updatedAt": 1773568803000,
    "pipeline": "sales_data",
    "pipelineToken": "pt-abc12345-...",
    "process": "PostgresLoader",
    "startTime": "2026-03-15T10:00:00Z",
    "endTime": "2026-03-15T10:00:03Z",
    "totalTime": "3s",
    "status": "end"
  }
]
```

*Use either pipelinetoken or pipelinename, not both.
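The totalTime in each summary is simply the gap between the updatedAt and createdAt epoch values. A sketch of that derivation (the whole-second formatting is an assumption based on the sample, not a documented rule):

```python
def total_time(created_at_ms, updated_at_ms):
    # Both values are Unix epoch milliseconds; render whole seconds, as in the sample.
    return f"{(updated_at_ms - created_at_ms) // 1000}s"

print(total_time(1773568800000, 1773568803000))  # 3s
```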

Status Fields

Each status entry (PipelineStatus) contains:
| Field | Description |
| --- | --- |
| id | Entry index (internal) |
| dateTime | Human-readable timestamp |
| pipeline | Pipeline name |
| processName | Processing stage name (see below) |
| publisherToken | Publisher identifier (if provided on upload) |
| pipelineToken | Pipeline job token |
| filename | Source filename |
| state | Stage state: begin, processing, end, or error |
| code | Same as state |
| description | Detail message |
| epoch | Unix epoch milliseconds |
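epoch and dateTime encode the same instant; epoch suits arithmetic, dateTime display. A quick conversion sketch in Python (the sample value is illustrative):

```python
from datetime import datetime, timezone

epoch_ms = 1773568800000  # epoch field from a status entry
dt = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
print(dt.isoformat())  # 2026-03-15T10:00:00+00:00
```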

State Values

| State | Description |
| --- | --- |
| begin | Processing stage started |
| processing | In progress with detail message |
| end | Processing stage completed |
| error | Processing stage failed |

Process Names

| Process | Description |
| --- | --- |
| FileNotifier | File intake from MinIO bucket |
| StreamNotifier | Direct upload intake |
| DataQuality | Data quality validation |
| Transformation | Data transformation |
| JobRunner | Destination orchestration |
| PostgresLoader | PostgreSQL loading |
| MongoDBLoader | MongoDB loading |
| SparkObjectStoreLoader | Object store writing |
| KafkaLoader | Kafka producing |
| ActiveMQLoader | ActiveMQ queue writing |
| RestEndpointRunner | REST endpoint posting |
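A client that uploads data and then waits for the destination loader to finish can poll the status endpoint until it reaches a terminal state. A hedged sketch with an injectable fetcher (`poll_until_done` and its success criterion, the last entry reporting end, are assumptions, not part of the API):

```python
import time

def poll_until_done(fetch_status, interval=2.0, timeout=60.0):
    """Poll a status source until the job reaches a terminal state.

    fetch_status is any callable returning the status array; in practice it
    would GET /api/v1/pipeline/status?pipelinetoken=... and parse the JSON.
    Returns "error" if any stage failed, "end" once the last stage completes.
    """
    deadline = time.monotonic() + timeout
    while True:
        entries = fetch_status()
        if any(e["state"] == "error" for e in entries):
            return "error"
        if entries and entries[-1]["state"] == "end":
            return "end"
        if time.monotonic() >= deadline:
            raise TimeoutError("pipeline did not reach a terminal state in time")
        time.sleep(interval)
```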