Upload a sample file to `POST /api/v1/pipeline` to register a pipeline without writing any JSON by hand.
## How It Works
The service inspects the uploaded file and derives a schema from it — for CSV/delimited files the AI infers field names and types, while JSON and XML files are wrapped in a single string field — then registers the resulting pipeline via `POST /api/v1/pipeline`.
## Endpoint
| Parameter | Type | Required | Description |
|---|---|---|---|
| `file` | form-data (file) | Yes | The file to analyze |
| `pipeline` | query | No | Pipeline name. If omitted, derived from the filename (lowercased, non-alphanumeric characters replaced with `_`) |
| `delimiter` | query | No | Column delimiter for delimited files. Defaults to `,` |
| `header` | query | No | Whether the file has a header row. Defaults to `false` |
| `x-api-key` | header | No | API key (required if `useApiKeys: true`) |
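The default-name derivation described for `pipeline` can be reproduced in the shell. This is a sketch applied to the full filename; whether the service strips the extension first is not stated above, and the example filename is hypothetical:

```shell
# Derive the default pipeline name from a filename:
# lowercase it, then replace every non-alphanumeric character with _.
f="Sales Report-2024.csv"
echo "$f" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/_/g'
# -> sales_report_2024_csv
```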
## Schema Rules by File Type
| File type | Schema | Default destination |
|---|---|---|
| CSV / delimited | AI infers field names and types from file content | PostgreSQL (`usePostgres: true`) |
| JSON (`.json`) | Single field: `_json` (type `string`) | MongoDB (`useMongoDB: true`) |
| XML (`.xml`) | Single field: `_xml` (type `string`) | MongoDB (`useMongoDB: true`) |
Supported field types: `boolean`, `int`, `bigint`, `float`, `double`, `string`, `date`, `timestamp`.
## Example: CSV File
Replace `DATABASE_NAME`, `SCHEMA_NAME`, and `TABLE_NAME` with real values, then register the pipeline.
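A sketch of the registration call for a delimited file, using the parameters from the endpoint table. The host, port, filename, and pipeline name are placeholders, not values from this document:

```shell
# Register a pipeline from a headered CSV file; the AI infers the schema
# and the destination defaults to PostgreSQL (usePostgres: true).
curl -X POST "http://localhost:8080/api/v1/pipeline?pipeline=orders&header=true&delimiter=," \
  -H "x-api-key: $API_KEY" \
  -F "file=@orders.csv"
```

The `x-api-key` header is only needed when `useApiKeys: true` is set.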
## Example: JSON File
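For JSON the schema is fixed (a single `_json` string field), so no AI inference is involved. A sketch of the call, with host and filename as placeholders:

```shell
# Register a pipeline from a JSON file; destination defaults to MongoDB
# (useMongoDB: true), and the pipeline name is derived from the filename.
curl -X POST "http://localhost:8080/api/v1/pipeline" \
  -H "x-api-key: $API_KEY" \
  -F "file=@events.json"
```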
## Configuration
AI schema generation is disabled by default. To enable it, set `ai.enabled: true` in `application.yaml` (or `docker/config/application.yaml` for Docker deployments).
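A minimal sketch of that configuration. The nesting is an assumption based on the property names used in this document (`ai.enabled`, `provider`, `aiSecretName`), and `ai-credentials` is a hypothetical secret name:

```yaml
ai:
  enabled: true
  provider: anthropic          # or openai; any other value fails at startup
  aiSecretName: ai-credentials # credentials read from secret/<aiSecretName>
```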
Provide the AI provider credentials as a secret (`secret/<aiSecretName>`) with the following keys:
| Key | Description |
|---|---|
| `endpoint` | The AI provider API URL |
| `model` | The model name to use |
| `apiKey` | The API key for authentication |
If `ai.enabled: true` and the secret is missing or any key is absent, startup fails with a descriptive error.
## Supported Providers
| Provider | `provider` value | Auth header |
|---|---|---|
| Anthropic Claude | `anthropic` | `x-api-key` + `anthropic-version: 2023-06-01` |
| OpenAI | `openai` | `Authorization: Bearer` |
Any other `provider` value causes startup to fail with an unsupported-provider error.
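To sanity-check a key outside the service, the same headers work against the providers' public endpoints. The URLs and model names below are the providers' published defaults, not values taken from this document:

```shell
# Anthropic: x-api-key plus the anthropic-version header.
curl -s https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"claude-3-5-haiku-latest","max_tokens":8,"messages":[{"role":"user","content":"ping"}]}'

# OpenAI: a single Authorization bearer header.
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"ping"}]}'
```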