# Pipeline SDK

SDK reference for Pipeline: a data ingestion and transformation service that processes streaming events with SQL and delivers them to object storage in analytics-ready formats.
## Import

```typescript
import { $ } from 'sdk.do'
```
## Operations

### create

Create a new data pipeline.

```typescript
const result = await $.Pipeline.create({
  // parameters
})
```
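The fields accepted by `create` are not documented here. As an illustrative assumption only (a name, a SQL transformation, and an R2 destination — not the SDK's actual contract), the parameters might look like:

```typescript
// Hypothetical shape of the create() parameters — field names are
// assumptions for illustration, not the documented SDK contract.
interface PipelineConfig {
  name: string
  sql: string
  destination: { bucket: string; format: 'json' | 'parquet' }
}

const config: PipelineConfig = {
  name: 'clickstream',
  sql: "SELECT * FROM events WHERE type = 'click'",
  destination: { bucket: 'analytics', format: 'parquet' },
}
```

With a shape like this, the call would be `await $.Pipeline.create(config)`.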
### delete

Permanently remove the pipeline.

```typescript
const result = await $.Pipeline.delete({
  // parameters
})
```
### start

Start processing events through the pipeline.

```typescript
const result = await $.Pipeline.start({
  // parameters
})
```
### pause

Temporarily stop processing events.

```typescript
const result = await $.Pipeline.pause({
  // parameters
})
```
### resume

Resume a paused pipeline.

```typescript
const result = await $.Pipeline.resume({
  // parameters
})
```
### stop

Stop the pipeline and flush remaining events.

```typescript
const result = await $.Pipeline.stop({
  // parameters
})
```
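Taken together, `start`, `pause`, `resume`, and `stop` imply a simple lifecycle. A minimal sketch of that state machine, assuming states and transitions that mirror the operation names (the SDK's actual state model is not documented here):

```typescript
// Hypothetical lifecycle states implied by the operations above.
type PipelineState = 'created' | 'running' | 'paused' | 'stopped'

// Allowed transitions: start from created, pause/resume while active,
// stop from running or paused (stop also flushes remaining events).
const transitions: Record<string, PipelineState> = {
  'created:start': 'running',
  'running:pause': 'paused',
  'paused:resume': 'running',
  'running:stop': 'stopped',
  'paused:stop': 'stopped',
}

function next(state: PipelineState, op: string): PipelineState {
  const target = transitions[`${state}:${op}`]
  if (!target) throw new Error(`invalid operation ${op} in state ${state}`)
  return target
}
```

Under this model, calling `resume` on a pipeline that was never paused would be rejected rather than silently ignored.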
### ingest

Send an event to the pipeline for processing.

```typescript
const result = await $.Pipeline.ingest({
  // parameters
})
```
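`ingest` sends one event per call, so high-volume producers often buffer events client-side and flush them in batches. A minimal buffering sketch — the flush callback stands in for whatever `ingest` calls you make, and the batch size is an arbitrary choice, not an SDK limit:

```typescript
// Buffers events and hands them to a flush callback in fixed-size batches.
class EventBuffer<T> {
  private buf: T[] = []

  constructor(
    private size: number,
    private flush: (batch: T[]) => void,
  ) {}

  push(event: T): void {
    this.buf.push(event)
    if (this.buf.length >= this.size) this.drain()
  }

  // Flush whatever is buffered, even a partial batch.
  drain(): void {
    if (this.buf.length === 0) return
    this.flush(this.buf)
    this.buf = []
  }
}
```

A producer would call `drain()` on shutdown so a partial final batch is not lost.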
### transform

Update the SQL transformation query.

```typescript
const result = await $.Pipeline.transform({
  // parameters
})
```
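The query text below is illustrative only — the table and column names are assumptions, and the SQL dialect the service accepts is not documented here. It shows the typical shape of a transformation: project a few fields and filter out noise before delivery.

```typescript
// Illustrative transformation: project fields and drop heartbeat events.
// Table/column names are assumptions, not part of the documented API.
const sql = `
SELECT user_id, event_type, timestamp
FROM events
WHERE event_type != 'heartbeat'
`.trim()
```

A call shaped like `await $.Pipeline.transform({ sql })` would then apply it to the pipeline.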
### monitor

Get pipeline metrics (throughput, errors, latency).

```typescript
const result = await $.Pipeline.monitor({
  // parameters
})
```
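The exact metrics payload is not documented here. Assuming counter-style fields like the ones below (names are illustrative), a caller might derive an error rate for alerting:

```typescript
// Hypothetical metrics shape — field names are illustrative assumptions,
// not the documented monitor() response.
interface PipelineMetrics {
  eventsIn: number
  eventsOut: number
  errors: number
  p95LatencyMs: number
}

// Fraction of ingested events that errored; 0 when nothing was ingested.
function errorRate(m: PipelineMetrics): number {
  return m.eventsIn === 0 ? 0 : m.errors / m.eventsIn
}
```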
### logs

Retrieve pipeline execution logs.

```typescript
const result = await $.Pipeline.logs({
  // parameters
})
```
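The log entry shape is not documented here. Assuming entries carry a level, timestamp, and message (an illustrative shape, not the SDK's), a common post-processing step is narrowing to recent errors:

```typescript
// Hypothetical log entry shape — fields are illustrative assumptions.
interface LogEntry {
  level: 'info' | 'warn' | 'error'
  timestamp: number // epoch milliseconds
  message: string
}

// Keep only error-level entries at or after the given timestamp.
function errorsSince(entries: LogEntry[], since: number): LogEntry[] {
  return entries.filter((e) => e.level === 'error' && e.timestamp >= since)
}
```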
## Events

### created

Triggered when a new pipeline is created.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.created(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.created({ /* data */ })
```
### deleted

Triggered when a pipeline is removed.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.deleted(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.deleted({ /* data */ })
```
### started

Triggered when the pipeline begins processing.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.started(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.started({ /* data */ })
```
### paused

Triggered when the pipeline is temporarily stopped.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.paused(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.paused({ /* data */ })
```
### resumed

Triggered when a paused pipeline resumes processing.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.resumed(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.resumed({ /* data */ })
```
### stopped

Triggered when the pipeline is stopped.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.stopped(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.stopped({ /* data */ })
```
### ingested

Triggered when an event is received by the pipeline.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.ingested(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.ingested({ /* data */ })
```
### transformed

Triggered when an event is processed by the SQL transform.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.transformed(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.transformed({ /* data */ })
```
### written

Triggered when a batch is written to R2.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.written(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.written({ /* data */ })
```
### failed

Triggered when the pipeline encounters an error.

```typescript
import { on, send } from 'sdk.do'

// Subscribe to event
on.Pipeline.failed(async (data) => {
  console.log('Event received:', data)
})

// Emit event
await send.Pipeline.failed({ /* data */ })
```
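A `failed` handler that restarts the pipeline usually waits with exponential backoff rather than retrying immediately. A minimal backoff calculator, as a sketch — the base delay and cap are arbitrary choices, not SDK behavior:

```typescript
// Exponential backoff with a cap: 1s, 2s, 4s, ... up to 30s.
function backoffMs(attempt: number, baseMs = 1000, capMs = 30000): number {
  return Math.min(capMs, baseMs * 2 ** attempt)
}
```

Inside `on.Pipeline.failed`, a handler could track the attempt count, sleep for `backoffMs(attempt)`, and then call `$.Pipeline.start` again.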