
Overview

The @filefeed/sdk package provides a type-safe client for the FileFeed API.
  • Authentication via API key
  • Resources: Pipeline Runs, Clients, Schemas, Pipelines, Webhooks, Outbound Uploads
  • Pagination helpers and typed responses

API Reference

Explore endpoints with the API Playground.

Syncing Data

Fetch processed data and acknowledge runs.

Handling Errors

Retry patterns and structured error handling.

Installation

```shell
npm install @filefeed/sdk
```

Requirements

  • Node.js >= 18
  • TypeScript >= 5 (for TS projects)

Initialize the client

```typescript
import FileFeed from '@filefeed/sdk';

const filefeed = new FileFeed({ apiKey: process.env.FILEFEED_API_KEY! });

// Example usage
const runs = await filefeed.pipelineRuns.list({ status: 'completed', limit: 50 });
```

Resources and methods

For endpoint details and payload schemas, see the API Reference.
Clients

Manage SFTP clients and connection details.

| Method | Signature | Description |
| --- | --- | --- |
| list | `list()` | List clients. |
| retrieve | `retrieve(id)` | Get a client. |
| create | `create(params)` | Create a client with credentials. |
| update | `update(id, params)` | Update client details. |
| remove | `remove(id)` | Delete a client. |
| testConnection | `testConnection(id)` | Test SFTP connectivity (returns success/message). |

Common workflows

  • Provision a client with SFTP credentials
  • Rotate credentials, manage allowlists
  • Verify connectivity before onboarding

```typescript
const client = await filefeed.clients.create({ name: 'Acme' });
const ok = await filefeed.clients.testConnection(client.id);
```
Schemas

Define and validate your target data model.

| Method | Signature | Description |
| --- | --- | --- |
| list | `list()` | List schemas. |
| retrieve | `retrieve(id)` | Get a schema. |
| create | `create(params)` | Create a schema (fields, validation). |
| update | `update(id, params)` | Update schema details. |
| remove | `remove(id)` | Delete a schema. |
| validate | `validate({ schemaId, data })` | Validate a payload against a schema. |

Common workflows

  • Define required fields and validation rules
  • Validate a sample payload pre-ingestion

```typescript
const schema = await filefeed.schemas.create({
  name: 'Employees',
  fields: [
    { name: 'email', type: 'string', required: true },
    { name: 'name', type: 'string', required: true },
  ],
});
const result = await filefeed.schemas.validate({
  schemaId: schema.id,
  data: { email: 'a@b.com', name: 'Ada' },
});
```
Pipelines

Connect clients to schemas and define mappings/transforms.

| Method | Signature | Description |
| --- | --- | --- |
| list | `list({ clientId?, clientName? })` | List pipelines. |
| retrieve | `retrieve(id)` | Get a pipeline. |
| create | `create(params)` | Create a pipeline (mappings/transforms). |
| update | `update(id, params)` | Update pipeline configuration. |
| remove | `remove(id)` | Delete a pipeline. |
| toggleActive | `toggleActive(id)` | Enable/disable a pipeline. |

Common workflows

  • Create a pipeline for a client + schema
  • Adjust mappings, then toggle active for go-live

```typescript
const pipeline = await filefeed.pipelines.create({ name: 'Employees', clientId, schemaId, mappings: {} });
await filefeed.pipelines.toggleActive(pipeline.id);
```
Pipeline Runs

Manage processing jobs and retrieve processed data.

| Method | Signature | Description |
| --- | --- | --- |
| list | `list({ status?, clientId?, pipelineId?, pipelineName?, page?, limit? })` | Paginated runs. Filter by status/client/pipeline/ids. |
| retrieve | `retrieve(id)` | Get a single pipeline run. |
| getData | `getData({ pipelineRunId, offset?, limit? })` | Paginated processed rows (offset-based, up to 1000 per page). |
| ack | `ack({ pipelineRunId })` | Mark run as processed (idempotent). |
| reprocess | `reprocess({ pipelineRunId })` | Re-run processing for a given run. |
| getOriginalFileUrl | `getOriginalFileUrl({ pipelineRunId, expiresIn? })` | Presigned URL to the original file. |
| getProcessedFileUrl | `getProcessedFileUrl({ pipelineRunId, expiresIn? })` | Presigned URL to the processed file. |

Common workflows

  • Fetch completed runs, paginate data, then acknowledge the run
  • Download original or processed file for audit trails
  • Reprocess failed runs after fixing mapping or schema

```typescript
// Quick example
const runs = await filefeed.pipelineRuns.list({ status: 'completed', limit: 25 });
const page = await filefeed.pipelineRuns.getData({ pipelineRunId: runs.data[0].id, limit: 1000 });
```
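The quick example above stops after the first page. A complete sync step drains every page of a run and then acknowledges it; since `ack` is idempotent, re-running the whole step after a crash is safe. A minimal sketch (`syncRun` and the `RunsApi` shape are ours for illustration, not SDK exports — the SDK's `pipelineRuns` resource satisfies this shape):

```typescript
// Structural type matching the pipelineRuns methods used below.
interface RunsApi {
  getData(params: { pipelineRunId: string; offset?: number; limit?: number }): Promise<{ data: unknown[] }>;
  ack(params: { pipelineRunId: string }): Promise<unknown>;
}

// Drain all processed rows for one run, then mark it processed.
async function syncRun(runs: RunsApi, pipelineRunId: string): Promise<number> {
  let offset = 0;
  let total = 0;
  let hasMore = true;
  while (hasMore) {
    const page = await runs.getData({ pipelineRunId, offset, limit: 1000 });
    total += page.data.length; // process page.data here
    hasMore = page.data.length === 1000; // a full page means more rows may remain
    offset += page.data.length;
  }
  await runs.ack({ pipelineRunId }); // idempotent: safe to call again on retry
  return total;
}
```

Call it as `await syncRun(filefeed.pipelineRuns, run.id);` for each completed run.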
Webhooks

Receive signed notifications for pipeline events.

| Method | Signature | Description |
| --- | --- | --- |
| list | `list()` | List webhooks. |
| retrieve | `retrieve(id)` | Get a webhook. |
| create | `create(params)` | Create a webhook (URL, event type, secret). |
| update | `update(id, params)` | Update webhook configuration. |
| remove | `remove(id)` | Delete a webhook. |
| listDeliveries | `listDeliveries(params)` | Inspect delivery attempts and status codes. |

Common workflows

  • Create a webhook for pipeline events
  • Monitor delivery health and retry on failure

```typescript
const hook = await filefeed.webhooks.create({
  name: 'Pipeline events',
  url: 'https://example.com/webhooks/filefeed',
  eventType: 'GENERAL',
});
const deliveries = await filefeed.webhooks.listDeliveries({ webhookId: hook.id, page: 1, limit: 50 });
```
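Deliveries are signed with the webhook's secret so your endpoint can reject forgeries. The header name and signing scheme are not documented here, so the sketch below assumes a common convention (hex-encoded HMAC-SHA256 of the raw request body); confirm the actual format in the Webhooks guide before relying on it:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Assumed scheme: signature = hex(HMAC-SHA256(secret, rawBody)).
// Compare in constant time so attackers can't probe byte-by-byte.
function verifySignature(rawBody: string, signature: string, secret: string): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected, 'hex');
  const b = Buffer.from(signature, 'hex');
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Verify against the raw body bytes, not a re-serialized JSON object: re-serialization can reorder keys and break the signature.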
Outbound Uploads

Push JSON data into outbound pipelines via multipart uploads.

| Method | Signature | Description |
| --- | --- | --- |
| initUpload | `initUpload(params)` | Create an upload session. |
| uploadPart | `uploadPart(uploadId, partNumber, params)` | Upload one JSON array part. |
| completeUpload | `completeUpload(uploadId, params)` | Finalize and trigger processing. |
| abortUpload | `abortUpload(uploadId)` | Cancel and clean up parts. |
| getUploadStatus | `getUploadStatus(uploadId)` | Check session progress. |
| uploadJson | `uploadJson(params)` | Convenience: chunk, upload, and complete in one call. |

Common workflows

  • Push data from your backend into a pipeline
  • Chunk large datasets and upload in parts
  • Use uploadJson() for simple one-shot uploads

```typescript
const result = await filefeed.outbound.uploadJson({
  clientName: 'acme-corp',
  pipelineName: 'employee-sync',
  data: [{ remoteId: 'E001', firstName: 'Alice', lastName: 'Smith' }],
});
```
See the Outbound Flow guide for the full walkthrough.
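`uploadJson()` wraps the whole flow; when you drive `initUpload` / `uploadPart` / `completeUpload` yourself (for example, to stream parts as they are produced), you first split the rows into parts. A minimal chunking sketch (`chunkRows` and the part size are illustrative, not SDK values):

```typescript
// Split a row array into fixed-size parts, one per uploadPart call.
function chunkRows<T>(rows: T[], partSize: number): T[][] {
  if (!Number.isInteger(partSize) || partSize <= 0) {
    throw new Error('partSize must be a positive integer');
  }
  const parts: T[][] = [];
  for (let i = 0; i < rows.length; i += partSize) {
    parts.push(rows.slice(i, i + partSize));
  }
  return parts;
}
```

Each part is then sent in order via `uploadPart(uploadId, partNumber, …)`, followed by `completeUpload(uploadId, …)`; on failure, `abortUpload(uploadId)` cleans up the parts already received.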

Example: Paginate Data

```typescript
const runs = await filefeed.pipelineRuns.list({ status: 'completed', limit: 50 });
for (const run of runs.data) {
  let offset = 0;
  let hasMore = true;
  while (hasMore) {
    const page = await filefeed.pipelineRuns.getData({ pipelineRunId: run.id, limit: 1000, offset });
    // process page.data
    hasMore = page.data.length === 1000; // a full page means more rows may remain
    offset += page.data.length;
  }
}
```
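Example: Retry Transient Failures

The Handling Errors page covers structured errors in depth; as a generic starting point, a small exponential-backoff wrapper can retry any SDK call. Everything here (the `withRetry` name, the attempt count, the delays, and retrying blindly rather than only on rate limits or 5xx responses) is an illustrative assumption, not SDK behavior:

```typescript
// Retry an async operation with exponential backoff.
// attempts and baseDelayMs are illustrative defaults, not SDK-mandated values.
async function withRetry<T>(
  fn: () => Promise<T>,
  { attempts = 3, baseDelayMs = 250 }: { attempts?: number; baseDelayMs?: number } = {},
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < attempts - 1) {
        // Wait 250ms, 500ms, 1s, ... between attempts.
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}

// Usage:
// const runs = await withRetry(() => filefeed.pipelineRuns.list({ status: 'completed' }));
```

In production you would typically inspect the error and retry only transient failures (rate limits, network errors), not validation errors.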