Data Pipeline Workflow

workflow · flow
By Community · v1.2.1 · Updated 2026-03-02

A robust ETL data pipeline workflow for Aulay Flow. Extract data from multiple sources (APIs, databases, files), transform with configurable steps, and load into your destination. Includes schema validation, error handling, retry logic, and pipeline monitoring dashboards.
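The retry behaviour described above can be sketched generically. This is an illustrative Python sketch of retry-with-backoff around an extract call, not the actual Aulay Flow implementation; `with_retries` and `flaky_extract` are hypothetical names.

```python
import time

def with_retries(fn, max_attempts=3, base_delay=1.0):
    """Call fn, retrying on failure with exponential backoff.

    Illustrative only; the real pipeline's retry logic may differ.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error to the caller
            # back off 1x, 2x, 4x, ... the base delay between attempts
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: a flaky extract that succeeds on the third call.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return ["row1", "row2"]

rows = with_retries(flaky_extract, max_attempts=3, base_delay=0.01)
```

The same wrapper pattern applies to any of the pipeline's source types; only the wrapped callable changes.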

Tags: etl, data, pipeline, monitoring
Installation

Copy this configuration into your project:

# data-pipeline.flow.yaml
name: data-pipeline
trigger:
  cron: "0 2 * * *"               # run daily at 02:00
steps:
  - extract:
      sources:
        - type: api
          url: "$DATA_SOURCE_URL" # supplied via environment variable
        - type: postgres
          connection: "$DB_URL"
  - transform:
      validate_schema: true       # check records against the expected schema
      steps: [deduplicate, normalize, enrich]
  - load:
      destination: "$WAREHOUSE_URL"
      mode: upsert                # insert new rows, update existing ones
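The named transform steps can be sketched in plain Python to show what they typically do to a batch of records. These are illustrative implementations of deduplicate and normalize, not Aulay Flow's built-ins; the function names mirror the config values but everything else is assumption.

```python
def deduplicate(rows):
    # Drop exact-duplicate records while preserving input order.
    seen, out = set(), []
    for row in rows:
        key = tuple(sorted(row.items()))
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out

def normalize(rows):
    # Strip whitespace and lowercase every string field.
    return [
        {k: v.strip().lower() if isinstance(v, str) else v
         for k, v in row.items()}
        for row in rows
    ]

raw = [
    {"email": "  A@Example.com "},
    {"email": "  A@Example.com "},
    {"email": "b@example.com"},
]
clean = normalize(deduplicate(raw))
# clean -> [{"email": "a@example.com"}, {"email": "b@example.com"}]
```

Note the ordering matters: deduplicating before normalizing only removes byte-identical duplicates, so a real pipeline may normalize first to also catch duplicates that differ only in casing or whitespace.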

Details

Type: workflow
Platforms: flow
Version: 1.2.1
Author: Community
Install method: copy config