Stream your Postgres data anywhere in real-time. Simple Rust building blocks for change data capture (CDC) pipelines.
ETL by Supabase

Build real-time Postgres replication applications in Rust
Documentation · Examples · Issues

ETL is a Rust framework by Supabase for building high‑performance, real‑time data replication apps on Postgres. It sits on top of Postgres logical replication and gives you a clean, Rust‑native API for streaming changes to your own destinations.
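To make the model concrete: a CDC pipeline consumes a stream of row-level change events decoded from Postgres's write-ahead log and delivers them to a destination. The sketch below uses simplified, hypothetical event types purely for illustration; ETL's actual event and destination types differ.

```rust
// Hypothetical row-level change events, loosely modeled on what a
// logical-replication decoder emits. Not ETL's actual API.
#[derive(Debug, Clone)]
enum ChangeEvent {
    Insert { table: String, row: Vec<(String, String)> },
    Update { table: String, row: Vec<(String, String)> },
    Delete { table: String, key: Vec<(String, String)> },
}

// A destination is, at its core, a sink for batches of such events.
// Here we just print and count them.
fn apply(events: &[ChangeEvent]) -> usize {
    let mut applied = 0;
    for event in events {
        match event {
            ChangeEvent::Insert { table, .. } => println!("INSERT into {table}"),
            ChangeEvent::Update { table, .. } => println!("UPDATE in {table}"),
            ChangeEvent::Delete { table, .. } => println!("DELETE from {table}"),
        }
        applied += 1;
    }
    applied
}

fn main() {
    let events = vec![
        ChangeEvent::Insert {
            table: "users".into(),
            row: vec![("id".into(), "1".into()), ("name".into(), "ada".into())],
        },
        ChangeEvent::Delete {
            table: "users".into(),
            key: vec![("id".into(), "1".into())],
        },
    ];
    assert_eq!(apply(&events), 2);
}
```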

Highlights

  • 🚀 Real‑time replication: stream changes as they happen
  • ⚡ High performance: batching and parallel workers
  • 🛡️ Fault tolerant: retries and recovery built in
  • 🔧 Extensible: implement custom stores and destinations
  • 🧭 Typed, ergonomic Rust API

Get Started

Install via Git while we prepare for a crates.io release:

[dependencies]
etl = { git = "https://github.com/supabase/etl" }

Quick example using the in‑memory destination:

use etl::{
    config::{BatchConfig, PgConnectionConfig, PipelineConfig, TlsConfig},
    destination::memory::MemoryDestination,
    pipeline::Pipeline,
    store::both::memory::MemoryStore,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let pg = PgConnectionConfig {
        host: "localhost".into(),
        port: 5432,
        name: "mydb".into(),
        username: "postgres".into(),
        password: Some("password".into()),
        tls: TlsConfig { enabled: false, trusted_root_certs: String::new() },
    };

    let store = MemoryStore::new();
    let destination = MemoryDestination::new();

    let config = PipelineConfig {
        id: 1,
        publication_name: "my_publication".into(),
        pg_connection: pg,
        batch: BatchConfig { max_size: 1000, max_fill_ms: 5000 },
        table_error_retry_delay_ms: 10_000,
        max_table_sync_workers: 4,
    };

    let mut pipeline = Pipeline::new(config, store, destination);
    pipeline.start().await?;
    // pipeline.wait().await?; // Optional: block until completion

    Ok(())
}
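The `batch` settings in the config above bound how changes are grouped before being written: a batch is flushed once it holds `max_size` events or once `max_fill_ms` milliseconds have elapsed, whichever comes first. A minimal sketch of that flush policy (illustrative only, not ETL's internals):

```rust
use std::time::{Duration, Instant};

// Illustrative batcher: flush when the batch is full or its deadline passes.
struct Batcher<T> {
    max_size: usize,
    max_fill: Duration,
    items: Vec<T>,
    started: Option<Instant>,
}

impl<T> Batcher<T> {
    fn new(max_size: usize, max_fill_ms: u64) -> Self {
        Self {
            max_size,
            max_fill: Duration::from_millis(max_fill_ms),
            items: Vec::new(),
            started: None,
        }
    }

    // Push an item; returns a full batch once the size limit is hit.
    fn push(&mut self, item: T) -> Option<Vec<T>> {
        if self.items.is_empty() {
            self.started = Some(Instant::now());
        }
        self.items.push(item);
        if self.items.len() >= self.max_size { self.flush() } else { None }
    }

    // Called periodically: flushes a non-empty batch whose deadline passed.
    fn tick(&mut self) -> Option<Vec<T>> {
        match self.started {
            Some(t) if t.elapsed() >= self.max_fill => self.flush(),
            _ => None,
        }
    }

    fn flush(&mut self) -> Option<Vec<T>> {
        if self.items.is_empty() {
            return None;
        }
        self.started = None;
        Some(std::mem::take(&mut self.items))
    }
}

fn main() {
    let mut b = Batcher::new(3, 5000);
    assert!(b.push(1).is_none());
    assert!(b.push(2).is_none());
    // The third item reaches max_size, so a batch is returned.
    assert_eq!(b.push(3), Some(vec![1, 2, 3]));
}
```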

For tutorials and deeper guidance, see the Documentation or jump into the examples.

Destinations

ETL is designed to be extensible: you can implement your own destinations to send data anywhere you like. It also ships with built-in destinations:

  • BigQuery

Out-of-the-box destinations are available in the etl-destinations crate:

[dependencies]
etl = { git = "https://github.com/supabase/etl" }
etl-destinations = { git = "https://github.com/supabase/etl", features = ["bigquery"] }
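To show the shape of a custom destination, the sketch below defines its own simplified, synchronous trait and implements it with an in-memory counter. This is a hypothetical stand-in: ETL's real destination trait is defined in the crate, is async, and has a different signature.

```rust
use std::collections::HashMap;

// Simplified stand-in for a destination trait (not ETL's actual trait).
trait SimpleDestination {
    fn write_rows(&mut self, table: &str, rows: Vec<String>) -> Result<(), String>;
}

// Example destination: counts the rows written per table, in memory.
#[derive(Default)]
struct CountingDestination {
    counts: HashMap<String, usize>,
}

impl SimpleDestination for CountingDestination {
    fn write_rows(&mut self, table: &str, rows: Vec<String>) -> Result<(), String> {
        *self.counts.entry(table.to_string()).or_insert(0) += rows.len();
        Ok(())
    }
}

fn main() {
    let mut dest = CountingDestination::default();
    dest.write_rows("users", vec!["ada".into(), "grace".into()]).unwrap();
    dest.write_rows("users", vec!["alan".into()]).unwrap();
    assert_eq!(dest.counts["users"], 3);
}
```

The same pattern applies to real sinks: replace the counter with a client for your warehouse, queue, or API, and handle per-batch errors in `write_rows`.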

License

Apache‑2.0. See LICENSE for details.


Made with ❤️ by the Supabase team