Ingestr is a command-line application that allows you to ingest data from any source into any destination using simple command-line flags, no code necessary.
- ✨ copy data from your database into any destination
- ➕ incremental loading: `append`, `merge`, or `delete+insert` (see the example after the quickstart below)
- 🐍 single-command installation
ingestr takes away the complexity of managing a backend or writing code to ingest data: simply run the command and watch the data land in its destination.
```bash
pip install ingestr
```
```bash
ingestr ingest \
    --source-uri 'postgresql://admin:admin@localhost:8837/web?sslmode=disable' \
    --source-table 'public.some_data' \
    --dest-uri 'bigquery://<your-project-name>?credentials_path=/path/to/service/account.json' \
    --dest-table 'ingestr.some_data'
```
That's it.
This command will:
- get the table `public.some_data` from the Postgres instance
- upload this data to your BigQuery warehouse under the schema `ingestr` and the table `some_data`
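Incremental loading builds on the same command. The sketch below is an illustration, not a verbatim recipe: it assumes ingestr's `--incremental-strategy` and `--incremental-key` flags and a hypothetical `updated_at` column in the source table; run `ingestr ingest --help` to confirm the exact flags your version supports.

```bash
# Hypothetical sketch: append only rows whose updated_at is newer than the last run.
# Flag names and the updated_at column are assumptions; verify with `ingestr ingest --help`.
ingestr ingest \
    --source-uri 'postgresql://admin:admin@localhost:8837/web?sslmode=disable' \
    --source-table 'public.some_data' \
    --dest-uri 'bigquery://<your-project-name>?credentials_path=/path/to/service/account.json' \
    --dest-table 'ingestr.some_data' \
    --incremental-strategy append \
    --incremental-key updated_at
```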
You can see the full documentation here.
Join our Slack community here.
| | Source | Destination |
|---|---|---|
| **Databases** | | |
| Postgres | ✅ | ✅ |
| BigQuery | ✅ | ✅ |
| Snowflake | ✅ | ✅ |
| Redshift | ✅ | ✅ |
| Databricks | ✅ | ✅ |
| DuckDB | ✅ | ✅ |
| Microsoft SQL Server | ✅ | ✅ |
| Local CSV file | ✅ | ✅ |
| MongoDB | ✅ | ❌ |
| Oracle | ✅ | ❌ |
| SAP Hana | ✅ | ❌ |
| SQLite | ✅ | ❌ |
| MySQL | ✅ | ❌ |
| **Platforms** | | |
| Google Sheets | ✅ | ❌ |
| Notion | ✅ | ❌ |
| Shopify | ✅ | ❌ |
More to come soon!
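As an illustration of the matrix above, switching destinations only means changing the destination URI. The `duckdb:///` URI form below follows the common SQLAlchemy-style convention and is an assumption; consult the documentation for the exact connection strings each source and destination expects.

```bash
# Hypothetical sketch: same Postgres source, a local DuckDB file as the destination.
# The duckdb:///local.db URI form is an assumption; check the docs for exact connection strings.
ingestr ingest \
    --source-uri 'postgresql://admin:admin@localhost:8837/web?sslmode=disable' \
    --source-table 'public.some_data' \
    --dest-uri 'duckdb:///local.db' \
    --dest-table 'ingestr.some_data'
```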
This project would not have been possible without the amazing work done by the SQLAlchemy and dlt teams. We relied on their work to connect to various sources and destinations, and built ingestr as a simple, opinionated wrapper around them.