capitalone/locopy
locopy: Loading/Unloading to Redshift and Snowflake using Python.
This tool helps data professionals move data between local systems, Amazon S3, and data warehouses such as Amazon Redshift and Snowflake. You can take data from a local file or a database query and load it into a warehouse, or conversely, extract data from a warehouse into a local file. It's designed for data engineers, analysts, and anyone responsible for ETL processes.
115 stars and 3,492 monthly downloads. Used by 1 other package. Available on PyPI.
Use this if you need to reliably transfer large datasets between your Python environment and Amazon Redshift or Snowflake, leveraging S3 for efficient staging.
Not ideal if you are looking for a visual, drag-and-drop ETL tool or if your primary data sources are not Redshift, Snowflake, or S3.
Stars
115
Forks
49
Language
Python
License
Apache-2.0
Category
Data Engineering
Last pushed
Mar 28, 2026
Monthly downloads
3,492
Commits (30d)
0
Dependencies
6
Reverse dependents
1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/data-engineering/capitalone/locopy"
Open to everyone: 100 requests/day with no key. Get a free key for 1,000 requests/day.
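The same endpoint can be called from Python. This is a hypothetical sketch using only the standard library; the URL path segments follow the curl example above, and error handling is omitted.

```python
# Sketch: fetch repo quality data as JSON from the endpoint shown above.
import json
import urllib.request


def fetch_quality(owner: str, repo: str, category: str = "data-engineering") -> dict:
    """Return the parsed JSON quality record for owner/repo."""
    url = f"https://pt-edge.onrender.com/api/v1/quality/{category}/{owner}/{repo}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

Note the unauthenticated limit of 100 requests/day stated above; cache responses rather than polling.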
Related tools
PrefectHQ/prefect
Prefect is a workflow orchestration framework for building resilient data pipelines in Python.
growthbook/growthbook
Open Source Feature Flags, Experimentation, and Product Analytics
koopjs/koop
Transform, query, and download geospatial data on the web.
pathwaycom/pathway
Python ETL framework for stream processing, real-time analytics, LLM pipelines, and RAG.
dagster-io/dagster
An orchestration platform for the development, production, and observation of data assets.