A data pipeline is a series of processes (or tasks) that moves data from a source to a destination, often transforming it along the way.
Pipelines usually run at regular intervals - hourly, daily, monthly, quarterly, etc.
If you need something like this, I have experience with building batch data pipelines using:
- Apache Airflow
- Python Celery
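To make the extract-transform-load idea concrete, here is a minimal sketch of a batch pipeline in plain Python. The function names and sample data are hypothetical illustrations, not the Airflow or Celery APIs; in practice a scheduler like Airflow would invoke each step as a task on a cadence.

```python
# A minimal batch ETL sketch. The data and function names are
# hypothetical; a real pipeline would read from and write to
# external systems (databases, APIs, object storage).

def extract():
    # In a real pipeline this would pull rows from a source system.
    return [{"user": "alice", "spend": "12.50"},
            {"user": "bob", "spend": "7.25"}]

def transform(rows):
    # Clean and reshape: parse the spend field into a float.
    return [{**row, "spend": float(row["spend"])} for row in rows]

def load(rows):
    # In a real pipeline this would write to a warehouse table;
    # here we just build an in-memory mapping as the "destination".
    return {row["user"]: row["spend"] for row in rows}

def run_pipeline():
    # A scheduler (e.g. an Airflow DAG or a Celery periodic task)
    # would trigger this on an hourly/daily/etc. schedule.
    return load(transform(extract()))

if __name__ == "__main__":
    print(run_pipeline())
```

The same three-step structure maps directly onto Airflow operators or Celery task signatures, with the scheduler handling the interval and retries.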