- Design, develop, and maintain data pipelines, ETL processes, and data warehouses.
- Ensure data quality, integrity, and reliability throughout the data pipeline.
- Work with GCP services such as Google Cloud Storage, BigQuery, Dataflow, and Pub/Sub to design and implement scalable, efficient data processing solutions.
- Monitor and troubleshoot data pipeline issues.
- Implement security and access controls to protect sensitive data and ensure compliance with data privacy regulations.
- Document data processes, data flows, and architectural decisions.
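To make the data-quality responsibility above concrete, here is a minimal, self-contained sketch of a record-level validation step that could run inside a pipeline before loading to a warehouse. The schema, field names (`user_id`, `amount`, `country`), and rules are illustrative assumptions, not requirements from this posting.

```python
# Hypothetical schema and quality rules for illustration only;
# real pipelines would derive these from the warehouse schema.
REQUIRED_FIELDS = ("user_id", "amount", "country")

def validate_record(record: dict) -> list[str]:
    """Return a list of quality-rule violations for one record."""
    errors = []
    for field in REQUIRED_FIELDS:
        # Treat None and empty string as missing values.
        if record.get(field) in (None, ""):
            errors.append(f"missing:{field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative:amount")
    return errors

def partition_records(records):
    """Split records into (valid, rejected) for downstream loading.

    Rejected entries keep their violation list so they can be routed
    to a dead-letter table for review rather than silently dropped.
    """
    valid, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            valid.append(rec)
    return valid, rejected

rows = [
    {"user_id": "u1", "amount": 9.5, "country": "DE"},
    {"user_id": "", "amount": -2.0, "country": "US"},
]
good, bad = partition_records(rows)
print(len(good), len(bad))  # → 1 1
```

In a Dataflow/Beam pipeline, the same split is typically expressed with tagged outputs (a main `PCollection` and a dead-letter side output), but the validation logic itself stays this simple.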