Overview
The Senior Data Engineer will play a critical role in strengthening the data function of a high-growth start-up by developing and maintaining data pipelines and ensuring data quality. The position requires collaboration with cross-functional teams to optimize data storage and processing strategies using modern data technologies such as Python, SQL, and AWS or GCP. The contractor will work on-site one day a week and remotely for the remainder of the week.
Responsibilities
- Develop and maintain ETL pipelines for data ingestion and processing.
- Ensure data quality and integrity across data platforms.
- Collaborate with stakeholders to understand data requirements and deliver insights.
- Implement data warehouse models using Snowflake and Redshift.
- Utilize Airflow for workflow orchestration and scheduling (an illustrative sketch follows this list).
- Monitor and optimize performance of data systems.
- Support event-stream processing and micro-batching initiatives.
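To illustrate the kind of pipeline work involved, the following is a minimal, hypothetical Airflow DAG sketch using the TaskFlow API. The DAG name, schedule, and data are placeholders, not artifacts of the actual project.

```python
# Hypothetical daily ETL sketch; names, schedule, and records are illustrative only.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_ingest_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (placeholder data here).
        return [{"id": 1, "value": 42}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Apply a basic data-quality check before loading.
        return [r for r in records if r.get("value") is not None]

    @task
    def load(records: list[dict]) -> None:
        # In a real pipeline this would write to the warehouse (e.g. Snowflake or Redshift).
        print(f"Loaded {len(records)} records")

    load(transform(extract()))

example_ingest_pipeline()
```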
Requirements
- Proven experience as a Data Engineer with expertise in Python and SQL.
- Strong knowledge of ETL processes and data pipeline construction.
- Experience with Airflow, BigQuery, Redshift, and Snowflake.
- Familiarity with AWS or GCP cloud services.
- Demonstrated experience in Data Quality Assurance practices.
- Understanding of data warehouse modeling concepts.
- Ability to work in a hybrid environment with on-site presence as required.