Overview
The Senior Data Engineer will play a key role in building out the Data function at a high-growth start-up, with a focus on optimizing data processing and delivery. The contractor will collaborate closely with cross-functional teams to develop and manage data pipelines, ensuring high-quality data is available for decision-making. The role centers on a modern data stack (Python, SQL, Airflow, Redshift/Snowflake) and requires a strong grounding in data engineering principles.
Responsibilities
- Develop and implement ETL pipelines using Python and SQL (a minimal pipeline sketch follows this list).
- Manage data processing workflows with Apache Airflow.
- Maintain and enhance the Data Warehouse architecture in Redshift and Snowflake.
- Ensure data quality and compliance through rigorous data quality assurance practices.
- Collaborate with data analysts and business stakeholders to understand data requirements.
- Support the transition to cloud platforms such as AWS or GCP.
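For illustration only, here is a minimal sketch of the kind of Airflow-managed ETL pipeline this role covers, assuming Airflow 2.4+ with the TaskFlow API; the DAG name, task logic, and table name are hypothetical placeholders, not an actual pipeline used by the team.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    """Hypothetical daily ETL: extract, validate, and load order records."""

    @task
    def extract() -> list[dict]:
        # In practice this would read from a source system or object store.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop records that fail a basic validity rule.
        return [r for r in rows if r["amount"] >= 0]

    @task
    def load(rows: list[dict]) -> None:
        # A real load would use a Redshift/Snowflake hook or a COPY/MERGE statement.
        print(f"Loading {len(rows)} rows into analytics.orders")

    load(transform(extract()))


orders_etl()
```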
Requirements
- Minimum of 5 years' experience in data engineering or a related field.
- Proficiency in Python and SQL for data manipulation and querying.
- Experience with Apache Airflow for workflow management.
- Strong knowledge of Redshift and Snowflake for data warehousing solutions.
- Familiarity with data modeling and event-streaming technologies.
- Demonstrated understanding of data quality assurance processes (see the sketch after this list).
- Experience working with cloud services (AWS or GCP) is preferred.
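As a further illustration of the data quality assurance work referenced above, the following sketch shows simple warehouse checks written against a generic DB-API connection (e.g. one opened with snowflake-connector-python or redshift_connector); the table and column names are hypothetical.

```python
def check_no_null_keys(conn, table: str, key_column: str) -> None:
    """Fail the load if the key column contains NULLs."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
        (null_count,) = cur.fetchone()
    finally:
        cur.close()
    if null_count:
        raise ValueError(f"{table}.{key_column} contains {null_count} NULL values")


def check_minimum_rows(conn, table: str, minimum: int) -> None:
    """Guard against an empty or truncated load."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        (row_count,) = cur.fetchone()
    finally:
        cur.close()
    if row_count < minimum:
        raise ValueError(f"{table} has {row_count} rows, expected at least {minimum}")


# Example usage (hypothetical table and key):
# check_no_null_keys(conn, "analytics.orders", "order_id")
# check_minimum_rows(conn, "analytics.orders", minimum=1)
```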