Overview
The Senior Data Engineer will play a crucial role in designing, scaling, and maintaining data pipelines and Lakehouse architectures. The position involves close collaboration with internal stakeholders and external partners to optimize existing systems, with a focus on enhancing fan engagement through digital platforms.
Responsibilities
- Design and develop ETL/ELT pipelines in Azure and Databricks, ensuring reliability and performance.
- Construct Kimball-style dimensional models to support analytics and reporting.
- Implement automated testing for data quality assurance and validation.
- Ensure compliance with data governance, legal, and regulatory standards.
- Collaborate with the wider Data team to optimize pipelines and enhance platform capabilities.
Requirements
- Hands-on expertise with Databricks, PySpark, and Delta Lake.
- Proven ability to build production-grade ETL/ELT pipelines, including integration with SFTP and REST APIs.
- Strong knowledge of Kimball dimensional modeling methodology as applied within Lakehouse architectures.
- Advanced proficiency in Azure data services (ADF, ADLS Gen2, Event Hubs) and SQL.
- Excellent analytical and troubleshooting skills for data transformation and quality assurance.
- Strong communication skills to translate technical concepts for stakeholders and collaborate across teams.