Overview
We are seeking a Senior Data Streaming Engineer for a 6-month remote contract with a dynamic e-commerce company transitioning to a streaming-first data architecture. The contractor will shape the technical strategy and deliver solutions for a modern, cloud-native data platform, working with high-scale systems to produce real-time insights across the business.
Responsibilities
- Design and implement Kafka pipelines and Databricks orchestration for real-time data processing.
- Define and execute the technical strategy to develop a cloud-native, streaming data platform.
- Collaborate with stakeholders to create actionable roadmaps and align technical solutions with business goals.
- Provide expertise in streaming tools and technologies such as Kafka, Flink, and Spark.
- Develop and optimize ETL/ELT pipelines, following data governance best practices.
- Engage in hands-on engineering tasks, including coding and system troubleshooting.
- Utilize Python, Scala, or Java for building scalable data solutions.
Requirements
- Proven experience as a Senior Data Engineer with a focus on real-time, distributed systems.
- In-depth knowledge of streaming tools like Kafka, Flink, and Spark.
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Solid understanding of the differences between streaming and batch processing.
- Familiarity with data ecosystem tools like Airflow, Dagster, and dbt.
- Strong programming skills in Python, Scala, or Java.
- Background in developing ETL/ELT pipelines and data warehousing best practices.
- Experience handling privacy-sensitive data and implementing data governance frameworks.