Senior Data Engineer (m/f)
What your first 6 months might look like:
1 month:
Get to know the data landscape and introduce yourself to various data teams you will be interacting with
Pick up and resolve your first change request
Participate in code reviews, making sure that our code is well designed, tested, robust, secure, compliant, performant, and readable
3 months:
Implement a new service or feature in our codebase
Act as a subject matter expert, supporting other domain teams' data engineering efforts
6 months:
Introduce an important architectural improvement to our systems
Seek out opportunities to simplify and streamline data management systems and processes
Your profile:
Working knowledge of building robust, fault-tolerant data ingestion and processing pipelines using technologies like Apache Kafka, Apache Beam, or similar
Strong programming skills in Python, Scala, or Java, with experience building scalable, low-latency data processing applications
Strong knowledge of and hands-on experience with SQL
Solid software engineering fundamentals: Git version control, testing, debugging, research, technical problem solving, and continuous learning
You have experience with data warehousing, data modeling and ELT concepts
You have experience working with data infrastructure, storage, APIs, data pipelines, observability and workflow orchestration in distributed cloud environments
You are proactive and communicative: you are comfortable managing multiple concurrent tasks and have the capacity and willingness to fully own their lifecycle with minimal oversight
You work in an organized, agile manner, focusing on reproducibility and scalability in a dynamic business context
You are enthusiastic about innovating, excited to continuously learn and comfortable with ambiguity
You are curious and like to keep up with existing and new technology trends in the data space
Bonus:
Experience with modern data tools (our data stack is based on GCP and we use BigQuery, Airflow, dbt, Looker & Hex)
You have experience working with containerization and orchestration technology (Docker, Kubernetes) and infrastructure-as-code frameworks (Terraform, Helm)
You have been actively involved in designing, building, and maintaining scalable, high-performance data pipelines to power real-time customer data analytics and insights