Middle Data Engineer
We are seeking a skilled remote Middle Data Engineer to join our team and work on the migration of Data Products pipelines from Oracle workloads to Databricks.
The successful candidate will be responsible for developing, monitoring, and operating the most critical curated data pipeline, which processes hundreds of millions of records to provide high-quality datasets for analytical and machine learning use cases.
The candidate will consult with analysts, data scientists, and product managers to build and continuously improve the "Single Source of Truth" KPIs used for business steering, redevelop legacy pipelines into modern, standardized versions, and leverage and improve a cloud-based tech stack that includes AWS, Databricks, Kubernetes, Spark, Airflow, Python, and Scala.
If you are a skilled Data Engineer who can develop and operate critical data pipelines, continuously improve business KPIs, and leverage cloud-based tech stacks, we would love to hear from you.
responsibilities
- Develop, monitor, and operate the most critical curated data pipeline
- Work with analysts, data scientists, and product managers to build and continuously improve the "Single Source of Truth" KPI for business steering
- Redevelop legacy pipelines into modern, standardized versions that are easy to maintain and scale for future demands
- Leverage and improve a cloud-based tech stack that includes AWS, Databricks, Kubernetes, Spark, Airflow, Python, and Scala
- Collaborate with a team of three engineers, two analysts, and a product manager to ensure successful project delivery
- Communicate and present solutions to stakeholders in a clear and concise manner
requirements
- 2+ years of relevant experience as a Data Software Engineer or Big Data Engineer
- Expertise in Apache Spark, including Spark Streaming
- Good hands-on experience with Databricks
- Fluency in the Scala programming language
- Good understanding and hands-on experience with CI/CD
- Solid working experience with GitHub
- Excellent communication skills and the ability to communicate fluently with English-speaking stakeholders
- Strong team player who communicates respectfully and shares responsibility for the team's overall success
- Strong organizational skills and ability to prioritize tasks
- B2+ English level
nice to have
- Delta Lake
- Expertise in SQL
- Fluency working with the AWS landscape
- Ability to build Apache Airflow pipelines
- Presto, Superset, Starburst, Oracle, and Exasol
benefits
- Paid time off
- Paid sick leave days
- Medical insurance
- Stable income
- 100% remote work forever
- Free licensed software
- Option to work on your own device (BYOD)
- Stable workload
- Relocation opportunities
- Flexible engagement models
- Free training for technical and soft skills
- Free access to LinkedIn Learning platform
- Language courses
- Free access to internal and external e-Libraries
- Certification opportunities
- Skill advisory service
Find a vacancy that works for you. Send us your CV to receive a personalized offer.