
senior data engineer

Data Software Engineering, Python, Amazon Web Services, Databricks, Apache Spark, CI/CD, SQL, Terraform

Join our Corporate Data Engineering team as a Senior Data Engineer and contribute to the creation of cutting-edge data solutions and applications that drive essential business decisions across the organization. We are seeking a forward-thinking, detail-oriented individual who thrives on building scalable systems.

responsibilities

  • Collaborate with cross-functional teams to design, develop, and implement data solutions and applications that align with business needs
  • Build and maintain scalable data pipelines for efficient data collection, processing, and storage
  • Perform data modeling and transformation to facilitate data analysis and reporting
  • Implement best practices for data quality, security, and compliance
  • Contribute to the enhancement of data engineering processes through continuous improvement and adoption of CI/CD practices
  • Collaborate within an Agile environment, participating in sprint planning, daily stand-ups, and other Agile ceremonies
requirements
  • A minimum of 3 years of relevant experience in Data Engineering or a similar role, demonstrating your proficiency in developing and optimizing data solutions
  • Proficiency in Python, leveraging its capabilities to construct robust and scalable data pipelines
  • Strong familiarity with AWS, using its resources for efficient data processing and storage
  • Expertise in Databricks, contributing to efficient data processing and analysis
  • Proficiency in either Apache Spark or Hive, enabling you to work with large-scale data processing and analysis
  • Knowledge of CI/CD practices, promoting efficient and reliable software development processes
  • Proficiency in SQL, enabling you to query and manipulate data effectively
  • Experience with Terraform, enhancing your ability to manage and automate infrastructure
  • Strong understanding of Agile methodologies, enabling you to work collaboratively within dynamic teams
  • English communication skills at a B2+ level, enabling effective collaboration with distributed teams
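To illustrate the kind of pipeline work this role involves, here is a minimal, hypothetical sketch of a collect-transform-store batch job using only the Python standard library. It is not part of the role description; all table and field names are invented, and a production pipeline would use tools like Spark or Databricks instead of SQLite:

```python
import csv
import io
import sqlite3

def run_pipeline(raw_csv: str, db: sqlite3.Connection) -> int:
    """Toy batch pipeline: parse CSV, drop malformed records, load into SQL.

    The try/except filter stands in for a real data-quality step;
    the SQLite table stands in for a warehouse target.
    """
    db.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        try:
            rows.append((int(rec["id"]), float(rec["amount"])))
        except (KeyError, ValueError):
            continue  # skip records that fail type validation
    db.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    db.commit()
    return len(rows)

raw = "id,amount\n1,10.5\n2,oops\n3,7.0\n"
conn = sqlite3.connect(":memory:")
loaded = run_pipeline(raw, conn)                      # 2 valid rows
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

The same shape — ingest, validate, load, then query with SQL — scales up to the Spark- and Databricks-based pipelines described above.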
nice to have
  • Familiarity with containerization and orchestration tools such as Docker and Kubernetes
  • Knowledge of streaming data processing technologies like Apache Kafka or Apache Flink
  • Experience with data warehousing solutions like Amazon Redshift or Snowflake
  • Understanding of machine learning concepts and their integration into data engineering pipelines
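As a taste of the streaming concepts mentioned above, here is a hypothetical, in-memory sketch of a tumbling-window aggregation — the fixed, non-overlapping windowing that stream processors such as Apache Flink provide natively. The event data and key names are invented for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count (timestamp, key) events per fixed, non-overlapping window.

    Each event is assigned to the window containing its timestamp;
    windows start at multiples of `window_seconds`.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # bucket the timestamp
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
result = tumbling_window_counts(events, window_seconds=60)
```

A real Kafka/Flink deployment handles the hard parts this sketch ignores: out-of-order events, watermarks, and state that outlives a single process.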

benefits

For you
  • Insurance Coverage 
  • Paid Leaves – including maternity, bereavement, paternity, and special COVID-19 leaves. 
  • Financial assistance for medical crisis 
  • Retiral Benefits – VPF and NPS 
  • Customized Mindfulness and Wellness programs 
  • EPAM Hobby Clubs
For your comfortable work
  • Hybrid Work Model 
  • Soft loans to set up workspace at home 
  • Stable workload 
  • Relocation opportunities with ‘EPAM without Borders’ program

For your growth
  • Certification trainings for technical and soft skills 
  • Access to unlimited LinkedIn Learning platform 
  • Access to internal learning programs set up by world class trainers 
  • Community networking and idea creation platforms 
  • Mentorship programs 
  • Self-driven career progression tool
