
Senior Data Software Engineer

Data Software Engineering, Amazon Web Services, Apache Airflow, Apache Spark, CI/CD, Python, SQL
Sorry, this job is no longer available.

We are seeking a highly skilled Senior Data Software Engineer to join our team and provide critical support to our remote Data Science teams.

As a Senior Data Software Engineer, you will be responsible for building datamarts and providing ad-hoc support in a fast-paced, dynamic environment. You will work closely with Data Scientists and other stakeholders to understand their needs and develop solutions that meet their requirements.

responsibilities
  • Build datamarts and data pipelines to support the Data Science teams
  • Provide ad-hoc support to Data Scientists and other stakeholders, ensuring the seamless operation of data pipelines and processes
  • Collaborate with cross-functional teams to understand their needs and develop solutions that meet their requirements
  • Design, develop, and maintain efficient and scalable ETL processes
  • Optimize complex SQL queries and database operations
  • Ensure the implementation of CI/CD pipelines for data engineering tasks
  • Develop and maintain REST APIs for data processing and consumption
  • Collaborate with stakeholders to define project requirements and timelines
  • Provide technical guidance and mentorship to junior team members
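To make the first responsibilities above concrete, here is a minimal sketch of building a datamart table from raw events. It uses Python's stdlib sqlite3 purely as a stand-in for a real warehouse (in this role that would more likely be an AWS-hosted store such as Redshift), and all table and column names are illustrative assumptions:

```python
import sqlite3

# Stand-in for a warehouse connection; a real pipeline would target
# Redshift, Databricks, or another AWS-hosted store instead of SQLite.
conn = sqlite3.connect(":memory:")

# Extract: raw order events as they might land from an upstream source.
raw_orders = [
    ("2024-01-01", "widget", 3, 9.99),
    ("2024-01-01", "gadget", 1, 24.50),
    ("2024-01-02", "widget", 2, 9.99),
]
conn.execute(
    "CREATE TABLE raw_orders (order_date TEXT, product TEXT, qty INTEGER, unit_price REAL)"
)
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?, ?)", raw_orders)

# Transform + Load: aggregate into a datamart table that Data Scientists
# can query directly, instead of re-deriving revenue from raw events.
conn.execute(
    """
    CREATE TABLE daily_revenue AS
    SELECT order_date, SUM(qty * unit_price) AS revenue
    FROM raw_orders
    GROUP BY order_date
    """
)

for row in conn.execute(
    "SELECT order_date, revenue FROM daily_revenue ORDER BY order_date"
):
    print(row)
```

In production the same extract/transform/load steps would typically run as tasks in an Apache Airflow DAG, with Spark handling the transform at scale.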
requirements
  • Minimum of 4 years of experience as a Data Software Engineer, working with large datasets and complex data pipelines
  • Expertise in Amazon Web Services, specifically with services such as S3 and EC2
  • Advanced experience with Apache Airflow and Apache Spark for data processing and workflow orchestration
  • Proficient in the Python programming language, with experience writing efficient and scalable code
  • Strong understanding of SQL and relational databases, with experience in designing and optimizing complex queries
  • Experience with CI/CD pipelines and tools such as Jenkins or GitLab
  • Experience in PySpark and REST APIs for data engineering tasks
  • Strong understanding of ETL processes and data modeling
  • Excellent communication skills, with the ability to effectively collaborate with technical and non-technical stakeholders
  • Upper-intermediate English language proficiency, enabling clear communication and collaboration with the team and stakeholders
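The query-optimization requirement above can be illustrated with a small, self-contained sketch: the same query goes from a full table scan to an index seek once an index exists. SQLite (via Python's stdlib sqlite3) is used here only for illustration; the table, column, and index names are assumptions, and real optimization work on a warehouse would also involve distribution keys, partitioning, and statistics:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "click", f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Before indexing: the planner has no choice but a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)  # the detail column reports a SCAN of events

# An index on the filter column lets the engine seek straight to
# the matching rows instead of reading the whole table.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)  # the detail column now reports a SEARCH using the index
```

Comparing the two plans before and after a schema change is the basic feedback loop behind most relational query tuning.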
nice to have
  • Experience with Redshift and Databricks for data processing and analysis


benefits for locations

India
For you
  • Insurance Coverage 
  • Paid Leaves – including maternity, bereavement, paternity, and special COVID-19 leaves. 
  • Financial assistance for medical crisis 
  • Retiral Benefits – VPF and NPS 
  • Customized Mindfulness and Wellness programs 
  • EPAM Hobby Clubs
For your comfortable work
  • Hybrid Work Model 
  • Soft loans to set up workspace at home 
  • Stable workload 
  • Relocation opportunities with ‘EPAM without Borders’ program

For your growth
  • Certification trainings for technical and soft skills 
  • Access to unlimited LinkedIn Learning platform 
  • Access to internal learning programs set up by world class trainers 
  • Community networking and idea creation platforms 
  • Mentorship programs 
  • Self-driven career progression tool
