
senior data software engineer

Data Software Engineering, Databricks, Microsoft Azure, PySpark, Python, SQL

Join our remote team as a Senior Data Software Engineer at a leading tech firm. In this role, you will design, build, and optimize robust data pipelines that support our cutting-edge applications. The successful candidate will have deep expertise in at least one of Python, PySpark, or SQL, and will be able to build in development environments and take solutions through to production deployment. We are seeking a thoughtful and reliable engineer who can integrate solutions across systems and deliver end-to-end production solutions.
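As a rough illustration of the pipeline work described above, the sketch below shows a per-record data-quality gate in plain Python (the role's stack is PySpark and Databricks, but the same idea applies; the `Event` schema and validation rules here are hypothetical examples, not part of any actual codebase):

```python
from dataclasses import dataclass

# Hypothetical record schema -- illustrative only.
@dataclass
class Event:
    user_id: str
    amount: float

def validate(records):
    """Split records into (valid, rejected) using simple quality rules:
    a non-empty user_id and a non-negative amount."""
    valid, rejected = [], []
    for r in records:
        if r.user_id and r.amount >= 0:
            valid.append(r)
        else:
            rejected.append(r)
    return valid, rejected

events = [Event("u1", 9.99), Event("", 5.0), Event("u2", -1.0)]
good, bad = validate(events)
print(len(good), len(bad))  # → 1 2
```

In a Spark pipeline the same split would typically be expressed as two `filter` transformations over a DataFrame, with rejected rows routed to a quarantine table for review.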

responsibilities
  • Design and implement scalable data pipelines to support our cutting-edge applications
  • Ensure data quality and data accuracy across all stages of data processing
  • Collaborate with cross-functional teams to understand business requirements and develop solutions that meet their needs
  • Develop and maintain codebase in accordance with industry best practices and standards
  • Troubleshoot and resolve issues in a timely and effective manner
  • Optimize data processing algorithms and improve application performance
  • Ensure compliance with data security and data privacy regulations
  • Conduct code reviews and ensure high code quality and compliance with standards and guidelines
  • Participate in architectural and technical discussions to help shape the product roadmap
  • Stay up-to-date with emerging trends and technologies in data engineering and analytics
requirements
  • 3+ years of experience as a Data Software Engineer or in similar roles
  • Expertise in at least one of Python, PySpark, or SQL for building scalable, high-performance applications
  • Experience with Microsoft Azure for cloud-based infrastructure and application management
  • Experience using Databricks for building robust data pipelines
  • Experience using Azure DevOps, GitHub, or other version control systems
  • Familiarity with developing end-to-end production solutions
  • Ability to integrate solutions across systems
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with GCP and AWS cloud platforms
  • Experience with Apache Kafka and Apache Beam for building data pipelines
  • Experience with machine learning and data science tools and frameworks

benefits

For you
  • Discounts on health insurance, sport clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-workings
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training in technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service