
Senior Data Software Engineer

skills
Data Software Engineering, Databricks, Microsoft Azure, PySpark, Python, SQL

Join our remote team as a Senior Data Software Engineer to work on exciting projects involving Databricks workflows, APIs, and analytical development. You will be an integral part of our team and must have expertise in building and deploying production solutions using Azure, Python, Spark, PySpark, and SQL. We are looking for a thoughtful, knowledgeable candidate who can build robust data pipelines and integrate solutions across systems.

responsibilities
  • Collaborate with stakeholders to design, develop, and deploy data pipelines for various use cases
  • Develop and maintain robust ETL processes to extract, transform, and load data from various sources
  • Design and develop data models to support reporting and analytics use cases
  • Ensure data quality and reliability by implementing data validation and error handling mechanisms
  • Optimize data pipelines for performance and scalability
  • Troubleshoot and resolve data pipeline issues in a timely manner
  • Develop and maintain technical documentation for data pipelines and data processes
  • Contribute to the development of data engineering best practices and standards
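The ETL, validation, and error-handling responsibilities above can be sketched as follows. This is a minimal, stdlib-only illustration of the validate-and-quarantine pattern; in a real project this logic would typically run as PySpark transformations on Databricks, and the record schema and field names here are illustrative assumptions, not part of the role description.

```python
# Minimal extract-transform-load sketch with validation and error handling:
# bad records are quarantined with their error instead of failing the job.

def extract():
    # Stand-in for reading from a source system (database, API, files).
    return [
        {"id": "1", "amount": "19.99"},
        {"id": "2", "amount": "not-a-number"},  # invalid record
        {"id": "3", "amount": "5.00"},
    ]

def transform(rows):
    """Validate and convert rows; collect failures rather than crashing."""
    good, quarantined = [], []
    for row in rows:
        try:
            good.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError) as exc:
            quarantined.append({"row": row, "error": str(exc)})
    return good, quarantined

def load(rows, target):
    # Stand-in for writing to a warehouse table (e.g. a Delta Lake table).
    target.extend(rows)

target_table = []
valid, bad = transform(extract())
load(valid, target_table)
```

Quarantining failed rows (rather than dropping them silently or aborting the pipeline) keeps the load reliable while preserving the evidence needed to troubleshoot data-quality issues later.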
requirements
  • At least 3 years of relevant experience in data software engineering and development
  • Proven track record of building end-to-end production solutions
  • Expertise in at least one of Python, PySpark, Spark, or SQL, with the ability both to build solutions in development and to deploy them to production
  • Experience using Databricks for data engineering
  • Ability to analyze, troubleshoot, and optimize data pipelines for performance and reliability
  • Experience working on one or more cloud platforms (Azure, GCP, AWS), with a strong preference for Azure
  • Ability to design and develop robust, scalable, and efficient data pipelines
  • Experience with version control and tooling such as Azure DevOps, GitHub, or similar
  • Ability to write and maintain technical documentation for data pipelines and data processes
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with RESTful APIs and other data transfer protocols
  • Experience with data visualization tools like Power BI
  • Experience with Agile software development methodologies
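Working with RESTful APIs as a data source usually means paging through JSON responses. A minimal sketch of parsing one page, where the payload shape (`items`, `next_page`) is a hypothetical example rather than any specific API's contract:

```python
import json

def parse_page(payload: str):
    """Return (records, next_page_token) from one JSON response body."""
    body = json.loads(payload)
    return body.get("items", []), body.get("next_page")

# Simulated response body; in practice this would come from an HTTP client.
page = json.dumps({"items": [{"id": 1}, {"id": 2}], "next_page": None})
records, next_token = parse_page(page)
```

An ingestion loop would repeat this until `next_page` is empty, then hand the accumulated records to the transform step.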

benefits for locations

Poland
For you
  • Discounts on health insurance, sport clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-workings
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service