
Senior Data Software Engineer

Data Software Engineering, Databricks, Python, PySpark, Azure Data Factory
Sorry, the job is no longer available.

Join our remote team as a Senior Data Software Engineer within a global leader in providing innovative solutions for data management and analytics. We are looking for a highly skilled and experienced engineer who will be responsible for developing and implementing data solutions, as well as ensuring the quality, performance, and scalability of data systems. This role offers an opportunity to work with cutting-edge technologies and to contribute to the continued growth of our company and its clients.

responsibilities
  • Designing, developing, and implementing data solutions in collaboration with stakeholders
  • Ensuring the quality, performance, and scalability of data systems through regular maintenance and optimization
  • Identifying and resolving issues related to data quality, data processing, and data integration
  • Providing support for data-related issues and incidents
  • Collaborating with cross-functional teams to deliver customer-centric solutions
  • Defining and implementing data management best practices and standards
  • Creating and maintaining technical documentation for data systems and solutions
  • Ensuring compliance with data security and privacy regulations
  • Leading and mentoring less experienced team members to enhance their skills and grow their careers
  • Participating in code reviews and ensuring adherence to coding standards
  • Staying up-to-date with emerging data technologies and trends
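
The data-quality work described above can be sketched in a minimal, library-free form (in practice this logic would typically live in a PySpark job; the record fields and validation rules here are hypothetical):

```python
def validate_records(records):
    """Split records into valid and rejected, attaching rejection reasons.

    Each record is a dict; these example rules require a non-empty 'id'
    and a non-negative numeric 'amount' (both hypothetical).
    """
    valid, rejected = [], []
    for rec in records:
        errors = []
        if not rec.get("id"):
            errors.append("missing id")
        amount = rec.get("amount")
        if not isinstance(amount, (int, float)) or amount < 0:
            errors.append("bad amount")
        if errors:
            # Keep the bad record together with why it failed,
            # so it can be routed to a quarantine table for triage.
            rejected.append({**rec, "_errors": errors})
        else:
            valid.append(rec)
    return valid, rejected

rows = [
    {"id": "a1", "amount": 10.5},
    {"id": "", "amount": 3},
    {"id": "b2", "amount": -1},
]
good, bad = validate_records(rows)
```

Separating accepted rows from rejected ones with explicit reasons is a common pattern for the "identify and resolve data-quality issues" responsibility, since it makes failures auditable rather than silently dropped.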
requirements
  • 3+ years of hands-on experience in Data Software Engineering
  • Expertise in PySpark for data processing and analysis
  • Experience working with Azure Data Factory for building and managing data pipelines
  • Advanced SQL knowledge for designing and managing database schemas, including procedures, triggers, and views
  • Proficiency in Python for data manipulation and scripting
  • Experience with cloud-based data management tools such as HDInsight, Azure Data Lake, and Data API
  • Knowledge of Spark, Scala, and Kafka for distributed computing and real-time data processing
  • Expertise in Databricks for data engineering and analytics
  • Experience making EDL changes in DB views/stored procedures
  • Experience in data analysis and troubleshooting
  • Ability to provide integration testing support
  • Ability to plan and implement new requirements/data entities on EDL
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with Hadoop, Hive, and MapReduce
  • Familiarity with AWS, GCP, or other cloud platforms is a plus
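
As a rough illustration of the Azure Data Factory pipeline work the requirements mention, a minimal Copy-activity pipeline definition might look like the fragment below. The pipeline and dataset names are hypothetical, and the exact source/sink types depend on the connected data stores:

```json
{
  "name": "CopySalesDaily",
  "properties": {
    "activities": [
      {
        "name": "CopyFromLakeToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SalesLakeDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SalesSqlDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "ParquetSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```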

benefits for locations

Poland
For you
  • Discounts on health insurance, sports clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-workings
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service