
senior data software engineer

Data Software Engineering, Databricks, Python, PySpark, Microsoft Azure, SQL Azure

Join our remote team as a Senior Data Software Engineer within a global leader in digital transformation and technology services. We are looking for an experienced software developer with a strong background in data engineering. The ideal candidate will implement reusable Databricks components for data ingestion and data analytics, ingest data into a data lake via batch, streaming, and replication, and make that data available for reporting and predictive modeling, while establishing security controls, integration with data governance, and clear, auditable data lineage. The Senior Data Software Engineer will work collaboratively with architects, technical leads, and key individuals within other functional groups to develop and test solutions that meet best-practice specifications.

responsibilities
  • Implementing reusable Databricks components for data ingestion and data analytics
  • Ingesting data into a data lake via batch, streaming, and replication and making it available for reporting and predictive modeling (see the PySpark sketch after this list)
  • Establishing security controls, integration with data governance, and clear auditable data lineage
  • Collaborating with architects, technical leads, and key individuals within other functional groups to develop and test solutions that meet best-practice specifications
  • Participating in code reviews and testing to ensure solutions meet those specifications
  • Writing project documentation to ensure that other developers can easily understand and use the code
  • Building collaborative partnerships with architects, technical leads, and key individuals within other functional groups
  • Continuously updating skills and knowledge to keep up with industry trends and best practices
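
For context, here is a minimal sketch of what a reusable ingestion component of this kind might look like in PySpark on Databricks. The storage paths, table names, and Auto Loader options are hypothetical assumptions for illustration, not part of the role description.

# Minimal sketch of a reusable ingestion helper, assuming a Databricks
# workspace with Delta Lake and Auto Loader ("cloudFiles") available.
# All paths and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, DataFrame

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

def ingest_batch(source_path: str, target_table: str, fmt: str = "parquet") -> None:
    """Batch-load raw files into a Delta table in the data lake."""
    df: DataFrame = spark.read.format(fmt).load(source_path)
    df.write.format("delta").mode("append").saveAsTable(target_table)

def ingest_stream(source_path: str, target_table: str, checkpoint: str) -> None:
    """Incrementally ingest newly arrived files with Auto Loader (Databricks-specific)."""
    stream = (spark.readStream
              .format("cloudFiles")
              .option("cloudFiles.format", "json")
              .option("cloudFiles.schemaLocation", checkpoint)
              .load(source_path))
    (stream.writeStream
           .option("checkpointLocation", checkpoint)
           .trigger(availableNow=True)  # process available files, then stop
           .toTable(target_table))

# Hypothetical usage:
# ingest_batch("abfss://raw@examplelake.dfs.core.windows.net/orders/", "bronze.orders")
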
requirements
  • 3+ years of experience in software engineering with a focus on data engineering
  • Proficient in Python coding and PySpark development
  • Experience building data ingestion pipelines and designing data warehouse or database architectures
  • Hands-on experience with modern Big Data components like Databricks, SQL Azure, and Microsoft Azure
  • Experience in designing, deploying, and administering scalable, available, and fault-tolerant systems in a cloud environment
  • Strong ability to write clean, maintainable, and well-documented code
  • Experience in data modeling and working with sensitive data, with awareness of compliance requirements such as PII, GDPR, and HIPAA
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with Power BI, Azure Synapse Analytics, and ADLS

benefits for locations

Poland
For you
  • Discounts on health insurance, sports clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-working spaces
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service