
senior data software engineer

Data Software Engineering, Databricks, Python, PySpark, Microsoft Azure, SQL Azure

Join our remote team as a Senior Data Software Engineer at a global leader in data-driven solutions. We are seeking a hands-on, deeply technical engineer to collaborate closely with development peers, product leadership, and other technical staff on innovative, impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment.

responsibilities
  • Developing a POC to replace the external vendor within the infrastructure
  • Setting up required Azure services
  • Extracting data from the data lake (EDL)
  • Processing data based on application requirements and architecture
  • Building the service layer to mimic the existing application
  • Prioritizing and ensuring high-quality standards at every stage of development
  • Guaranteeing reliability, availability, performance, and scalability of systems
  • Collaborating with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
  • Providing technical input for new feature requirements, partnering with business owners and architects
  • Ensuring continuous improvement by staying abreast of industry trends and emerging technologies
requirements
  • 3+ years of production experience in Data Software Engineering
  • Hands-on, deep expertise in PySpark and Python for data engineering
  • Experience building scalable data solutions on Microsoft Azure, including Azure SQL
  • Solid experience working with Databricks for building data pipelines
  • Knowledge of Python web frameworks such as Django and Flask
  • Advanced SQL experience designing and managing database schemas, including procedures, triggers, and views
  • Experience supporting applications and systems in production, ensuring timely resolution of issues
  • Ability to review requirements and translate them into a documented technical design for implementation
  • Expertise in creating data POCs
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
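For the Python web-services requirement, a thin read-only service layer in Flask might look like the sketch below; the route, in-memory data, and payload shape are hypothetical, standing in for the existing application's API.

```python
# Hedged sketch of a service layer mimicking an existing application's read
# endpoint; the route and data are illustrative assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# In the POC this would be backed by the processed data-lake output
ORDERS = {"A-1": {"region": "EU", "amount": 120.0}}

@app.route("/orders/<order_id>")
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(order)
```

A real implementation would swap the in-memory dictionary for queries against the processed data, keeping the response contract identical to the current application.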
nice to have
  • Experience with Big Data technologies like Hadoop and Spark Streaming is a plus
  • Familiarity with machine learning concepts and libraries such as Scikit-learn is an advantage

benefits for locations

For you
  • Insurance Coverage 
  • Paid Leaves – including maternity, bereavement, paternity, and special COVID-19 leaves. 
  • Financial assistance for medical crisis 
  • Retiral Benefits – VPF and NPS 
  • Customized Mindfulness and Wellness programs 
  • EPAM Hobby Clubs
For your comfortable work
  • Hybrid Work Model 
  • Soft loans to set up workspace at home 
  • Stable workload 
  • Relocation opportunities with ‘EPAM without Borders’ program

For your growth
  • Certification trainings for technical and soft skills 
  • Access to unlimited LinkedIn Learning platform 
  • Access to internal learning programs set up by world class trainers 
  • Community networking and idea creation platforms 
  • Mentorship programs 
  • Self-driven career progression tool
