
senior data software engineer

Data Software Engineering, Databricks, Python, PySpark, Microsoft Azure, SQL Azure
Sorry, the job is no longer available.

Join our remote team as a Senior Data Software Engineer at a global leader in solutions for complex data needs. We are looking for an individual who can work closely with architects, technical leads, and other key individuals within our functional groups to design, develop, and implement reusable Databricks components for data ingestion and analytics. As a Senior Data Software Engineer, you will be responsible for ensuring that data is ingested via batch, streaming, or replication into a data lake, establishing security controls, and ensuring integration with data governance. You will also build collaborative partnerships with stakeholders to ensure that data is available for reporting and predictive modeling.

responsibilities
  • Design and develop reusable Databricks components for data ingestion and analytics
  • Collaborate with architects, technical leads, and other key individuals within our functional groups to deliver customer-centric solutions
  • Establish security controls and ensure integration with data governance to achieve clear, auditable data lineage
  • Participate in code reviews and test solutions to ensure they meet best-practice specifications
  • Write project documentation for all phases of the software development lifecycle
  • Create and maintain technical documentation for data ingestion pipelines and Data Warehouse or database architecture
  • Work with stakeholders to ensure data availability for reporting and predictive modeling
  • Ensure continuous improvement by staying abreast of industry trends and emerging technologies
  • Drive the implementation of solutions aligned with business objectives
  • Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
  • Collaborate with cross-functional teams to achieve project goals
requirements
  • 3+ years of experience building data ingestion pipelines and designing Data Warehouse or database architecture
  • Expertise in Python and PySpark for data processing and analysis
  • Experience with Databricks for building scalable, high-performance applications
  • Hands-on experience with SQL Azure for designing and managing database schemas, including procedures, triggers, and views
  • Experience with Microsoft Azure for designing, deploying, and administering scalable, available, and fault-tolerant systems
  • Familiarity with data modeling and modern Big Data components
  • Experience with ADLS, Power BI, and Azure Synapse Analytics for cloud-based infrastructure and application management
  • Awareness of compliance requirements such as PII, GDPR, and HIPAA
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with other cloud platforms such as AWS and GCP


benefits for locations

India
For you
  • Insurance Coverage 
  • Paid Leaves – including maternity, bereavement, paternity, and special COVID-19 leaves
  • Financial assistance for medical crisis 
  • Retiral Benefits – VPF and NPS 
  • Customized Mindfulness and Wellness programs 
  • EPAM Hobby Clubs
For your comfortable work
  • Hybrid Work Model 
  • Soft loans to set up workspace at home 
  • Stable workload 
  • Relocation opportunities with ‘EPAM without Borders’ program

For your growth
  • Certification trainings for technical and soft skills 
  • Access to unlimited LinkedIn Learning platform 
  • Access to internal learning programs set up by world class trainers 
  • Community networking and idea creation platforms 
  • Mentorship programs 
  • Self-driven career progression tool
