
Senior Data Software Engineer

key skills
Data Software Engineering, Databricks, Python, PySpark, Microsoft Azure, SQL Azure

Join our remote team as a Senior Data Software Engineer within a global leader in providing cutting-edge cloud-based solutions. We are actively seeking a hands-on and deeply technical developer to collaborate closely with development peers, product leadership, and other technical staff to create innovative and impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment, using technologies such as Python, PySpark, Microsoft Azure, SQL Azure, and Databricks.

responsibilities
  • Setting up required Azure services
  • Building and deploying a POC to replace the external vendor within the existing infrastructure
  • Extracting data from the data lake (EDL)
  • Processing data according to application requirements and architecture
  • Replicating the behavior of the current application
  • Prioritizing and ensuring high-quality standards at every stage of development
  • Guaranteeing reliability, availability, performance, and scalability of systems
  • Collaborating with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
  • Providing technical input for new feature requirements, partnering with business owners and architects
  • Ensuring continuous improvement by staying abreast of industry trends and emerging technologies
  • Actively contributing to architectural and technical discussions
requirements
  • 3+ years of production experience in Data Software Engineering
  • Hands-on, with deep expertise in Data Engineering across both functional and non-functional areas
  • Deep expertise in PySpark for building scalable and high-performance data applications
  • Experience with Microsoft Azure for cloud-based infrastructure and application management
  • Familiarity with SQL Azure for designing and managing database schemas, including procedures, triggers, and views
  • Exposure to Databricks for creating unified data analytics platforms
  • Knowledge of Python web services such as Django and Flask for building efficient APIs and web solutions
  • Ability to support applications and systems in a production environment, ensuring timely resolution of issues
  • Expertise in build and test tooling for managing build and testing processes
  • Excellent communication skills in spoken and written English at an Upper-intermediate level or higher
nice to have
  • Experience with Big Data technologies such as Hadoop, Spark, Kafka, and Hive

benefits for locations

Poland
For you
  • Discounts on health insurance, sport clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-workings
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service