Senior Data Software Engineer

Data Software Engineering, Databricks, Python, PySpark, Microsoft Azure, SQL Azure

Join our remote team as a Senior Data Software Engineer within a global leader in data-driven solutions. We are actively seeking a hands-on and deeply technical engineer to collaborate closely with development peers, product leadership, and other technical staff to create innovative and impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment.

responsibilities
  • Developing a POC to replace the external vendor solution within the infrastructure
  • Setting up the required Azure services
  • Extracting data from the data lake (EDL)
  • Processing data based on application requirements and architecture (a minimal PySpark sketch of the extract-and-process steps follows this list)
  • Building the service layer to mimic the existing application
  • Prioritizing quality and maintaining high standards at every stage of development
  • Guaranteeing reliability, availability, performance, and scalability of systems
  • Collaborating with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
  • Providing technical input for new feature requirements, partnering with business owners and architects
  • Ensuring continuous improvement by staying abreast of industry trends and emerging technologies
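The extract-and-process responsibilities above can be illustrated with a short PySpark sketch. This is a hypothetical, minimal example: the storage path, column names, and table name are placeholders rather than details of the actual project or its data lake.

    # Minimal PySpark sketch of the extract-and-process flow; all names are illustrative.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("edl-replacement-poc").getOrCreate()

    # Extract: read raw records from the data lake (the ADLS Gen2 path is a placeholder)
    raw = spark.read.parquet("abfss://edl@examplestorage.dfs.core.windows.net/raw/orders/")

    # Process: apply application-level rules (filter, enrich, aggregate; illustrative only)
    processed = (
        raw.filter(F.col("status") == "ACTIVE")
           .withColumn("ingested_at", F.current_timestamp())
           .groupBy("customer_id")
           .agg(F.sum("amount").alias("total_amount"))
    )

    # Persist a curated table that a thin service layer could later expose to the application
    processed.write.mode("overwrite").saveAsTable("curated.customer_totals")

On Databricks, a job along these lines would run on a cluster with access to the Azure storage account, and the service layer would query the curated table rather than the raw files.
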
requirements
  • 3+ years of production experience in Data Software Engineering
  • Hands-on experience and deep expertise in PySpark and Python for Data Engineering
  • Experience with Azure technologies, including SQL Azure and Microsoft Azure for building scalable data solutions
  • Solid experience working with Databricks for building data pipelines
  • Knowledge of Python web frameworks such as Django and Flask
  • Advanced SQL data processing experience for designing and managing database schemas, including procedures, triggers, and views
  • Ability to support applications and systems in a production environment, ensuring timely resolution of issues
  • Ability to review requirements and translate them into a documented technical design for implementation
  • Expertise in creating data POCs
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with Big Data technologies like Hadoop and Spark Streaming is a plus
  • Familiarity with machine learning concepts and libraries such as Scikit-learn is an advantage

benefits for locations

Poland
For you
  • Discounts on health insurance, sports clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-working spaces
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service