
senior data software engineer

Data Software Engineering, Databricks, Python, PySpark, Azure Data Factory
Poland

Join our remote team as a Senior Data Software Engineer within a global leader in providing innovative solutions for data management and analytics. We are looking for a highly skilled and experienced engineer who will be responsible for developing and implementing data solutions, as well as ensuring the quality, performance, and scalability of data systems. This role offers an opportunity to work with cutting-edge technologies and to contribute to the continued growth of our company and its clients.

responsibilities
  • Designing, developing, and implementing data solutions in collaboration with stakeholders
  • Ensuring the quality, performance, and scalability of data systems through regular maintenance and optimization
  • Identifying and resolving issues related to data quality, data processing, and data integration
  • Providing support for data-related issues and incidents
  • Collaborating with cross-functional teams to deliver customer-centric solutions
  • Defining and implementing data management best practices and standards
  • Creating and maintaining technical documentation for data systems and solutions
  • Ensuring compliance with data security and privacy regulations
  • Leading and mentoring less experienced team members to enhance their skills and grow their careers
  • Participating in code reviews and ensuring adherence to coding standards
  • Staying up-to-date with emerging data technologies and trends
requirements
  • 3+ years of hands-on experience in Data Software Engineering
  • Expertise in PySpark for data processing and analysis (see the sketch below the requirements)
  • Experience working with Azure Data Factory for building and managing data pipelines
  • Advanced SQL knowledge for designing and managing database schema, including procedures, triggers, and views
  • Proficiency in Python for data manipulation and scripting
  • Experience with cloud-based data management tools such as HDInsight, Azure Data Lake, and Data API
  • Knowledge of Spark, Scala, and Kafka for distributed computing and real-time data processing
  • Expertise in Databricks for data engineering and analytics
  • Experience with EDL changes in DB views/stored procedures
  • Experience in data analysis and troubleshooting
  • Ability to provide integration testing support
  • Ability to plan and implement new requirements/data entities on EDL
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
  • Experience with Hadoop, Hive, and MapReduce is a plus
  • Familiarity with AWS, GCP, or other cloud platforms is a plus
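
The PySpark, Databricks, and data-lake items above describe day-to-day pipeline work in this role. As a rough, illustrative sketch only (the dataset, paths, and column names such as /mnt/raw/orders and order_amount are hypothetical, not taken from this posting), a minimal PySpark batch transformation of that kind might look like this:

```python
from pyspark.sql import SparkSession, functions as F

# Local session for illustration; on Databricks a `spark` session already exists.
spark = SparkSession.builder.appName("daily-revenue-example").getOrCreate()

# Hypothetical raw order events landed in the data lake as Parquet.
orders = spark.read.parquet("/mnt/raw/orders")

# Keep completed orders and aggregate revenue per day and country.
daily_revenue = (
    orders
    .filter(F.col("order_status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "country")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_revenue"),
    )
)

# Write the curated table back to the lake, partitioned for efficient reads.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "/mnt/curated/daily_revenue"
)
```

On Databricks the same logic would typically write to a Delta table rather than plain Parquet, and the job would usually be scheduled as an activity in an Azure Data Factory pipeline or a Databricks workflow.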


benefits by location

For you

Discounts on health insurance, sports clubs, shopping malls, cinema tickets, etc.

Stable income

Flexible roles

For your comfortable work

100% remote work, forever

EPAM hardware

EPAM software licenses

Access to offices and coworking spaces

Stable workload

Relocation opportunities

Flexible engagement models

For your growth

Free training in technical and soft skills

Free access to the LinkedIn Learning platform

Language courses

Free access to internal and external libraries

Certification opportunities

Skills advisory service

looking for something else?

Send us your CV to receive a personalized offer