
Senior Data Software Engineer

Data Software Engineering, Databricks, Python, PySpark, Microsoft Azure, SQL Azure
Sorry, this job is no longer available.

Join our remote team as a Senior Data Software Engineer at a global leader in data-driven solutions. We are seeking a highly skilled and experienced individual to take ownership of building and implementing reusable Databricks components for data ingestion and analytics. The successful candidate will collaborate closely with architects, technical leads, and other functional groups to deliver effective and efficient solutions. This role is an opportunity to drive innovation and help optimize the company's data solutions.

Responsibilities
  • Design and implement reusable Databricks components for data ingestion and analytics
  • Ingest data via batch, streaming, and replication into the data lake, making it available for reporting and predictive modeling
  • Establish security controls, integration with data governance, and clear, auditable data lineage
  • Build collaborative partnerships with architects, technical leads, and key individuals within other functional groups
  • Participate in code review and test solutions to ensure they meet best practice specifications
  • Write project documentation
  • Provide technical input for new feature requirements, partnering with business owners and architects
  • Ensure continuous improvement by staying abreast of industry trends and emerging technologies
  • Drive the implementation of solutions aligned with business objectives
  • Collaborate with cross-functional teams to achieve project goals
Requirements
  • 3+ years of experience in Data Software Engineering
  • Expertise in Databricks and PySpark for building and managing Big Data analytics applications
  • Hands-on experience with Microsoft Azure, including Azure Synapse Analytics, SQL Azure, and ADLS, for designing and deploying scalable, available, and fault-tolerant systems
  • Experience in building data ingestion pipelines, data warehousing, or database architecture
  • Deep understanding of data modeling concepts and experience with modern Big Data components
  • Strong coding experience with Python for building data solutions
  • Awareness of compliance requirements, such as PI, GDPR, and HIPAA, for data security and privacy
  • Experience in actively participating in code review and testing solutions to ensure they meet best practice specifications
  • Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
Nice to have
  • Experience with Power BI for data visualization and reporting


Benefits for Poland
For you
  • Discounts on health insurance, sports clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-workings
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service