We are currently looking for a remote Middle Big Data Engineer with experience in running, using, and troubleshooting industry-standard data technologies such as Spark, HDFS, Cassandra, and Kafka to join our team.
The customer is an American multinational retail corporation that operates a chain of hypermarkets, discount department stores, and grocery stores.
This role will be responsible for utilizing industry-leading technologies and best practices to enable analysis, intelligence, and processing of some of the largest datasets. As a Middle Big Data Engineer, you will be counted on to have a deep understanding of technology internals in order to tune and troubleshoot individual jobs, as well as a high-level understanding of the landscape to drive value-adding features on the platform.
responsibilities
- Collaborate with Product Owners and Team Leads to identify, design, and implement new features to support the customer's growing real-time data needs
- Identify cases where we diverge from industry best practices and suggest or implement remediation
- Evangelize and practice an extremely high standard of code quality, system reliability, and performance to ensure SLAs are met for uptime, data freshness, data correctness, and quality
- Display a sense of ownership over assigned work, requiring minimal direction and driving tasks to completion in a sometimes fuzzy and uncharted environment
- Focus on enabling developers and analysts through self-service and automated tooling, rather than handling manual requests and acting as a gatekeeper
- Participate in on-call rotation, including continuously seeking to reduce noise, improve monitoring coverage, and improve quality-of-life for on-call engineers
requirements
- Experience in running, using, and troubleshooting industry-standard data technologies such as Spark, HDFS, Cassandra, and Kafka
- Development experience, ideally in Scala, though we are open to other backgrounds if you’re willing to learn the languages we use
- Scripting skills, e.g. Bash, Python, or Ruby
- Experience processing large amounts of structured and unstructured data in both streaming and batch modes
- Experience with cloud infrastructure; we use Azure specifically, but experience with any provider will do
- A focus on automation and high-leverage solutions that enable sustainable and scalable growth in an ever-changing ecosystem
- Experience building and maintaining a centralized platform or services consumed by other teams is ideal but not necessary
- A passion for Operational Excellence and an SRE/DevOps mindset, including an eye for monitoring, alerting, self-healing, and automation
- Experience in an Agile environment, with the ability to manage scope and iterate quickly to consistently deliver value to the customer
- English proficiency at B1 level
benefits
- Insurance Coverage
- Paid Leaves – including maternity, bereavement, paternity, and special COVID-19 leaves
- Financial assistance for medical crises
- Retiral Benefits – VPF (Voluntary Provident Fund) and NPS (National Pension System)
- Customized Mindfulness and Wellness programs
- EPAM Hobby Clubs
- Hybrid Work Model
- Soft loans to set up workspace at home
- Stable workload
- Relocation opportunities with the ‘EPAM without Borders’ program
- Certification trainings for technical and soft skills
- Unlimited access to the LinkedIn Learning platform
- Access to internal learning programs set up by world-class trainers
- Community networking and idea creation platforms
- Mentorship programs
- Self-driven career progression tool
Find a vacancy that works for you. Send us your CV to receive a personalized offer.