


Middle Big Data Engineer for a Multinational Retail Company

40 hrs/week, 12+ months

We are currently looking for a remote Middle Big Data Engineer with experience in running, using, and troubleshooting industry-standard data technologies such as Spark, HDFS, Cassandra, and Kafka to join our team.

The customer is an American multinational retail corporation that operates a chain of hypermarkets, discount department stores, and grocery stores.

In this role, you will apply industry-best technologies and practices to enable analysis, intelligence, and processing of some of the largest datasets. As a Middle Big Data Engineer, you will be counted on for a deep understanding of technology internals to tune and troubleshoot individual jobs, as well as a high-level understanding of the landscape to drive value-adding features to the platform.

Please note that even though you are applying for this position, you may be offered other projects to join within EPAM Anywhere.

We accept CVs only in English.

Responsibilities

• Collaborate with Product Owners and Team Leads to identify, design, and implement new features that support the customer's growing real-time data needs
• Identify cases where we diverge from industry best practices and suggest or implement remediation
• Evangelize and practice an extremely high standard of code quality, system reliability, and performance to ensure SLAs are met for uptime, data freshness, data correctness, and quality
• Display a sense of ownership over assigned work, requiring minimal direction and driving it to completion in a sometimes fuzzy and uncharted environment
• Focus on enabling developers and analysts through self-service and automated tooling, rather than handling manual requests and acting as a gatekeeper
• Participate in the on-call rotation, continuously seeking to reduce noise, improve monitoring coverage, and improve quality of life for on-call engineers

Requirements

• Experience in running, using, and troubleshooting industry-standard data technologies such as Spark, HDFS, Cassandra, and Kafka
• Development experience, ideally in Scala, though we are open to other backgrounds if you are willing to learn the languages we use
• Scripting skills, e.g. Bash, Python, or Ruby
• Experience processing large amounts of structured and unstructured data in both streaming and batch modes
• Experience with cloud infrastructure (we use Azure specifically, but experience with any provider will do)
• A focus on automation and providing leverage-based solutions that enable sustainable and scalable growth in an ever-changing ecosystem
• Experience building and maintaining a centralized platform or services consumed by other teams is ideal but not necessary
• A passion for Operational Excellence and an SRE/DevOps mindset, including an eye for monitoring, alerting, self-healing, and automation
• Experience in an Agile environment, with the ability to manage scope and iterate quickly to consistently deliver value to the customer

We offer

• Competitive compensation depending on experience and skills
• Work on enterprise-level projects on a long-term basis
• Full-time remote work
• Unlimited access to learning resources (EPAM training courses, English classes, Internal Library)
• A community of 38,000+ of the industry's top professionals
Big Data

Hours per week: 40
Project length: 12+ months
Locations eligible for the position: Belarus, India, Russia, Ukraine