
Lead Big Data Engineer with Scala Expertise

40 hrs/week, 12+ months

We are looking for a Lead Big Data Engineer with 5+ years of experience as a Java developer, proficiency in data engineering, expertise in Big Data technologies (Hadoop, Spark, and Kafka), and strong knowledge of Scala to join our team remotely.


Our customer exposes an API that lets other businesses access the risks associated with a person, e.g. understand their credit score. Behind the scenes there is a sophisticated decisioning system and large data volumes. Currently, the back end of this API has a number of legacy versions serving hundreds of clients, with individual installations for most of them.

We need to create a data lake for one of the biggest data analytics companies working with personal information both domestically and internationally. In a nutshell, this involves re-platforming an on-premises Enterprise Data Hub from a Hadoop cluster to GCP. Day-to-day tasks include, but are not limited to, creating Spark applications that work with data from different sources, including Oracle, Google Cloud Storage, and BigQuery; creating pipelines with GCP Dataflow; and working with Jenkins and Airflow.
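As a rough illustration of the day-to-day work described above, here is a minimal Scala sketch of a Spark application that joins an Oracle table with files in Google Cloud Storage and writes the result to BigQuery. All hosts, table names, buckets, and the join key are hypothetical placeholders, and the job assumes the Oracle JDBC driver, the GCS connector, and the spark-bigquery connector are available on the classpath.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: all connection details, table names, and buckets are placeholders.
object RiskDataIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("risk-data-ingest")
      .getOrCreate()

    // Read a table from Oracle over JDBC (Oracle driver must be on the classpath).
    val oracleDf = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB") // placeholder
      .option("dbtable", "RISK.SCORES")                          // placeholder
      .option("user", sys.env("ORACLE_USER"))
      .option("password", sys.env("ORACLE_PASSWORD"))
      .load()

    // Read Parquet files from Google Cloud Storage (requires the GCS connector).
    val gcsDf = spark.read.parquet("gs://example-bucket/events/") // placeholder

    // Join the two sources on a hypothetical key and write the result to
    // BigQuery via the spark-bigquery connector.
    oracleDf.join(gcsDf, Seq("customer_id"))
      .write
      .format("bigquery")
      .option("table", "analytics.risk_scores")    // placeholder dataset.table
      .option("temporaryGcsBucket", "example-tmp") // staging bucket for the connector
      .mode("overwrite")
      .save()

    spark.stop()
  }
}
```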

Please note that even though you are applying for this position, you may be offered other projects to join within EPAM Anywhere. 

Join EPAM Anywhere to quickly and easily find projects that match your knowledge and experience, while working with Forbes Global 2000 clients, building a successful IT career, and earning competitive rewards. The platform provides additional perks, including a flexible schedule, professional development opportunities and access to a community of experts.

Requirements

• 5+ years of experience as a Java developer
• Proficiency in data engineering
• Expertise in Big Data: Hadoop, Spark, Kafka
• Strong knowledge of Scala
• Expertise in microservices architecture
• Ability to work with high volumes of data
• Experience in working with AWS
• Experience in working with GCP: Dataproc (Apache Spark), Dataflow (Apache Beam), BigQuery, Cloud Storage
• Good understanding of design patterns, clean code, and unit testing
• Experience working in an Agile environment
• Data modelling skills would be a plus
• Experience in Jenkins and Airflow with Groovy and Python
• Excellent written and verbal communication skills
• Intermediate or higher English level, both verbal and written (B1+)

We offer

• Competitive compensation depending on experience and skills
• Work on enterprise-level projects on a long-term basis
• A 100% remote full-time job
• Unlimited access to learning resources (EPAM training courses, English classes, internal library)
• A community of 38,000+ of the industry's top professionals

Skills: Big Data, Scala, Java, Jenkins, Python

Hours per week: 40

Project length: 12+ months

Locations eligible for the position: Belarus, Brazil, Chile, Colombia, India, Russia, Ukraine