Lead Big Data Software Engineer with Scala Expertise

40 hrs/week, 12+ months

We are currently looking for a remote Lead Big Data Software Engineer.
Our customer exposes an API that lets other businesses assess the risks associated with a person, e.g. understand their credit score. Behind the scenes there is a sophisticated decisioning system and large data volumes. The back end of this API currently has a number of legacy versions serving hundreds of clients, with individual installations for most of them.

We need to create a data lake for one of the biggest data analytics companies working with personal information both domestically and internationally. In a nutshell, this means replatforming an on-premises Enterprise Data Hub from a Hadoop cluster to GCP. Day-to-day tasks include, but are not limited to, creating Spark applications that manipulate data from different sources, including Oracle, Google Cloud Storage, and BigQuery; building pipelines with GCP Dataflow; and working with Jenkins and Airflow.
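For illustration only, here is a minimal sketch of the kind of Spark application this work involves: a Scala job that reads a table from BigQuery via the spark-bigquery connector, filters it, and writes Parquet to Cloud Storage. The project, table, bucket, and column names below are placeholders, not details of the actual engagement.

    import org.apache.spark.sql.SparkSession

    // A minimal, hypothetical sketch of the kind of Spark job described above:
    // read a BigQuery table, filter it, and write Parquet to Cloud Storage.
    // Project, dataset, table, bucket and column names are placeholders.
    object RiskScoreExport {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("risk-score-export")
          .getOrCreate()

        // Assumes the spark-bigquery connector is on the classpath
        // (e.g. com.google.cloud.spark:spark-bigquery-with-dependencies).
        val scores = spark.read
          .format("bigquery")
          .option("table", "my-project.risk.credit_scores") // placeholder table
          .load()

        // Keep only recent, non-null scores before exporting.
        val recent = scores
          .filter("score IS NOT NULL AND updated_at >= date_sub(current_date(), 30)")

        // Write partitioned Parquet to a GCS bucket for downstream pipelines
        // (on Dataproc the GCS connector is available out of the box).
        recent.write
          .mode("overwrite")
          .partitionBy("updated_at")
          .parquet("gs://my-bucket/exports/credit_scores") // placeholder bucket

        spark.stop()
      }
    }

On Dataproc, a job like this is typically packaged as an assembly JAR and submitted with the connector dependency included.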

Please note that even though you are applying for this position, you may still be offered other projects to join within EPAM Anywhere.

Requirements

• 5+ years of experience as a Java Developer
• Proficiency in data engineering
• Expertise in Big Data: Hadoop, Spark, Kafka
• Strong knowledge of Scala
• Expertise in Microservices Architecture
• Ability to work with high volumes of data
• Experience working with AWS
• Experience working with GCP: Dataproc (Apache Spark), Dataflow (Apache Beam), BigQuery, Cloud Storage
• Good understanding of Design Patterns, Clean Code, Unit testing
• Experience working in an Agile environment
• Data modelling skills would be a plus
• Experience in Jenkins and Airflow with Groovy and Python
• Excellent written and verbal communication skills
• Intermediate or higher English level, both verbal and written (B1+)

We offer

• Competitive compensation depending on experience and skills
• Long-term work on enterprise-level projects
• Full-time remote work (you can work from anywhere)
• Unlimited access to learning courses (LinkedIn Learning, EPAM training courses, regular English classes, internal library)
• A community of 36,700+ of the industry’s top professionals

Big Data, Scala

Hours per week: 40
Project length: 12+ months
Locations eligible for the position: Belarus, Brazil, Chile, Colombia, India, Russia, Ukraine