Big Data Development Team Lead remote jobs | EPAM Anywhere



Big Data Development Team Lead for a Delivery Services Company

40 hrs/week, 12+ months

We are looking for a remote Big Data Development Team Lead with 5+ years of experience in Data Intelligence and Big Data to join our team.

The customer is an American multinational delivery services company. It provides money-back-guaranteed ground package delivery services to businesses and residences.

In this position, you will act as the lead engineer for the team building the data integration solution.

Please note that even though you are applying for this position, you may be offered other projects to join within EPAM Anywhere.

We accept CVs only in English.

Responsibilities

• Lead the team and provide daily direction
• Work with onsite and EPAM Project Managers, the Solution Architect, and Data Engineers
• Provide updates to key stakeholders
• Help track project work and maintain project scope
• Provide technical suggestions and recommendations
• Help establish best practices for the engineering team
• Perform code reviews for both the EPAM and customer teams

Requirements

• 5+ years of Data Intelligence and Big Data experience
• 1+ years of relevant leadership experience
• Knowledge of Java
• Experience with Maven, JUnit, Spring, and Spring Boot
• Experience with Git
• Experience with Google Cloud Pub/Sub and BigQuery as the source and destination for data to be ingested and processed (a minimal pipeline sketch follows this list)
• Knowledge of Google Cloud Dataflow as the computational engine
• Experience with Google Cloud Storage (GCS) as a data store for application artifacts
• Experience with Google Cloud Firestore as an application configuration store
• Knowledge of Google Cloud Functions as an execution mechanism for Dataflow jobs
• Experience with Google Cloud Scheduler as a scheduling tool for batch Dataflow jobs
• Familiarity with Google Stackdriver as a monitoring and alerting tool
• Experience with Apache Beam as a data transformation engine
• Knowledge of Jenkins as the build and deploy automation tool
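
To illustrate the stack described above, here is a minimal, hypothetical sketch of an Apache Beam pipeline in Java that reads messages from Pub/Sub and appends them to a BigQuery table; submitted with --runner=DataflowRunner, it would run on Dataflow. It is not project code: the project ID, topic, table, and field names are placeholders, and the table is assumed to already exist.

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubSubToBigQuery {
  public static void main(String[] args) {
    // Pipeline options come from the command line, e.g. --runner=DataflowRunner --project=...
    Pipeline pipeline = Pipeline.create(
        PipelineOptionsFactory.fromArgs(args).withValidation().create());

    pipeline
        // Read raw messages from a placeholder Pub/Sub topic.
        .apply("ReadFromPubSub",
            PubsubIO.readStrings().fromTopic("projects/my-project/topics/deliveries"))
        // Wrap each message payload into a BigQuery TableRow (placeholder field name).
        .apply("ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via((String message) -> new TableRow().set("payload", message)))
        // Use the JSON coder for TableRow elements.
        .setCoder(TableRowJsonCoder.of())
        // Append rows to a placeholder BigQuery table that is assumed to exist.
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-project:delivery_dataset.events")
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));

    pipeline.run();
  }
}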

We offer

• Competitive compensation depending on experience and skills
• Work on enterprise-level projects on a long-term basis
• Full-time remote work
• Unlimited access to learning resources (EPAM training courses, English classes, Internal Library)
• A community of 38,000+ of the industry's top professionals
Big Data · Java · Cloud.Google

Hours per week: 40

Project length: 12+ months

Locations eligible for the position: Belarus, Brazil, Chile, Colombia, India, Russia, Ukraine