Remote Middle Big Data Software Engineer job | EPAM Anywhere

Middle Big Data Software Engineer

Middle Big Data Software Engineer, 40 hrs/week, 12+ months

We are currently looking for a remote Middle Big Data Software Engineer with 4+ years of experience with Java, Scala or Python to join our team.

The customer provides Direct-to-Consumer services encompassing global entertainment and news TV properties, a TV station group, and radio businesses.

Please note that even though you are applying for this position, you may be offered other projects within EPAM Anywhere.

We accept CVs only in English.

Responsibilities

• Ensure high-quality performance by implementing and refining robust data processing in Java, Scala or Python
• Build scalable analytics solutions, including processing, storing and serving large-scale data in batch and streaming modes
• Contribute to making our data platform more scalable, resilient and reliable
• Participate in code review sessions
• Help operationalize machine learning models and build apps

Requirements

• 4+ years of experience with Java, Scala or Python
• Data engineering skills (data ingestion, storage and processing) in batch and streaming solutions using Kafka and Spark
• Understanding of and practical experience with AWS
• Knowledge of big data frameworks such as Hadoop and Apache Spark, NoSQL systems such as Cassandra or DynamoDB, and streaming technologies such as Apache Kafka
• Understanding of reactive programming and dependency injection frameworks such as Spring for developing REST services
• Experience working with data scientists to operationalize machine learning models and build applications that make use of machine learning
• Experience with newer technologies relevant to the data space, such as Spark, Kafka and Apache Druid (or other OLAP databases)
• Good problem-solving skills
• Spoken and written English at B1 level or higher

We offer

• Competitive compensation depending on experience and skills
• Work on enterprise-level projects on a long-term basis
• Full-time remote work
• Unlimited access to learning resources (EPAM training courses, English classes, Internal Library)
• Community of 38,000+ of the industry's top professionals
Big Data

Hours per week: 40 hrs/week

Project length: 12+ months

Locations eligible for the position: Belarus, Brazil, Chile, Colombia, Russia, Ukraine