Remote Solution Architect work | EPAM Anywhere



Senior Big Data Architect

40 hrs/week, 12+ months

Striving for excellence is in our DNA. Since 1993, we have been helping the world’s leading companies imagine, design, engineer, and deliver software and digital experiences that change the world. We are more than just specialists; we are experts.


We are looking for a Solution Architect with more than 10 years of IT experience, a solid background across different platforms, and a strong focus on back-end / high-load / real-time / Big Data / IoT / Cloud / Analytics solutions.

This position is part of EPAM Anywhere, our new program offering a variety of remote IT jobs. Join us to work on ambitious, long-term projects, get a stable workload, and enjoy work-life balance!


Responsibilities

• Work closely with the business to identify solution requirements and key case studies/scenarios when architecting data solutions for business transformation
• Lead enterprise-level Data and Big Data initiatives as an expert in data technologies
• Design data analytics solutions using the data technology stack and Big Data techniques
• Conduct solution architecture reviews/audits, calculate and present ROI
• Create and present solution architecture documents with deep technical detail to customers and implementation teams
• Participate in the full cycle of pre-sale activities in response to customer requests: direct communication with customers, RFP processing, development of implementation proposals and solution designs, presentation of the proposed solution architecture to the customer, and participation in technical meetings with customer representatives
• Lead implementation of solutions, from establishing project requirements and goals through solution go-live
• Maintain a strong understanding of industry trends and best practices
• Stay on the cutting edge of data technologies and be the first to propose and implement new technologies for the best business value of EPAM and its customers
• Continuously grow expertise by gathering and monitoring EPAM project experience and ongoing client projects across business domains to drive EPAM’s business in the data field
• Create and follow a personal education plan for the technology stack and solution architecture
• Share experience, knowledge, and vision with EPAM colleagues and customer teams; participate in seminars, meetups, mentoring, and training programs; prepare white papers and presentations


Requirements

• Strong hands-on experience as a Big Data Architect, with a solid design/development background in Java, Scala, or Python
• Architecture experience and practice in Data Management, Data Storage, Data Visualization, Disaster Recovery, Integration, Operation, Presale Support, and Security
• Vast experience with high-load and IoT Data Platform architectures and infrastructures
• Broad experience in the analysis, design, implementation, and deployment, as well as troubleshooting and rebuilding, of distributed Linux-based platforms and Big Data solutions on premises and in the cloud
• Strong cloud experience (AWS, Azure, GCP) architecting and leading the creation of automation frameworks to deploy Big Data and associated components as a service
• Solid experience with continuous delivery tools and technologies
• Extensive experience with containers and resource management systems: Docker, Mesos, Kubernetes/OpenShift, YARN
• Hands-on experience with TB- and PB-scale analytics platforms, plus the theoretical knowledge needed to develop and build such platforms
• Experience in direct customer communication and pre-selling business-consulting engagements to clients in large enterprise environments
• Experience delivering data analytics projects and architecture guidelines
• Strong skills in researching, comparing, and selecting tools, technologies, and approaches
• Practical experience in performance tuning and optimization, and in bottleneck analysis
• Strong communication skills, with experience in team coordination and solution implementation supervision
• Good command of Agile development methodology, Scrum in particular
• Solid skills in business analysis, network/stack architecture, troubleshooting, and support
• Experience in a variety of business domains
• Fluent English (B2 or higher)

Nice to have

• Programming languages: Java/Scala, Python, SQL, Bash
• Big Data stack: Hadoop, YARN, HDFS, MapReduce, Hive, Spark, Kafka, Flume, Sqoop, ZooKeeper
• NoSQL: Cassandra and HBase & superstructures (Phoenix/Tephra, TitanDB/JanusGraph/TinkerPop/Gremlin, OpenTSDB, Kylin); CosmosDB; MongoDB
• Queues and stream processing: Kafka Streams; Flink; Spark Streaming; Storm; Beam; Event Hub; IoT Hub; MQTT; Storage Queues; Service Bus; Stream Analytics
• Background in traditional data warehouse and business intelligence stacks (ETL, MPP databases, Tableau, Microsoft Power BI, SAP BusinessObjects)
• Data visualization: Power BI, Tableau, QlikView
• Operation: cluster operation, cluster planning
• Flow management: Apache Oozie, Informatica Big Data, Talend, Airflow, StreamSets, NiFi
• Search: Solr, Elasticsearch/ELK
• In-memory: Ignite, Redis, Druid
• Solid cloud experience with two or more leading cloud providers (AWS/Azure/GCP) and their reference architectures: Storage; Compute; Networking; Identity and Security; NoSQL; RDBMS and Cubes; Big Data Processing; Queues and Stream Processing; Serverless; Data Analysis and Visualization; ML
• Big Data architecture design patterns: Lambda, Kappa, Data Lake, Enterprise Data Warehouse, Event Grids
• Architecture concepts: application design, integration design, layered architecture, synthesis of solutions, architecture bug evaluations

We offer

• Competitive compensation depending on experience and skills
• Work on enterprise-level projects on a long-term basis
• Full-time remote work
• Unlimited access to learning resources (EPAM training courses, English classes, internal library)
• A community of 38,000+ of the industry’s top professionals
Big Data

Hours per week: 40
Project length: 12+ months
Locations eligible for the position: Belarus, Georgia, Kazakhstan, Russia, Ukraine