Freelance Big Data Solution Architect job | EPAM Anywhere



Big Data Solution Architect

40 hrs/week, 12+ months

We are currently looking for a remote Big Data Solution Architect with more than 7 years of enterprise IT experience, a background across different platforms, and a strong focus on back-end, high-load, real-time, Big Data, IoT, Cloud, and Analytics solutions to join our team.

Please note that even though you are applying for this position, you may be offered other projects to join within EPAM Anywhere.

We accept CVs only in English. 

Responsibilities

  • Support existing and potential customers with security requirements capture, solution architecture, system design, and technical project management related to Big Data solutions
  • Lead and supervise the design and development of Big Data platforms
  • Establish project requirements
  • Create and present solution architecture documents with deep technical detail to customer and implementation teams
  • Participate in technical meetings with customer representatives
  • Analyze, design, implement, deploy, troubleshoot, and rebuild distributed Linux-based platforms and Big Data solutions
  • Maintain a strong understanding of trends and best practices in technical solutions and architecture design

Requirements

  • Solution Architect with more than 7 years of enterprise IT experience, a background across different platforms, and a strong focus on back-end, high-load, real-time, Big Data, IoT, Cloud, and Analytics solutions
  • Core professional expertise in Platform Architecture, Data Pipeline Architecture, Infrastructure Deployment and Management, and Security
  • Able to support existing and potential customers with security requirements capture, solution architecture, system design, and technical project management related to Big Data solutions, along with Cloud infrastructure design and development
  • Capable of ramping up Big Data teams and leading and supervising Big Data platform design and development from requirements gathering through production support
  • Driven by a thirst for knowledge and committed to remaining an expert in the enterprise data technology field
  • Strong hands-on experience as a Data Architect, with a design/development background in Java, Scala, or Python
  • Architecture experience and practice in Data Management, Data Storage, Data Visualization, Disaster Recovery, Integration, Operations, and Security
  • Experience building traditional Cloud Data Warehouses and Data Lakes
  • Knowledge of high-load and IoT data platform architectures and infrastructures
  • Wide experience in the analysis, design, implementation, and deployment, as well as troubleshooting and rebuilding, of distributed Linux-based platforms and Big Data solutions, both on premises and in the Cloud
  • Strong experience with at least one Cloud provider (AWS/Azure/GCP)
  • Solid experience with continuous delivery tools and technologies
  • Broad experience with containers and resource management systems: Docker, Mesos, Kubernetes/OpenShift, YARN
  • Able to deliver Data Analytics projects and architecture guidelines
  • Strong in the research, comparison, and selection of tools, technologies, and approaches
  • Practical experience in performance tuning, optimization, and bottleneck analysis
  • Strong communication skills, with experience in team coordination and solution implementation supervision
  • Good command of Agile development methodology, Scrum in particular
  • Solid skills in business analysis, network/stack architecture, troubleshooting, and support
  • Fluent English

Technologies

  • Programming languages: Java/Scala/Python, SQL, Bash
  • Big Data fundamentals
  • Big Data stack: Hadoop, YARN, HDFS, MapReduce, Hive, Spark, Kafka, Flume, Sqoop, ZooKeeper
  • NoSQL: Cassandra, HBase with superstructures (Phoenix/Tephra, Kylin), MongoDB
  • Stream processing: Kafka Streams/Spark Streaming
  • Background in traditional data warehouse and business intelligence stacks (ETL, MPP databases, Tableau, Microsoft Power BI, SAP BusinessObjects)
  • Data visualization: Power BI, Tableau, QlikView
  • Operations: cluster operation, cluster planning
  • Flow management: Apache Oozie, Informatica Big Data, Talend, Airflow
  • Search: Solr, Elasticsearch/ELK
  • In-memory: Ignite, Redis
  • Cloud (at least one provider: AWS/Azure/GCP): Storage, Compute, Networking, Identity and Security, NoSQL, RDBMS and Cubes, Big Data Processing, Queues and Stream Processing, Serverless
  • Architecture concepts: application design, integration design, layered architecture, synthesis of solutions, architecture bug evaluation

We offer

  • Competitive compensation depending on experience and skills
  • Work on enterprise-level projects on a long-term basis
  • Full-time remote work
  • Unlimited access to learning resources (EPAM training courses, English classes, internal library)
  • Community of 38,000+ top industry professionals
Big Data

Hours per week: 40

Project length: 12+ months

Locations eligible for the position: Belarus, Russia, Ukraine