Freelance Big Data Architect | EPAM Anywhere



Big Data Architect

40 hrs/week, 12+ months

We are currently looking for a remote Big Data Architect with 7+ years of experience as a Data Architect and a design/development background in Java/Scala or Python to join our team.

Please note that even though you are applying for this position, you may be offered other projects to join within EPAM Anywhere.

We accept CVs only in English.

Responsibilities

• Work closely with the business to identify solution requirements and key case studies/scenarios, and architect data solutions for business transformation
• Participate in Data and Big Data initiatives at the company level
• Design data analytics solutions by utilizing the data technology stack with Big Data techniques
• Conduct solution architecture reviews/audits; calculate and present ROI
• Create and present solution architecture documents with deep technical details to customer and implementation teams
• Participate in the full cycle of pre-sale activities to prepare technical proposals for customer requests: direct communication with customers, RFP processing, development of implementation proposals and solution designs, presentation of the proposed solution architecture to the customer, and participation in technical meetings with customer representatives
• Lead implementation of solutions, from establishing project requirements and goals to solution go-live
• Maintain a strong understanding of technical solutions, architecture design trends and best practices; stay on the cutting edge of Data technologies
• Constantly grow expertise by gathering and monitoring available EPAM project experience and ongoing projects with clients from different business domains to drive further EPAM business in the Data field
• Create and follow a personal education plan in the technology stack and solution architecture
• Share experience, knowledge and vision with EPAM colleagues and customer teams; participate in seminars, meetups, mentoring and training programs; prepare white papers and presentations

Requirements

• 7+ years of experience as a Data Architect with a design/development background in Java/Scala or Python
• Architecture experience and practice in Data Management, Data Storage, Data Visualization, Disaster Recovery, Integration, Operation and Security
• Experience building traditional Cloud Data Warehouses and Data Lakes
• Knowledge of high-load and IoT Data Platform architectures and infrastructures
• Broad experience in the analysis, design, implementation and deployment, as well as troubleshooting and rebuilding, of distributed Linux-based platforms and Big Data solutions, both on premises and in the Cloud
• Strong Cloud experience with at least one of the major providers (AWS, Azure, GCP)
• Solid experience with continuous delivery tools and technologies
• Broad experience with containers and resource management systems: Docker, Mesos, Kubernetes/OpenShift, YARN
• Ability to deliver Data Analytics projects and architecture guidelines
• Strong skills in researching, comparing and selecting the tools/technologies/approaches to be used
• Practical experience in performance tuning and optimization, and bottleneck analysis
• Strong communication skills, with experience in team coordination and solution implementation supervision
• Proficiency in Agile development methodology, Scrum in particular
• Solid skills in business analysis, network/stack architecture, troubleshooting and support
• Fluent English: B2 level or higher

Technologies

• Programming languages: Java/Scala, Python, SQL, Bash
• Big Data fundamentals
• Big Data stack: Hadoop, YARN, HDFS, MapReduce, Hive, Spark, Kafka, Flume, Sqoop, ZooKeeper
• NoSQL: Cassandra and HBase & superstructures (Phoenix/Tephra, TitanDB/JanusGraph/TinkerPop/Gremlin, OpenTSDB, Kylin), Cosmos DB, MongoDB
• Stream processing: Kafka Streams/Spark Streaming
• Background in traditional data warehouse and business intelligence stacks (ETL, MPP databases, Tableau, Microsoft Power BI, SAP BusinessObjects)
• Data visualization: Power BI, Tableau, QlikView
• Operation: cluster operation, cluster planning
• Flow management: Apache Oozie, Informatica Big Data, Talend, Airflow
• Search: Solr, Elasticsearch/ELK
• In-memory: Ignite, Redis
• Cloud (at least one provider: AWS/Azure/GCP): Storage, Compute, Networking, Identity and Security, NoSQL, RDBMS and Cubes, Big Data Processing, Queues and Stream Processing, Serverless
• Architecture concepts: application design, integration design, layered architecture, synthesis of solutions, architecture bug evaluation

We offer

• Competitive compensation depending on experience and skills
• Work on enterprise-level projects on a long-term basis
• Full-time remote work
• Unlimited access to learning resources (EPAM training courses, English classes, internal library)
• A community of 38,000+ top industry professionals
Big Data

Hours per week: 40

Project length: 12+ months

Locations eligible for the position: Belarus, Brazil, Georgia, Italy, Kazakhstan, Mexico, Russia, Salvador, USA, Ukraine