Senior Big Data Engineer
We are currently looking for a remote Senior Big Data Engineer with 3+ years of experience in enterprise software development to join our team.
responsibilities
- Design and implement innovative analytical solutions using Hadoop, NoSQL, and other Big Data technologies
- Work with product and engineering teams to understand requirements, evaluate new features and architecture to help drive decisions
- Perform detailed analysis of business problems and technical environments
- Participate in code reviews and test solutions to ensure they meet best-practice specifications
- Build and foster a high-performance engineering culture, mentor team members, and provide the team with the tools and motivation to succeed
- Write project documentation
requirements
- Over 3 years of experience in enterprise software development
- Solid background in Big Data and distributed computing (3+ years)
- Experienced, highly self-motivated professional with outstanding analytical and problem-solving skills
- Able to work closely with customers and other stakeholders
- Solid experience developing highly available, highly scalable Big Data applications and systems
- Able to take on a mentoring role on a project and ensure that solutions meet business requirements and expectations
- Experienced in working with modern Agile development methodologies and tools
- Advanced experience in software development with Big Data technologies (e.g. administration, configuration management, monitoring, debugging and performance tuning)
- Engineering experience and practice in Data Management, Data Storage, Data Visualization, Disaster Recovery, Integration, Operation, Security
- Experience building data ingestion pipelines and Data Warehouse or database architectures
- Experience with data modeling; hands-on development experience with modern Big Data components
- Cloud: experience designing, deploying, and administering scalable, highly available, and fault-tolerant systems
- Good understanding of CI/CD principles and best practices
- Analytical approach to problem solving; excellent interpersonal, mentoring, and communication skills
- Data-oriented mindset and awareness of compliance standards such as PII, GDPR, and HIPAA
- Motivated, independent, efficient, and able to work under pressure with a solid sense of priorities
- Ability to work in a fast-paced, startup-like Agile development environment
- Experience with high-load and IoT Data Platform architectures and infrastructures
- Experience with Containers and Resource Management systems: Docker, Kubernetes, Yarn
- Experience in direct customer communications
- Solid skills in infrastructure troubleshooting and support; practical experience with performance tuning, optimization, and bottleneck analysis
- Experienced in different business domains
- English proficiency
- Advanced understanding of distributed computing principles
technologies
- Programming Languages: Java, Scala, Python, SQL, Bash
- Big Data stack: Hadoop, Yarn, HDFS, MapReduce, Hive, Spark, Kafka, Flume, Sqoop, ZooKeeper
- NoSQL: Cassandra/HBase, MongoDB
- Queues and Stream processing: Kafka Streams, Spark Streaming, Event Hub, IoT Hub, Storage Queues, Service Bus, Stream Analytics
- Data Visualization: Tableau, QlikView
- ETL & Streaming Pipelines: Pentaho, Talend, Apache Oozie, Airflow, NiFi, StreamSets
- Operation: Cluster operation, Cluster planning
- Search: Solr, Elasticsearch/ELK
- InMemory: Ignite, Redis
- Cloud (AWS/Azure/GCP): Storage, Compute, Networking, Identity and Security, NoSQL, RDBMS and Cubes, Big Data Processing, Queues and Stream Processing, Serverless, Data Analysis and Visualization, ML as a service (SageMaker, TensorFlow)
- Enterprise Design Patterns (ORM, Inversion of Control etc.)
- Development Methods (TDD, BDD, DDD)
- Version Control Systems (Git, SVN)
- Testing: Component/Integration testing, Unit testing (JUnit)
- Deep understanding of SQL: queries, joins, stored procedures, relational schemas, and SQL optimization
- Experience in various messaging systems, such as Kafka, RabbitMQ
- APIs and protocols: REST, Thrift, gRPC, SOAP
- Build Systems: Maven, SBT, Ant, Gradle
- Docker, Kubernetes, Yarn
benefits for locations
Armenia
For you
- Medical insurance package for you and your family
- Stable income
- Paid sick leave days
For your comfortable work
- 100% remote work forever
- Free licensed software
- Possibility to work on your own device (BYOD)
- Stable workload
- Relocation opportunities
- Flexible engagement models
For your growth
- Free training for technical, soft, and leadership skills
- Access to LinkedIn Learning platform
- Language courses
- Access to internal and external e-Libraries
- Certification opportunities
- Skill advisory service