Middle Big Data Software Engineer for a Software Company
We are currently looking for a remote Middle Big Data Software Engineer with 5+ years of relevant experience in one of the following areas: Big Data Engineering, Data Warehouse, Business Intelligence or Business Analytics to join our team.
The customer is a provider of software as a service and cloud-based remote connectivity services for collaboration, IT management and customer engagement. The company's products give users and administrators access to remote computers.
- Apply your broad knowledge of technology options, technology platforms, design techniques and approaches across the data warehouse lifecycle phases to design an integrated, quality solution that addresses requirements
- Ensure completeness and compatibility of the technical infrastructure required to support system performance, availability and architecture requirements
- Design and plan the integration of all data warehouse technical components
- Provide input and recommendations on technical issues to the team
- Take responsibility for data design, data extracts and transformations
- Develop implementation and operation support plans
- Lead architecture design and implementation of next generation BI solution
- Build robust and scalable data integration (ETL) pipelines using AWS services, EMR, Python, Pig and Spark
- Mentor and help develop other junior data engineers
- Build and deliver high quality data architecture to support business analysts, data scientists and customer reporting needs
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
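As a standalone illustration of the extract-transform-load pattern referenced in the responsibilities above, here is a minimal sketch using only the Python standard library. The CSV data, field names and validation rule are hypothetical; in practice a pipeline like this would run on Spark or an EMR cluster rather than in-memory lists.

```python
import csv
import io

# Hypothetical raw event data; a real pipeline would read from S3, Kinesis, etc.
RAW_EVENTS = """user_id,duration_ms
1,1200
2,-50
1,300
"""

def extract(raw: str) -> list[dict]:
    """Parse raw CSV into records (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> dict[str, int]:
    """Drop invalid rows and aggregate duration per user (the 'transform' step)."""
    totals: dict[str, int] = {}
    for rec in records:
        duration = int(rec["duration_ms"])
        if duration < 0:  # discard corrupt measurements
            continue
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0) + duration
    return totals

def load(totals: dict[str, int]) -> list[tuple[str, int]]:
    """Emit sorted rows ready for the warehouse (the 'load' step)."""
    return sorted(totals.items())

rows = load(transform(extract(RAW_EVENTS)))
print(rows)  # [('1', 1500)]
```

The same three-stage shape carries over directly to a Spark job, where `extract` becomes a DataFrame read, `transform` a chain of filters and aggregations, and `load` a write to the warehouse.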
- 5+ years of relevant experience in one of the following areas: Big Data Engineering, Data Warehouse, Business Intelligence or Business Analytics
- 5+ years of hands-on experience in writing complex, highly optimized SQL queries across large data sets
- Bachelor's degree in Computer Science; Master's degree preferred
- Demonstrated strength in data modelling, ETL development and Data Warehousing
- Experience with AWS services including S3, EMR, Kinesis and RDS
- Experience with big data stack of technologies, such as Hadoop, HDFS, Hive, Spark, Pig, Presto
- Experience delivering end-to-end projects independently
- Experience using Airflow, creating and maintaining DAGs, Operators and Hooks
- Knowledge of distributed systems as they pertain to data storage and computing
- Exceptional problem solving and analytical skills
- Knowledge of software engineering best practices across the development lifecycle, including Agile methodologies, coding standards, code reviews, source management, build processes, testing and operations
- English level: B1
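Airflow, mentioned in the requirements above, models a pipeline as a DAG of tasks that run only after all of their upstream dependencies finish. The following is a standalone illustration of that ordering idea, not the Airflow API: the task names and edges are hypothetical, and the scheduling is done with Kahn's topological-sort algorithm.

```python
from collections import deque

# Hypothetical task graph: each task maps to the set of tasks it depends on.
DEPS = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"transform", "quality_check"},
}

def topo_order(deps: dict[str, set[str]]) -> list[str]:
    """Kahn's algorithm: return a valid execution order for the task graph."""
    remaining = {t: set(d) for t, d in deps.items()}
    ready = deque(t for t, d in remaining.items() if not d)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        # Unblock any task whose last remaining dependency just finished.
        for t, d in remaining.items():
            if task in d:
                d.discard(task)
                if not d:
                    ready.append(t)
    if len(order) != len(deps):
        raise ValueError("cycle detected in task graph")
    return order

print(topo_order(DEPS))  # ['extract', 'transform', 'quality_check', 'load']
```

In Airflow itself, the same dependencies are declared on Operator instances (e.g. with `>>`) and the scheduler handles the ordering; this sketch only shows the underlying graph discipline.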
Looking for something else?
Find a vacancy that works for you. Send us your CV to receive a personalized offer.