Senior Big Data Software Engineer for a Software Company
We are currently looking for a remote Senior Big Data Engineer with 7+ years of relevant experience in one of the following areas: Big Data Engineering, Data Warehousing, Business Intelligence, or Business Analytics to join our team.
The customer is a provider of software-as-a-service and cloud-based remote connectivity services for collaboration, IT management, and customer engagement. The company's products give users and administrators access to remote computers.
- Apply your broad knowledge of technology options, technology platforms, design techniques, and approaches across the data warehouse lifecycle phases to design an integrated, quality solution that addresses requirements
- Ensure completeness and compatibility of the technical infrastructure required to support system performance, availability and architecture requirements
- Design and plan the integration of all data warehouse technical components
- Provide input and recommendations on technical issues to the team
- Take responsibility for data design, data extracts, and transformations
- Develop implementation and operation support plans
- Lead architecture design and implementation of next generation BI solution
- Build robust and scalable data integration (ETL) pipelines using AWS services such as EMR, together with Python, Pig, and Spark
- Mentor and develop junior data engineers
- Build and deliver high-quality data architecture to support Business Analysts, Data Scientists, and customer reporting needs
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
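The ETL pipelines in the responsibilities above follow a common extract-transform-load shape. A minimal sketch in plain Python of that pattern, with an assumed record schema and an in-memory target standing in for the real sources and sinks (in this role the same logic would typically run as Spark jobs on EMR):

```python
# Minimal ETL pattern sketch: extract raw records, clean and type them,
# then load the survivors into a target store. All field names and the
# sample data below are illustrative assumptions, not from the posting.

from typing import Iterable, Iterator

# Hypothetical raw source: event records as dicts with string fields.
RAW_EVENTS = [
    {"user_id": "u1", "amount": "19.99", "country": "US"},
    {"user_id": "u2", "amount": "5.00", "country": "DE"},
    {"user_id": "u1", "amount": "bad", "country": "US"},  # malformed row
]


def extract(source: Iterable[dict]) -> Iterator[dict]:
    """Yield raw records (stands in for reads from S3, Kinesis, etc.)."""
    yield from source


def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Type-convert records, dropping rows that fail validation."""
    for rec in records:
        try:
            yield {
                "user_id": rec["user_id"],
                "amount": float(rec["amount"]),
                "country": rec["country"],
            }
        except (KeyError, ValueError):
            continue  # in production: route bad rows to a dead-letter store


def load(records: Iterable[dict], target: list) -> None:
    """Append transformed records to the target (stands in for RDS/Hive writes)."""
    target.extend(records)


warehouse: list = []
load(transform(extract(RAW_EVENTS)), warehouse)
```

Keeping each stage a separate generator-based function makes the stages independently testable and lets validation failures be dropped (or rerouted) without stopping the whole pipeline.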
- Bachelor's degree in Computer Science required; Master's degree preferred
- 7+ years of relevant experience in one of the following areas: Big Data Engineering, Data Warehousing, Business Intelligence, or Business Analytics
- 7+ years of hands-on experience in writing complex, highly optimized SQL queries across large data sets
- Demonstrated strength in data modeling, ETL development, and Data Warehousing
- Experience with AWS services including S3, EMR, Kinesis, and RDS
- Experience with the big data stack of technologies, including Hadoop, HDFS, Hive, Spark, Pig, and Presto
- Experience with delivering end-to-end projects independently
- Experience using Airflow, including creating and maintaining DAGs, Operators, and Hooks
- Knowledge of distributed systems as they pertain to data storage and computing
- Exceptional problem-solving and analytical skills
- Knowledge of software engineering best practices across the development lifecycle, including Agile methodologies, coding standards, code reviews, source control management, build processes, testing, and operations
- Proficient English (written and spoken), B2 level
Looking for something else?
Find a vacancy that works for you. Send us your CV to receive a personalized offer.