
Lead Big Data Engineer

skills
Data Software Engineering, Apache Spark, Databricks, Python, Scala

We are seeking a highly skilled remote Lead Big Data Engineer to join our team, working on a cutting-edge project that leverages the latest technologies to deliver innovative solutions in the field of data engineering.

In this role, you will lead a team of engineers in the development, implementation, and maintenance of large-scale data processing systems, ensuring the highest levels of performance, reliability, and scalability. If you are a creative problem solver with a passion for data engineering and a proven track record of delivering complex projects, we invite you to apply for this exciting opportunity.

responsibilities
  • Lead a team of engineers in the design, development, and maintenance of large-scale data processing systems
  • Collaborate with stakeholders to identify business requirements and translate them into technical specifications
  • Develop and implement data processing pipelines using technologies such as Apache Spark and Databricks
  • Optimize data processing performance and scalability, ensuring high levels of reliability and availability
  • Provide technical guidance and mentorship to team members, fostering a culture of continuous learning and development
  • Conduct code reviews and ensure adherence to coding standards and best practices
  • Stay up-to-date with emerging trends and technologies in data engineering, and make recommendations for their adoption
requirements
  • A minimum of 5 years of experience in Data Software Engineering, with expertise in designing and building large-scale data processing systems
  • A minimum of 1 year of experience in leading a team of engineers, providing technical guidance and mentorship, and driving successful project outcomes
  • Expertise in Apache Spark and Databricks, showcasing your proficiency in distributed computing and data processing
  • In-depth knowledge of programming languages such as Python and Scala, enabling you to write efficient and effective code
  • Strong understanding of data modeling and database design principles, including experience with SQL and NoSQL databases
  • Experience with cloud-based data processing platforms such as AWS, Azure, or Google Cloud Platform
  • Excellent communication skills and the ability to work collaboratively with cross-functional teams
  • Spoken and written English at an Upper-Intermediate level or higher
  • Spanish language proficiency is a must-have
nice to have
  • Completed Databricks Data Engineering Professional certification
  • Experience with machine learning and data analytics tools and frameworks such as TensorFlow, PyTorch, or Apache Flink
  • Experience with data visualization tools such as Tableau, Power BI, or QlikView

benefits for locations

Chile
For you
  • Paid time off
  • Paid sick leave days
  • Medical insurance
  • Stable income
  • Lunch allowance
  • Bonuses for holiday celebrations
For your comfortable work
  • Remote-forever work
  • Free licensed software
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service