lead db engineer

skills
Data Software Engineering, Amazon Web Services, Apache Airflow, Apache Spark, CI/CD, Python, SQL

We are seeking a highly skilled remote Lead DB Engineer with Python knowledge and experience building CI/CD pipelines.

As a Lead DB Engineer, you will build and maintain the data platforms and pipelines that enable us to extract insights from our data sets.

The successful candidate will have a deep understanding of data engineering and software development and will be an expert in SQL, Spark, and Python.

responsibilities
• Partner with Product Managers, analytics, and business teams to gather and review data/reporting/analytics requirements and build trusted, scalable data models, data extraction processes, and data applications
• Design, build, test, and maintain scalable data pipelines and microservices that source first-party and third-party datasets and store data using distributed (cloud) structures and other applicable storage forms, such as graph, relational, and NoSQL databases
• Build high-volume, distributed, and scalable data platform capabilities and microservices that enable data access by applications via API
• Develop analytical data models and dashboards to serve insights and analytics needs
• Continuously optimize data models to provide data with quality and trust
• Utilize and advance continuous integration and deployment frameworks
• Work with architecture and engineering leads to ensure quality solutions are implemented and engineering best practices are adhered to
• Research, evaluate, and utilize new technologies, tools, and frameworks centered around high-volume data processing

requirements
• 5+ years of database engineering and software development experience
• 1+ years of relevant leadership experience
• Proficiency in database schema design and in analytical and operational data modeling
• Proven experience working with large datasets and big data ecosystems: computing (Spark, Kafka, Hive, or similar), orchestration tools (Airflow, Oozie, Luigi), and storage (S3, Hadoop, DBFS)
• Complete understanding of microservices principles to design and build RESTful APIs that allow other systems to integrate with and consume data from data repositories (Data Lake, SSOT/MDM)
• Experience with modern databases (Redshift, DynamoDB, MongoDB, Postgres, or similar)
• Proficiency in one or more programming languages such as Java, Python, or Scala, plus rock-solid SQL skills
• A champion of automated builds and deployments using CI/CD tools such as Bitbucket, Git, and Terraform
• Experience developing, publishing, and maintaining sophisticated reports/dashboards/visualizations using tools such as Tableau, Redash, or D3.js
• Strong hands-on experience designing and developing curated datasets for data science and machine learning
• Prior experience working with Analytics and Data Science teams
• Proven analytical, communication, and organizational skills and the ability to prioritize multiple tasks
• B2+ English level

benefits for locations

For you
• Insurance Coverage
• Paid Leaves, including maternity, bereavement, paternity, and special COVID-19 leaves
• Financial assistance for medical crises
• Retiral Benefits: VPF and NPS
• Customized Mindfulness and Wellness programs
• EPAM Hobby Clubs

For your comfortable work
• Hybrid Work Model
• Soft loans to set up a workspace at home
• Stable workload
• Relocation opportunities with the ‘EPAM without Borders’ program

For your growth
• Certification trainings for technical and soft skills
• Access to the unlimited LinkedIn Learning platform
• Access to internal learning programs set up by world-class trainers
• Community networking and idea-creation platforms
• Mentorship programs
• Self-driven career progression tool
