
Senior Java Developer

Java, Spring, Apache Beam, Google Cloud Platform, Google Cloud Dataflow

We are looking for a remote Senior Java Developer with expertise in Apache Beam and Google Cloud Platform to join our team. You will design and implement robust data pipelines using Java and Apache Beam in the Google Cloud Dataflow environment to process large volumes of data efficiently, and you will use the Spring Framework to develop microservices that support our pipelines.

responsibilities
  • Design and implement robust data pipelines using Java and Apache Beam in the Google Cloud Dataflow environment to process large volumes of data efficiently
  • Develop microservices with the Spring Framework that support our data pipelines
  • Optimize existing data pipelines for performance and scalability, identify bottlenecks, and implement improvements
  • Utilize Google Cloud Platform services to deploy and manage data pipelines, ensuring high availability and reliability
  • Ensure data integrity and compliance with data governance and security policies throughout the data processing lifecycle
  • Collaborate with data scientists and analysts to understand data requirements and implement solutions that support data modeling, mining, and extraction processes
  • Develop and maintain documentation for data pipeline architectures, design decisions, and operational procedures
  • Monitor pipeline performance and implement logging and alerting mechanisms to detect and address issues proactively
  • Stay updated with the latest data processing technologies and framework advancements, exploring new tools and practices to enhance pipeline efficiency and functionality
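To give candidates a concrete feel for the work, the responsibilities above can be illustrated with a minimal Apache Beam pipeline in Java. This is a generic word-count sketch, not this team's code: the bucket paths and class name are placeholders, and it assumes the Beam Java SDK (plus the Dataflow runner, if targeting Google Cloud Dataflow) is on the classpath.

```java
import java.util.Arrays;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class WordCountSketch {
  public static void main(String[] args) {
    // Passing --runner=DataflowRunner (with --project and --region) targets
    // Google Cloud Dataflow; without it, Beam's local DirectRunner is used.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);

    p.apply("ReadLines", TextIO.read().from("gs://example-bucket/input/*.txt"))
     // Split each line into lowercase words.
     .apply("SplitWords", FlatMapElements
         .into(TypeDescriptors.strings())
         .via(line -> Arrays.asList(line.toLowerCase().split("\\W+"))))
     // Count occurrences of each distinct word in parallel.
     .apply("CountWords", Count.perElement())
     // Format each (word, count) pair as a line of text.
     .apply("Format", MapElements
         .into(TypeDescriptors.strings())
         .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
     .apply("WriteCounts", TextIO.write().to("gs://example-bucket/output/counts"));

    p.run().waitUntilFinish();
  }
}
```

Real pipelines in this role would replace the word-count transforms with domain-specific `DoFn`s and connect to GCP storage and analytics services, but the shape (read, transform, write, run on a chosen runner) stays the same.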
requirements
  • 3+ years of experience in data pipeline development, with a strong background in Java and cloud-based data processing technologies
  • Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field
  • Experience designing, implementing, and optimizing data pipelines for processing large data sets in a cloud environment
  • Strong knowledge of Google Cloud DataFlow (must have) and Apache Beam for building and managing data pipelines. Familiarity with the principles of parallel processing and distributed computing as applied to data processing
  • Experience with GCP services, particularly those related to data storage, processing, and analytics. Knowledge of GCP's infrastructure and security best practices
  • Solid understanding of the Spring Framework, including Spring Boot, for building high-performance applications
  • Proficiency in using version control systems, such as Git, for code management and collaboration
  • Ability to work in a fast-paced, collaborative environment
  • Excellent analytical, problem-solving, and communication skills
  • B2+ English level 

benefits for locations

Colombia
For you
  • Prepaid Medicine with Colsanitas for you and your legal dependents 
  • MetLife Life Insurance for you 
  • Thousands of projects for top brands
  • Stable income
For your comfortable work
  • 100% remote work forever
  • Free licensed software
  • Possibility to work on your own device (BYOD)
  • Stable workload
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Support from a personal Skill Advisor
  • Language courses
  • Free access to internal and external e-Libraries
  • Access to internal communities and competency centers
  • Certification opportunities