We are actively seeking an experienced Middle Java Engineer to join our remote team for an intriguing project in the Big Data realm.
As a Java Engineer, your role will involve the development of APIs and their integration with Big Data technologies, such as Apache Hadoop, Hive, and Spark. If you are seeking to employ your Java skills in an unconventional and challenging project, this is a prime opportunity for you.
Design and develop RESTful APIs using Java and AWS services
Integrate APIs with Big Data technologies such as Apache Hadoop, Hive, and Spark
Follow best practices and coding standards while writing efficient and maintainable code
Develop unit tests and integration tests to ensure the high quality of the Java applications
Collaborate with cross-functional teams to interpret business requirements and transform them into technical solutions
Actively participate in code reviews and provide constructive feedback to enhance the quality of the code
A minimum of 2 years of practical experience in Java development
Extensive experience in the development and deployment of RESTful APIs using Java
Familiarity with AWS and various services like EC2, S3, and Lambda
Good understanding of Big Data technologies such as Apache Hadoop, Hive, and Spark
Strong knowledge of database technologies such as MySQL, Oracle, and MongoDB
Experience in creating unit tests and integration tests for Java applications
Proficiency in using Git for version control
Knowledge of Spring and microservices
Excellent English verbal and written communication skills, at a B2+ level
We are looking for a Senior Java Engineer well-versed in Big Data to join our remote team. The project involves the development of APIs and services within the Big Data arena.
As a Senior Java Engineer, you will be essential in designing and implementing scalable, high-performance applications that seamlessly integrate with Amazon Web Services (AWS). Your work will involve Apache Hadoop, Apache Hive, and Apache Spark to construct and maintain complex data pipelines tailored to our client's needs.
Design and implement scalable, high-performance applications leveraging Java and related technologies
Develop and oversee complex data pipelines utilizing Hadoop, Hive, and Spark
Collaborate with cross-functional teams to architect and implement RESTful APIs and service-oriented architectures
Leverage AWS services like EC2, S3, and Lambda to develop and roll out cloud-based applications
Write automated tests and conduct code reviews to ensure high-quality code
Participate in Agile development methodologies, such as Scrum and Kanban, to achieve on-time delivery of high-quality software
At least 3 years of professional experience as a Java Software Engineer
Strong understanding of Amazon Web Services, including EC2, S3, and Lambda
Familiarity with Big Data technologies, namely Hadoop, Hive, and Spark
Experience with RESTful API development and service-oriented architectures
Proficiency in Spring and microservices
Experience with Agile development methodologies, including Scrum and Kanban
Excellent communication and collaboration skills
English proficiency at a minimum of a B2+ level
We are seeking a skilled Middle Java Developer with a knack for Big Data to be part of our remote team.
In this role, you will work on an exciting project that merges conventional Java engineering with APIs, AWS, and more, all within the Big Data domain. You’ll be utilising the latest technology stack, including Apache Hadoop, Apache Hive, and Apache Spark, to create high-performance systems that process and analyse vast quantities of data.
Design, develop, and maintain high-performance Java applications that process and analyse massive volumes of data
Collaborate with cross-functional teams to formulate and implement software solutions catering to the needs of our clients
Lead complex and loosely defined projects, offering mentorship and direction to junior engineers
Regularly liaise with tech leadership within the organisation to ensure alignment on project goals and timelines
Continuously improve the quality of our software by applying best practices for testing, code reviews, and documentation
A minimum of 2 years of experience in Java software development
Hands-on experience with Amazon Web Services (AWS) for cloud computing
Solid understanding of Big Data technologies, including Apache Hadoop, Apache Hive, and Apache Spark
Experience with data processing and analysis using Java
Strong knowledge of software development best practices, including Agile methodologies and code reviews
Strong verbal and written English communication skills at a B2 level
We are on the lookout for a Senior Java Developer to join our dynamic remote team.
This role entails spearheading complex and demanding projects involving Big Data tools and technologies such as Apache Hadoop, Apache Hive, and Apache Spark. As part of a team of highly skilled professionals, you will design, develop, and launch high-performance, scalable, and reliable software systems. The ideal candidate is a high-performing and inspirational individual contributor who exemplifies leadership and continuously mentors junior engineers. Strong communication and interpersonal skills are essential, as is comfort in regularly interfacing with tech leadership within the local organization.
Lead complex and demanding projects involving Big Data tools and technologies such as Apache Hadoop, Apache Hive, and Apache Spark
Design, develop, and deploy high-performance, scalable, and reliable software systems using Java
Offer mentorship to junior engineers and exemplify leadership
Regularly interface with tech leadership within the local organization
Collaborate with cross-functional teams to design and implement RESTful APIs using Java
Write clean, maintainable, and scalable code
Ensure the quality, performance, and scalability of the software systems
Minimum of 3 years of experience in Java software development
Strong experience in developing and deploying applications on Amazon Web Services (AWS)
Solid understanding of Big Data technologies such as Apache Hadoop, Apache Hive, and Apache Spark
Experience in designing and implementing RESTful APIs using Java
Ability to write clean, maintainable, and scalable code
Excellent verbal and written English communication skills, at a minimum of a B2 level