Senior Big Data Engineer (Java Focus) at Global Relay
Global Relay has set the standard in enterprise information archiving for over 20 years, delivering cloud archiving, surveillance, eDiscovery, and analytics solutions that give highly regulated firms greater visibility and control over their data while ensuring compliance with stringent regulations.
Your Role
Joining the Reporting product line, you will work as a member of a highly focused team that specializes in Java-based data engineering, designing and delivering large-scale ELT/ETL workflows on a data lakehouse platform. Using modern big data technologies, you will move, transform, and optimize data for high-performance analytics and regulatory reporting. The environment encourages autonomy, problem solving, and system-level thinking.
Tech Stack
- Microservices container platforms: Kubernetes, CRC, Docker
- Big data technologies: Apache Spark, Flink, Hadoop, Airflow, Trino, Iceberg
- Dependency injection frameworks: Spring
- Observability: Loki/Grafana
- Large‑scale data processing: Kafka
- CI/CD and build tools: Maven, Git, Jenkins, Ansible
- NoSQL databases and coordination services: Cassandra, ZooKeeper, HBase
Your Responsibilities
- Develop ETL, ELT and streaming processes using big data frameworks primarily in Java
- Design, implement and provide architectural guidance in deploying microservices as part of an agile development team
- Write unit and integration tests for your Java code
- Collaborate with testers in development of functional test cases
- Develop deployment pipelines for Java‑based systems
- Collaborate with product owners on user story generation and refinement
- Monitor and support the operation of production systems
- Participate in knowledge sharing activities with colleagues
- Engage in pair programming and peer reviews
About you
Required Experience
- Minimum 5 years of Java development experience in an Agile environment, building scalable applications and services with a focus on big data solutions and analytics
- At least 3 years of experience in developing ETL/ELT processes using relevant technologies and tools
- Experience working with data lakes and data warehouse platforms for batch and streaming data sources
- ANSI SQL or other SQL dialect experience
- Experience processing unstructured, semi‑structured and structured data
- Good understanding of ETL/ELT principles, best practices and patterns
- Experience with big data technologies such as Hadoop, Spark and Flink, and web services technologies
- Experience with Test‑Driven Development and CI/CD pipelines
Attributes
- Good communication skills
- Problem solving
- Self‑starter
- Team player
What you can expect
At Global Relay, there is no ceiling to what you can achieve. You will receive the mentoring, coaching, and support needed to reach your career goals. The culture promotes creativity and rewards perseverance and hard work, and you will work alongside talented individuals from diverse backgrounds.
Global Relay is an equal‑opportunity employer committed to diversity, equity, and inclusion. We seek to ensure reasonable adjustments, accommodations, and personal time are tailored to meet the unique needs of every individual.