
Data Engineer - Crypto Market Data Infrastructure

Gunvor Group
London
Job Description:

Build and operate a robust, low-latency, fault-tolerant market data platform that powers trading and analytics. Within our broader data engineering unit, you will champion standardization and reuse, delivering clean, consistent crypto market data and reusable APIs and blueprints that accelerate teams across the organization.

Design, build, and maintain a real-time market data pipeline. Aggregate order books, trades, and funding data from multiple exchanges into a single standardized feed, and harden it with redundancy, rigorous error handling, and validation to ensure reliability. Apply TDD and automation across ingestion, transformation, and storage.

Develop monitoring and alerts for data quality, latency, and system health.
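The monitoring-and-alerts duty above can be sketched as a simple per-tick health check. This is an illustrative example only, not part of the role specification: the field names (`ts_ms`, `price`, `bid`, `ask`) and the latency threshold are assumptions.

```python
# Illustrative health check: flag stale data (latency) and basic quality
# violations on a single market-data tick. Thresholds and field names
# are assumed for the example, not prescribed by the role.
def check_tick(tick: dict, now_ms: int, max_lag_ms: int = 500) -> list[str]:
    """Return a list of alert strings for one tick; empty means healthy."""
    alerts = []
    lag = now_ms - tick["ts_ms"]
    if lag > max_lag_ms:
        alerts.append(f"latency: tick is {lag} ms old")
    if tick["price"] <= 0:
        alerts.append(f"quality: non-positive price {tick['price']}")
    if tick["bid"] > tick["ask"]:
        alerts.append("quality: crossed book (bid > ask)")
    return alerts
```

In practice such checks would feed a metrics/alerting stack rather than return strings, but the shape of the validation is the same.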

Define reusable APIs, data contracts, and platform blueprints. Collaborate closely with fellow data engineers, developers, quants, and traders, sharing best practices, contributing to unit-wide standards, and ensuring seamless integration into execution and analytics systems.

Continuously document and improve the stack.

Main Responsibilities

Data Engineering & Market Data

  • Design, build, and maintain a robust, low-latency, fault-tolerant market data pipeline.
  • Aggregate order books, trades, and funding data from multiple crypto exchanges into a single standardized feed.
  • Implement redundancy, error handling, and data validation mechanisms to ensure high reliability of live data.
  • Develop monitoring tools and alerts for data quality, latency, and system health.
  • Work closely with developers, quants, and traders to ensure seamless integration of data into execution and analytics systems.
  • Document and continuously improve data ingestion, transformation, and storage processes.
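The aggregation responsibility above amounts to mapping exchange-specific payloads onto one standard schema and merging them into a single time-ordered feed. A minimal sketch, with two hypothetical exchange payload shapes (the field names `px`/`qty` and `price`/`size` are invented for illustration):

```python
# Illustrative sketch: normalize trade messages from two hypothetical
# exchange payload shapes into one standardized feed record.
from dataclasses import dataclass

@dataclass(frozen=True)
class Trade:
    exchange: str
    symbol: str
    price: float
    size: float
    ts_ms: int  # event timestamp, epoch milliseconds

def normalize(exchange: str, raw: dict) -> Trade:
    """Map an exchange-specific payload onto the standard Trade schema."""
    if exchange == "exA":
        return Trade("exA", raw["sym"], float(raw["px"]), float(raw["qty"]), raw["ts"])
    if exchange == "exB":
        return Trade("exB", raw["symbol"], float(raw["price"]), float(raw["size"]), raw["time_ms"])
    raise ValueError(f"unknown exchange: {exchange}")

def merge_feeds(batches: list[tuple[str, list[dict]]]) -> list[Trade]:
    """Aggregate per-exchange batches into one feed, ordered by event time."""
    out = [normalize(ex, msg) for ex, msgs in batches for msg in msgs]
    return sorted(out, key=lambda t: t.ts_ms)
```

A production pipeline would do this incrementally over a stream rather than batch-sorting, but the normalization step is the core of the "single standardized feed" requirement.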


Strategic Collaboration & Business Alignment

  • Partner with trading desks, quantitative teams, and risk functions to translate business needs into data solutions that enhance decision-making and operational efficiency.
  • Act as a senior liaison between engineering and business stakeholders, ensuring alignment on data priorities and delivery timelines.
  • Prioritize a value-based backlog (e.g., faster close/settlement, improved forecast accuracy, reduced balancing penalties) and measure business impact.
  • Align data models and domain ownership with business processes (bids/offers, nominations, positions, exposures, outages).
  • Liaise with Cybersecurity, Compliance, and Legal on sector-specific controls (e.g., REMIT/NERC-CIP considerations, data retention, segregation).


Innovation & Product Development

  • Incubate and industrialize data products: curated marts, feature stores, real-time decision APIs, and event streams for forecasting and optimization.
  • Introduce modern patterns (CDC, schema evolution, Delta/Iceberg, stream–batch unification) to improve freshness and resilience.
  • Evaluate and integrate external data (weather, fundamentals, congestion, capacity postings) and internal and external vendor systems (ETRM) safely and at scale.
  • Collaborate with quantitative analysts to productionize ML pipelines (forecasting load/renewables, anomaly detection, etc.) with monitoring and rollback.


Mentorship & Technical Oversight

  • Coach engineers through design reviews, pair programming, and clear contribution guidelines; raise the bar on code quality and documentation.
  • Lead incident reviews and architectural forums; provide pragmatic guidance on trade-offs (latency vs. cost, simplicity vs. flexibility).
  • Develop growth paths and learning plans focused on energy domain fluency and modern data engineering practices.


Operational Excellence

  • Implement robust monitoring/alerting and maintain operational runbooks.
  • Ensure security and compliance by design: least-privilege access, secrets management, encryption, auditability, and disaster recovery testing.


Profile

  • Bachelor’s degree or higher in Computer Science, Engineering, or related field.
  • 3+ years’ relevant experience in data engineering, trading systems, or financial technology.
  • Proven experience building and operating tick-level data pipelines for financial or crypto markets.
  • Prior experience in low-latency or high-availability systems preferred.


Skills

  • Azure: ADLS Gen2, Event Hubs, Synapse Analytics, Azure Databricks (Spark), Azure Functions, Azure Data Factory/Databricks Workflows, Key Vault, Azure Monitoring/Log Analytics; IaC with Terraform/Bicep; CI/CD with Azure DevOps or GitHub Actions.
  • Snowflake (on Azure or multi-cloud): Warehousing design, Streams & Tasks, Snowpipe/Snowpipe Streaming, Time Travel & Fail-safe, RBAC & row/column security, external tables over ADLS, performance tuning & cost governance.
  • Kafka / Streaming: Confluent Platform/Cloud, Kafka Streams/Spring Kafka, ksqlDB, Schema Registry (Avro/Protobuf), Kafka Connect (Debezium CDC), MirrorMaker 2; patterns for exactly-once/at-least-once, backpressure, and idempotency.
  • Programming & Engineering Practices: Strong OOP in Python and/or Java/Scala; SDLC, DevOps mindset, TDD/BDD, code reviews, automated testing (unit/integration/contract), packaging and dependency management, API design (REST/gRPC).
  • Orchestration & Quality: Airflow/ADF/Databricks Jobs, data contracts, Great Expectations (or similar), lineage/catalog (e.g., Purview), metrics/observability (Prometheus/Grafana/Application Insights).
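The idempotency pattern mentioned in the Kafka bullet can be sketched without any broker: under at-least-once delivery a consumer may see the same message twice, so processing is keyed on (partition, offset) and duplicates become no-ops. This is a hedged, client-free stand-in for a real Kafka consumer loop, not an actual Kafka API.

```python
# Illustrative idempotent-consumer pattern for at-least-once delivery:
# duplicate deliveries of the same (partition, offset) are skipped, so the
# side effect is applied exactly once. No real Kafka client is used.
class IdempotentProcessor:
    def __init__(self) -> None:
        self.seen: set[tuple[int, int]] = set()  # (partition, offset) already applied
        self.total = 0.0  # example side effect: running traded notional

    def process(self, partition: int, offset: int, notional: float) -> bool:
        """Apply the message once; return False if it was a duplicate."""
        key = (partition, offset)
        if key in self.seen:
            return False  # redelivery: skip, keeping the effect exactly-once
        self.seen.add(key)
        self.total += notional
        return True
```

In a real deployment the dedupe state would live in a transactional store alongside the output (or use Kafka's transactional/exactly-once features) so it survives restarts; the in-memory set is only for illustration.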


Additional Skills

  • Highly numerate, rigorous, and resilient in problem-solving.
  • Ability to prioritize, multitask, and deliver under time constraints.
  • Strong written and verbal communication in English.
  • Self-motivated, proactive, and detail-oriented.
  • Comfortable working under pressure in a fast-paced environment.
  • Excellent communication skills, ability to explain technical topics clearly.
  • Team player with the ability to collaborate across engineering, quant, and trading teams.


If you think the open position you see is right for you, we encourage you to apply!

Our people make all the difference in our success.
