Data Engineer - Crypto Algorithm Execution

Gunvor Group
City of London

Join Gunvor Group as a Data Engineer focused on building and operating a production-grade crypto execution stack that trades efficiently across fragmented liquidity. You will deliver robust reference pricing, smart order placement, and high‑availability systems that minimize costs and enable reliable, scalable execution.


Main Responsibilities
Execution Algorithm

  • Design, implement, and maintain an execution algorithm on crypto exchanges (Kraken, Binance, etc.) capable of trading efficiently across fragmented liquidity pools.
  • Develop fair value models that aggregate prices across multiple exchanges, ensuring robust reference pricing for execution (see the sketch after this list).
  • Optimize order placement strategies (e.g., slicing, liquidity‑seeking logic) to minimize slippage and transaction costs.
  • Collaborate with data engineers and quantitative researchers to integrate live market feeds and trading signals into the execution framework.
  • Build monitoring and diagnostic tools for execution performance, latency, and risk exposure.
  • Ensure code is production‑ready, resilient, and can run with high availability under live trading conditions.
  • Document design choices and provide technical support for internal stakeholders.
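
To make the fair value aggregation above concrete, the sketch below blends top-of-book quotes from several venues into a size-weighted reference mid. The venue names, quotes, and weighting scheme are placeholders for illustration, not a description of the production stack.

    # Illustrative only: size-weighted reference mid across venue top-of-book quotes.
    from dataclasses import dataclass

    @dataclass
    class TopOfBook:
        venue: str        # e.g. "kraken", "binance" (placeholder identifiers)
        bid: float
        ask: float
        bid_size: float
        ask_size: float

    def reference_mid(books: list[TopOfBook]) -> float:
        """Blend venue mids, weighting each venue by its displayed top-of-book size."""
        weighted_sum = 0.0
        total_weight = 0.0
        for b in books:
            mid = (b.bid + b.ask) / 2.0
            weight = b.bid_size + b.ask_size   # crude liquidity proxy
            weighted_sum += mid * weight
            total_weight += weight
        if total_weight == 0.0:
            raise ValueError("no displayed liquidity across venues")
        return weighted_sum / total_weight

    # Made-up quotes for the example:
    books = [TopOfBook("kraken", 64990.0, 65010.0, 1.2, 0.8),
             TopOfBook("binance", 64995.0, 65005.0, 3.0, 2.5)]
    print(round(reference_mid(books), 2))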

Strategic Collaboration & Business Alignment

  • Translate trading objectives into clear execution KPIs such as slippage, fill rate, and venue hit ratio (see the sketch after this list).
  • Partner with trading and risk to prioritize venue coverage, product scope, and rollout timelines.
  • Align roadmap with compliance and security requirements for regulated and high‑risk venues.
  • Run quarterly business reviews that connect execution performance to P&L impact and costs.
  • Coordinate with data engineering to ensure feed quality and data contracts meet execution needs.
  • Maintain a venue scorecard that ranks exchanges on liquidity, reliability, and fee structure.
  • Define success criteria for new features and validate outcomes with A/B or canary rollouts.
  • Communicate trade‑offs and recommendations to leadership using clear, data‑backed narratives.
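
As a minimal illustration of the KPIs above, the sketch below computes fill rate, average slippage versus arrival price, and venue hit ratio from a batch of child orders. The field names and sign convention are assumptions made for the example.

    # Illustrative KPI sketch; field names and sign conventions are assumed.
    def execution_kpis(orders: list[dict]) -> dict:
        """orders: [{'qty', 'filled_qty', 'avg_px', 'arrival_px', 'side', 'venue'}, ...]"""
        requested = sum(o["qty"] for o in orders)
        filled = sum(o["filled_qty"] for o in orders)
        # Slippage in basis points versus arrival price, signed so that positive = cost.
        slippages = []
        for o in orders:
            if o["filled_qty"] == 0:
                continue
            sign = 1.0 if o["side"] == "buy" else -1.0
            slippages.append(sign * (o["avg_px"] - o["arrival_px"]) / o["arrival_px"] * 1e4)
        hit = {o["venue"] for o in orders if o["filled_qty"] > 0}
        routed = {o["venue"] for o in orders}
        return {
            "fill_rate": filled / requested if requested else 0.0,
            "avg_slippage_bps": sum(slippages) / len(slippages) if slippages else 0.0,
            "venue_hit_ratio": len(hit) / len(routed) if routed else 0.0,
        }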

Innovation & Product Development

  • Prototype and ship smart order logic including slicing, liquidity seeking, and venue selection.
  • Build fair value and microprice models that adapt to regime shifts and market stress (see the sketch after this list).
  • Introduce latency‑aware routing that accounts for network jitter and venue throttling.
  • Incorporate fee and rebate modeling to optimize net execution price across venues.
  • Experiment with reinforcement learning or bandit strategies for dynamic parameter tuning.
  • Establish a rapid experiment framework with guardrails and automatic rollbacks.
  • Collaborate with quants to integrate signal strength into execution aggressiveness.
  • Maintain a backlog of validated ideas with expected impact, complexity, and dependencies.
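
A common starting point for the microprice work above is the depth-weighted microprice; the sketch below shows the textbook form, not necessarily the model used in practice.

    # Textbook microprice: weight the bid and ask by opposite-side depth, so the
    # price leans toward the side with less displayed size.
    def microprice(bid: float, ask: float, bid_size: float, ask_size: float) -> float:
        total = bid_size + ask_size
        if total == 0.0:
            return (bid + ask) / 2.0      # fall back to the plain mid
        return (bid * ask_size + ask * bid_size) / total

    # Heavy bid-side depth pushes the microprice toward the ask:
    print(microprice(100.0, 100.2, bid_size=9.0, ask_size=1.0))   # ≈ 100.18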

Mentorship & Technical Oversight

  • Set coding standards with a strong focus on TDD, linters, and reproducible builds.
  • Review designs and PRs for correctness, failure modes, and performance characteristics.
  • Pair with engineers on tricky components such as stateful order handling and retry logic.
  • Provide clear API guidelines for order entry, market data access, and diagnostics (see the sketch after this list).
  • Run post‑incident debriefs that turn issues into actionable learning and patterns.
  • Create onboarding playbooks and sample runs that shorten time to first production change.
  • Define competency matrices and personalised growth plans for the team.
  • Curate knowledge through living docs, design records, and architecture diagrams.
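
As one example of an API guideline of the kind mentioned above, the sketch below defines a minimal order-entry interface with typing.Protocol. The method names and signatures are hypothetical.

    # Hypothetical order-entry interface; method names and signatures are illustrative.
    from typing import Protocol

    class OrderGateway(Protocol):
        def submit_limit(self, venue: str, symbol: str, side: str,
                         qty: float, price: float, client_order_id: str) -> None:
            """Submit a limit order; client_order_id must be unique and idempotent."""
            ...

        def cancel(self, venue: str, client_order_id: str) -> None:
            """Cancel by client order id; safe to call repeatedly."""
            ...

        def open_orders(self, venue: str) -> list[str]:
            """Return the client order ids currently working at a venue."""
            ...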

Operational Excellence

  • Enforce SLOs for latency, error rate, and availability with visible dashboards and alerts.
  • Implement chaos and failure injection to validate retry, circuit breaker, and fallback paths.
  • Automate release pipelines with staged environments and progressive delivery.
  • Establish venue‑specific health checks for gateways, throttles, and rate‑limit budgets (see the sketch after this list).
  • Track execution quality with real‑time TCA including slippage, mark‑outs, and reject analysis.
  • Build runbooks for incident response, venue outages, and abnormal market conditions.
  • Harden infrastructure with redundancy, idempotent handlers, and durable state storage.
  • Continuously reduce toil through automation, self‑healing jobs, and configuration as code.
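
To make the rate-limit budgets above concrete, the sketch below shows a per-venue token bucket that a gateway could consult before sending a request. The capacities and refill rates are placeholders, not real exchange limits.

    # Per-venue token-bucket budget; capacities and refill rates are placeholders.
    import time

    class RateLimitBudget:
        def __init__(self, capacity: int, refill_per_second: float):
            self.capacity = capacity
            self.tokens = float(capacity)
            self.refill_per_second = refill_per_second
            self.last = time.monotonic()

        def try_spend(self, cost: float = 1.0) -> bool:
            now = time.monotonic()
            elapsed = now - self.last
            self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_second)
            self.last = now
            if self.tokens >= cost:
                self.tokens -= cost
                return True
            return False     # caller should queue, back off, or reroute

    budgets = {"kraken": RateLimitBudget(15, 0.5), "binance": RateLimitBudget(1200, 20.0)}
    if budgets["kraken"].try_spend():
        pass  # budget available: safe to send the request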

Profile

  • Bachelor’s degree or higher in Computer Science, Engineering, Applied Mathematics, or a related field.
  • 3+ years’ relevant experience in data engineering, trading systems, or financial technology.
  • Proven experience designing or implementing trading execution algorithms, preferably in crypto or other electronic markets.
  • Solid understanding of order book dynamics, liquidity, and market microstructure.
  • Familiarity with exchange APIs (REST, WebSocket) and low‑latency system design.

Skills

  • Microsoft Azure: AKS, Event Hubs, Service Bus, Functions, VM Scale Sets, VNet and Private Link, Managed Identity, Key Vault. IaC with Bicep or Terraform. Monitoring with Azure Monitor, Log Analytics, and Application Insights.
  • Python: async IO with asyncio, FastAPI for low‑latency services, pandas and NumPy for data handling, pydantic for data contracts, pytest and hypothesis for tests, packaging and type hints for maintainability, performance tuning with profiling and numba.
  • And/or C++: low‑latency networking and market connectivity, concurrency and lock‑free patterns, memory management and cache‑aware design, gRPC and protobuf, build and tooling with CMake, sanitizers, clang‑tidy, careful ABI discipline.
  • And/or C#: .NET 8 services using async and Channels, high‑throughput order and market data handlers, Azure SDK integration, minimal APIs or ASP.NET Core, profiling with dotTrace and PerfView, resilience patterns with Polly, structured logging with Serilog.
  • And/or Java and Scala: JVM tuning and GC optimisation, Netty‑based or Vert.x services, Akka or Kafka Streams for streaming, Kafka clients and schema registry, testing with JUnit and ScalaTest, build systems with Gradle or Maven.
  • Data and API contracts: protobuf and gRPC for low‑latency RPC, Avro or JSON Schema for events, schema evolution and versioning, strong typing across languages, code generation pipelines that keep Python, C++, C#, Java, and Scala in sync (a minimal Python example follows this list).
  • DevOps and quality on Azure: containerisation with Docker, CI and CD with GitHub Actions or Azure DevOps, security scanning and SAST, secrets management with Key Vault, observability with OpenTelemetry and distributed tracing, canary releases and automated rollbacks.
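
As a small illustration of the typed data contracts above, the sketch below defines a pydantic model for a fill event. The schema is invented for the example; in practice such a contract would typically be generated from a shared protobuf or Avro definition.

    # Invented pydantic contract for a fill event (works with pydantic v1 and v2).
    from datetime import datetime, timezone
    from pydantic import BaseModel

    class FillEvent(BaseModel):
        venue: str
        symbol: str
        side: str              # "buy" or "sell"
        qty: float
        price: float
        fee: float
        client_order_id: str
        ts: datetime

    fill = FillEvent(venue="kraken", symbol="BTC/USD", side="buy",
                     qty=0.25, price=65010.0, fee=0.13,
                     client_order_id="abc-123", ts=datetime.now(timezone.utc))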

Additional Skills

  • Highly numerate, rigorous, and resilient in problem‑solving.
  • Ability to prioritise, multitask, and deliver under time constraints.
  • Strong written and verbal communication in English, with the ability to explain technical topics clearly.
  • Self‑motivated, proactive, and detail‑oriented.
  • Comfortable working under pressure in a fast‑paced environment.
  • Team player, able to collaborate across engineering, quant, and trading teams.

Our people make all the difference in our success.


If you think the open position you see is right for you, we encourage you to apply!

