

Data Engineer - Crypto Algorithm Execution

Gunvor Group
London
17 hours ago

Job Description:
Build and operate a production-grade crypto execution stack that trades efficiently across fragmented liquidity. You will deliver robust reference pricing, smart order placement, and high-availability systems that minimize costs and enable reliable, scalable execution.
Main Responsibilities
Execution Algorithm

  • Design, implement, and maintain an execution algorithm on crypto exchanges (Kraken, Binance, etc.) capable of trading efficiently across fragmented liquidity pools.
  • Develop fair value models that aggregate prices across multiple exchanges, ensuring robust reference pricing for execution.
  • Optimize order placement strategies (e.g., slicing, liquidity-seeking logic) to minimize slippage and transaction costs.
  • Collaborate with data engineers and quantitative researchers to integrate live market feeds and trading signals into the execution framework.
  • Build monitoring and diagnostic tools for execution performance, latency, and risk exposure.
  • Ensure code is production-ready, resilient, and can run with high availability under live trading conditions.
  • Document design choices and provide technical support for internal stakeholders.
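
For illustration, a cross-venue fair value along the lines described above might be sketched as follows. The `Quote` schema and `fair_value` helper are hypothetical, not part of any existing codebase; the weighting scheme (a microprice within each venue, size-weighted across venues) is one reasonable choice among many:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    """Top-of-book quote from a single venue (illustrative schema)."""
    venue: str
    bid: float
    ask: float
    bid_size: float
    ask_size: float

def fair_value(quotes: list[Quote]) -> float:
    """Size-weighted reference price across venues: deeper quotes pull
    the fair value harder than thin ones."""
    num = 0.0
    den = 0.0
    for q in quotes:
        total = q.bid_size + q.ask_size
        # Microprice within a venue: the mid leans toward the ask when
        # resting bid size dominates, and vice versa.
        mid = (q.bid * q.ask_size + q.ask * q.bid_size) / total
        num += mid * total
        den += total
    return num / den
```

A production model would also need staleness filtering, outlier rejection, and venue-quality weighting, which are omitted here.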

Strategic Collaboration & Business Alignment

  • Translate trading objectives into clear execution KPIs such as slippage, fill rate, and venue hit ratio
  • Partner with trading and risk to prioritize venue coverage, product scope, and rollout timelines
  • Align roadmap with compliance and security requirements for regulated and high-risk venues
  • Run quarterly business reviews that connect execution performance to PnL impact and costs
  • Coordinate with data engineering to ensure feed quality and data contracts meet execution needs
  • Maintain a venue scorecard that ranks exchanges on liquidity, reliability, and fee structure
  • Define success criteria for new features and validate outcomes with A/B or canary rollouts
  • Communicate trade-offs and recommendations to leadership using clear, data-backed narratives
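
As a minimal sketch of the execution KPIs mentioned above, slippage against arrival price and fill rate might be computed as follows (function names and the bps convention are illustrative):

```python
def slippage_bps(side: str, arrival_price: float, avg_fill_price: float) -> float:
    """Signed slippage versus arrival price, in basis points.
    Positive means the fill was worse than arrival for this side."""
    if side == "buy":
        signed = avg_fill_price - arrival_price
    else:  # sell
        signed = arrival_price - avg_fill_price
    return signed / arrival_price * 1e4

def fill_rate(filled_qty: float, order_qty: float) -> float:
    """Fraction of the parent order that was actually filled."""
    return filled_qty / order_qty
```

A venue hit ratio would follow the same pattern: fills attributed to a venue divided by child orders routed there.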

Innovation & Product Development

  • Prototype and ship smart order logic including slicing, liquidity seeking, and venue selection
  • Build fair value and microprice models that adapt to regime shifts and market stress
  • Introduce latency-aware routing that accounts for network jitter and venue throttling
  • Incorporate fee and rebate modeling to optimize net execution price across venues
  • Experiment with reinforcement or bandit strategies for dynamic parameter tuning
  • Establish a rapid experiment framework with guardrails and automatic rollbacks
  • Collaborate with quants to integrate signal strength into execution aggressiveness
  • Maintain a backlog of validated ideas with expected impact, complexity, and dependencies
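
The bandit idea above could start as simply as an epsilon-greedy venue router. This is a toy sketch under obvious simplifying assumptions (stationary rewards, one reward per routing decision); the class name and reward convention are hypothetical:

```python
import random

class EpsilonGreedyVenueRouter:
    """Toy bandit: route child orders to the venue with the best
    observed reward (e.g. negative slippage), exploring with
    probability `eps`."""
    def __init__(self, venues, eps=0.1, seed=None):
        self.venues = list(venues)
        self.eps = eps
        self.counts = {v: 0 for v in self.venues}
        self.means = {v: 0.0 for v in self.venues}
        self.rng = random.Random(seed)

    def choose(self):
        if self.rng.random() < self.eps:
            return self.rng.choice(self.venues)  # explore
        return max(self.venues, key=lambda v: self.means[v])  # exploit

    def update(self, venue, reward):
        # Incremental mean, so no reward history needs to be stored.
        self.counts[venue] += 1
        n = self.counts[venue]
        self.means[venue] += (reward - self.means[venue]) / n
```

Production tuning would likely use contextual features (spread, volatility, queue depth) rather than a flat bandit.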

Mentorship & Technical Oversight

  • Set coding standards with strong focus on TDD, linters, and reproducible builds
  • Review designs and PRs for correctness, failure modes, and performance characteristics
  • Pair with engineers on tricky components such as stateful order handling and retry logic
  • Provide clear API guidelines for order entry, market data access, and diagnostics
  • Run post-incident debriefs that turn issues into actionable learning and patterns
  • Create onboarding playbooks and sample runs that shorten time to first production change
  • Define competency matrices and personalized growth plans for the team
  • Curate knowledge through living docs, design records, and architecture diagrams
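
The retry logic mentioned above is a typical pairing target. A minimal sketch, assuming a hypothetical `submit(order)` callable and an order carrying a client order id so retries are idempotent on the venue side:

```python
import random
import time

def send_with_retry(submit, order, *, attempts=4, base_delay=0.1,
                    sleep=time.sleep, rng=random.random):
    """Retry a transient-failing `submit(order)` with exponential
    backoff and jitter. `submit`, `order`, and the injected
    `sleep`/`rng` hooks are illustrative; the hooks make the
    function testable without real waiting."""
    last_exc = None
    for attempt in range(attempts):
        try:
            return submit(order)
        except ConnectionError as exc:  # retry only transient failures
            last_exc = exc
            if attempt + 1 < attempts:
                # Full backoff plus jitter in [0.5x, 1.5x) to avoid
                # synchronized retry storms across order handlers.
                delay = base_delay * (2 ** attempt) * (0.5 + rng())
                sleep(delay)
    raise last_exc
```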

Operational Excellence

  • Enforce SLOs for latency, error rate, and availability with visible dashboards and alerts
  • Implement chaos and failure injection to validate retry, circuit breaker, and fallback paths
  • Automate release pipelines with staged environments and progressive delivery
  • Establish venue-specific health checks for gateways, throttles, and rate-limit budgets
  • Track execution quality with real-time TCA including slippage, markouts, and reject analysis
  • Build runbooks for incident response, venue outages, and abnormal market conditions
  • Harden infrastructure with redundancy, idempotent handlers, and durable state storage
  • Continuously reduce toil through automation, self-healing jobs, and configuration as code
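
The circuit-breaker path above can be validated against a sketch as small as this one (class name and thresholds are illustrative; a real breaker would also bound half-open probes):

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker for a venue gateway: opens after
    `max_failures` consecutive errors, allows a probe again once
    `reset_after` seconds have elapsed. The injectable `clock`
    makes the breaker testable without real waiting."""
    def __init__(self, max_failures=3, reset_after=30.0, clock=time.monotonic):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.clock = clock
        self.failures = 0
        self.opened_at = None

    def allow(self) -> bool:
        if self.opened_at is None:
            return True  # closed: traffic flows
        if self.clock() - self.opened_at >= self.reset_after:
            return True  # half-open: let a probe through
        return False  # open: fail fast

    def record_success(self):
        self.failures = 0
        self.opened_at = None

    def record_failure(self):
        self.failures += 1
        if self.failures >= self.max_failures:
            self.opened_at = self.clock()
```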

Profile

  • Bachelor’s degree or higher in Computer Science, Engineering, Applied Mathematics, or a related field.
  • 3+ years’ relevant experience in data engineering, trading systems, or financial technology.
  • Proven experience designing or implementing trading execution algorithms, preferably in crypto or other electronic markets.
  • Solid understanding of order book dynamics, liquidity, and market microstructure.
  • Familiarity with exchange APIs (REST, WebSocket) and low-latency system design.
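
As a concrete instance of the order-book familiarity above: most exchange WebSocket depth feeds deliver (price, size) deltas where size 0 deletes a level. A toy book in that shape (the `L2Book` name and `apply` signature are illustrative):

```python
class L2Book:
    """Tiny price-level book updated from (price, size) deltas,
    the shape most exchange WebSocket depth streams use."""
    def __init__(self):
        self.bids = {}  # price -> size
        self.asks = {}

    def apply(self, side: str, price: float, size: float):
        book = self.bids if side == "bid" else self.asks
        if size == 0:
            book.pop(price, None)  # size 0 removes the level
        else:
            book[price] = size

    def best_bid(self):
        return max(self.bids) if self.bids else None

    def best_ask(self):
        return min(self.asks) if self.asks else None

    def spread(self):
        b, a = self.best_bid(), self.best_ask()
        return a - b if b is not None and a is not None else None
```

A production book would add sequence-number gap detection and snapshot resync, which dict operations alone cannot provide.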

Skills

  • Microsoft Azure: AKS, Event Hubs, Service Bus, Functions, VM Scale Sets, VNet and Private Link, Managed Identity, Key Vault. Infrastructure as code with Bicep or Terraform. Monitoring with Azure Monitor, Log Analytics, and Application Insights.
  • Python: async IO with asyncio, FastAPI for low-latency services, pandas and NumPy for data handling, pydantic for data contracts, pytest and hypothesis for tests, packaging and type hints for maintainability, performance tuning with profiling and numba where appropriate.
  • And/Or C++: low-latency networking and market connectivity, concurrency and lock-free patterns, memory management and cache-aware design, gRPC and protobuf, build and tooling with CMake, sanitizers, and clang-tidy, careful ABI discipline for reliability.
  • And/Or C#: .NET 8 services using async and Channels, high-throughput order and market data handlers, Azure SDK integration, minimal APIs or ASP.NET Core, profiling with dotTrace and PerfView, resilience patterns with Polly, structured logging with Serilog.
  • And/Or Java and Scala: JVM tuning and GC optimization, Netty-based or Vert.x services, Akka or Kafka Streams for streaming, Kafka clients and schema registry, testing with JUnit and ScalaTest, build systems with Gradle or Maven.
  • Data and API contracts: protobuf and gRPC for low-latency RPC, Avro or JSON Schema for events, schema evolution and versioning, strong typing across languages, code generation pipelines that keep Python, C++, C#, Java, and Scala in sync.
  • DevOps and quality on Azure: containerization with Docker, CI and CD with GitHub Actions or Azure DevOps, security scanning and SAST, secrets management with Key Vault, observability with OpenTelemetry and distributed tracing, canary releases and automated rollbacks.
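
To illustrate the schema-evolution discipline above with stdlib tools only (a real contract would use pydantic, protobuf, or Avro as listed): a tolerant reader drops unknown fields and lets defaults cover missing ones, so a v2 schema can read v1 payloads. `FillEvent` and `decode` are hypothetical names:

```python
from dataclasses import dataclass, fields

@dataclass
class FillEvent:
    """v2 of a hypothetical fill event; `fee` was added with a
    default so v1 payloads (without it) still decode — a
    backward-compatible evolution."""
    order_id: str
    price: float
    qty: float
    fee: float = 0.0

def decode(payload: dict) -> FillEvent:
    # Tolerant reader: ignore keys this schema version does not know,
    # rely on defaults for keys the payload does not carry.
    known = {f.name for f in fields(FillEvent)}
    return FillEvent(**{k: v for k, v in payload.items() if k in known})
```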

Additional Skills

  • Highly numerate, rigorous, and resilient in problem-solving.
  • Ability to prioritize, multitask, and deliver under time constraints.
  • Strong written and verbal communication in English.
  • Self-motivated, proactive, and detail-oriented.
  • Comfortable working under pressure in a fast-paced environment.
  • Excellent communication skills; able to explain technical topics clearly.
  • Team player able to collaborate across engineering, quant, and trading teams.

If you think the open position you see is right for you, we encourage you to apply!
Our people make all the difference in our success.
