Senior Quantitative Developer

Centrica
London
1 month ago
Applications closed


The Quantitative Analytics team at Centrica Energy is part of the Trading Analytics and Algorithms centre of excellence, and is responsible for:
Delivering quantitative analysis of complex and structured products that provides insight to help traders maximise value and manage risk

Designing and implementing complex mathematical models that allow for consistent pricing and joint risk quantification across multiple portfolios, enabling more holistic and optimal hedging decisions

Assisting originators in the development of structured products across the Renewables, LNG, Gas & Power sectors.

As a Senior Quantitative Developer, you will become part of an agile team of circa 10 people located across our offices in London (UK) and Aalborg (Denmark), with a broad remit to support the development and use of our pricing and risk management models and systems, as well as consulting on software design, systems architecture and software engineering practices.
This role will be situated in our London or Aalborg office and the successful candidate will be required to:
Produce high-quality increments to the team's model library, working both individually and collaboratively.

Provide leadership in areas such as object modelling and interface design, automated testing and refactoring, and performance optimisation.

Provide support to Trading & Origination on model usage and behaviour.

Leverage the team's cross-asset expertise to connect business locations, helping to identify synergies and increase efficiency.

Here's what we're looking for:
Master's degree or PhD in science, computing, mathematics or another quantitative subject.

Solid experience of code development in Python, including:
Experience developing in an Agile environment

Use of math/stats & testing libraries, as well as modern build tools.

Ability to refactor code and work effectively with legacy systems.

Knowledge of Object Orientation, Software Architecture and Design Patterns.

Good understanding of software testing practices (unit testing, integration testing, etc.). Knowledge of TDD and BDD is a strong plus.

Experience with DevOps practices and tooling (CI/CD, containerisation, etc.) is desirable.

Familiarity with mathematical and statistical models used in finance, particularly with regards to derivatives pricing and risk management systems.

Familiarity with, or a strong interest in, energy and commodity markets.

Strong communicator, fluent in English.

Strong interpersonal skills.

