
Senior Data Engineer

TP ICAP
London
1 week ago

The TP ICAP Group is a world-leading provider of market infrastructure.

Our purpose is to provide clients with access to global financial and commodities markets, improving price discovery, liquidity, and distribution of data, through responsible and innovative solutions.

Through our people and technology, we connect clients to superior liquidity and data solutions.

The Group is home to a stable of premium brands. Collectively, TP ICAP is the largest interdealer broker in the world by revenue, the number one Energy & Commodities broker in the world, the world's leading provider of OTC data, and an award-winning all-to-all trading platform.

Founded in London in 1866, the Group operates from more than 60 offices in 27 countries. We are 5,200 people strong. We work as one to achieve our vision of being the world's most trusted and innovative liquidity and data solutions specialist.

Role Overview

This Senior Data Engineer role sits within the Brokerage & Pricing team in the TP ICAP Technology division. The Senior Data Engineer will join an Agile team alongside other engineers, working on the next generation of strategic back-office applications and ensuring solutions provide maximum value to users. The Brokerage & Pricing technology team's focus is to optimise the management of the brokerage data and calculations that drive all broking activity in our £1 billion+ revenue Global Broking organisation, and to carry out commercial analysis on that data to understand revenues and drive client commercial agreements.

The Senior Data Engineer is responsible for designing, developing, and maintaining data pipelines and ETL processes to support data integration and analytics. The role requires a deep understanding of data structures and content, ensuring high-quality data through rigorous testing and validation. The engineer collaborates with system owners and stakeholders to understand data requirements and deliver reliable, efficient data solutions. Attention to detail and a commitment to data quality are paramount to maintaining the integrity and reliability of data.

Role Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes to support data integration and analytics.
  • Code primarily in Python to build and optimise data workflows.
  • Implement and manage workflows using Apache Airflow (MWAA).
  • Ensure high-quality data through rigorous testing and validation processes.
  • Produce data quality reports to monitor and ensure the integrity of data.
  • Conduct thorough data exploration and analysis to understand data structure and content before developing ETL pipelines.
  • Collaborate with system owners and stakeholders to understand data requirements and deliver solutions.
  • Monitor and troubleshoot data pipelines to ensure reliability and track performance.
  • Maintain detailed documentation of data processes, workflows, and system configurations.
  • Maintain familiarity with data lakes and their architecture.
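The validation and data-quality-reporting duties above can be sketched in plain Python. Everything below is a hypothetical illustration: the trade records, field names, and checks are invented for this sketch, not TP ICAP specifics.

```python
import csv
import io

# Hypothetical brokerage trade records, as an ETL extract step might produce them.
RAW_CSV = """trade_id,desk,brokerage_gbp
T001,Energy,1250.50
T002,Rates,
T003,Credit,-40.00
T004,Energy,980.25
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw extract into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def validate(rows: list[dict]) -> tuple[list[dict], list[str]]:
    """Split rows into clean records and data-quality issues for reporting."""
    clean, issues = [], []
    for row in rows:
        value = row["brokerage_gbp"]
        if not value:
            issues.append(f"{row['trade_id']}: missing brokerage amount")
        elif float(value) < 0:
            issues.append(f"{row['trade_id']}: negative brokerage amount")
        else:
            clean.append({**row, "brokerage_gbp": float(value)})
    return clean, issues

clean_rows, dq_issues = validate(extract(RAW_CSV))
print(f"{len(clean_rows)} clean rows, {len(dq_issues)} issues for the DQ report")
```

In a production pipeline, steps like these would typically run as tasks in an Airflow (MWAA) DAG, with the issue list feeding the data quality reports mentioned above.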

Experience / Competences

  • Strong experience as a Data Engineer, preferably in the finance sector.
  • Strong understanding of ETL processes and data pipeline design.
  • Extensive experience coding in Python.
  • Hands-on experience with Apache Airflow (MWAA) for workflow management.
  • Experience with AWS Athena/PySpark (Glue) for data querying and processing.
  • Highly proficient in SQL/PL/SQL, particularly with MS SQL Server and Oracle databases, with experience across both relational and NoSQL databases.
  • Attention to detail and the ability to remain focused under pressure despite complexity.
  • Excellent problem-solving skills and the ability to think critically and creatively.
  • Strong collaboration skills and the ability to communicate effectively with team members and stakeholders.
  • Passion for data quality and a commitment to maintaining high standards of data engineering.
  • Proficiency in using AWS Cloud services in the context of data processing.
  • Familiarity with data lakes, operational databases/data stores and their architecture.
  • Fluent in using the AWS CDK with Python.
  • Familiarity with version control systems (e.g., Git) and backlog management tools (e.g., JIRA).
  • Ability to write clear and concise documentation.
  • Strong communication skills, both written and verbal.
  • Ability to work effectively as part of a team and independently when required.
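The SQL proficiency called for above can be illustrated with a minimal, self-contained sketch using Python's built-in sqlite3 module. The table, client names, and figures are invented for illustration and do not reflect any real TP ICAP schema or data.

```python
import sqlite3

# In-memory database with a hypothetical brokerage table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE brokerage (desk TEXT, client TEXT, amount_gbp REAL)")
conn.executemany(
    "INSERT INTO brokerage VALUES (?, ?, ?)",
    [
        ("Energy", "ClientA", 1250.50),
        ("Energy", "ClientB", 980.25),
        ("Rates", "ClientA", 400.00),
    ],
)

# Revenue-by-desk aggregation of the kind used in commercial analysis.
rows = conn.execute(
    "SELECT desk, SUM(amount_gbp) FROM brokerage GROUP BY desk ORDER BY desk"
).fetchall()
print(rows)  # [('Energy', 2230.75), ('Rates', 400.0)]
```

At scale, the same aggregation pattern would run against MS SQL, Oracle, or AWS Athena rather than SQLite; only the connection layer changes, not the SQL.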

Job Band & Level

  • Manager / Level 6

#LI-Hybrid #LI-MID

Not The Perfect Fit?

Concerned that you may not meet the criteria precisely? At TP ICAP, we wholeheartedly believe in fostering inclusivity and cultivating a work environment where everyone can flourish, regardless of your personal or professional background. If you are enthusiastic about this role but find that your experience doesn't align perfectly with every aspect of the job description, we strongly encourage you to apply. You may be the ideal candidate for this position or another opportunity within our organisation. Our dedicated Talent Acquisition team is here to help you recognise how your unique skills and abilities can make a valuable contribution. Don't hesitate to take the leap and explore the possibilities. Your potential is what truly matters to us.

Company Statement

We know that the best innovation happens when diverse people with different perspectives and skills work together in an inclusive atmosphere. That's why we're building a culture where everyone plays a part in making people feel welcome, ready and willing to contribute. TP ICAP Accord - our Employee Network - is central to this. As well as representing specific groups, TP ICAP Accord helps increase awareness, encourages collaboration, shares best practice, and holds our firm to account for driving continuous cultural improvement.

Location
UK - 135 Bishopsgate - London

Data science roles land daily across banks, product companies, consultancies, scaleups & the public sector—often buried in ATS portals or duplicated across boards. The fix: put discovery on rails with keyword-rich alerts, RSS feeds & a reusable ChatGPT workflow that triages listings, ranks fit, & tailors your CV in minutes. This copy-paste playbook is for www.datascience-jobs.co.uk readers. It’s UK-centric, practical, & designed to save you hours each week. What You’ll Have Working In 30 Minutes A role & keyword map spanning Core DS, Applied/Research, Product/Decision Science, NLP/CV, Causal/Experimentation, Time Series/Forecasting, MLOps-adjacent & Analytics Engineering overlaps. Shareable Boolean searches for Google & job boards that strip out noise. Always-on alerts & RSS feeds that bring fresh UK roles to you. A ChatGPT “Data Science Job Scout” prompt that deduplicates, scores match & outputs ready-to-paste actions. A simple pipeline tracker so deadlines & follow-ups never slip.