Lead Data Engineer - Department for Transport - G7

Manchester Digital

£57,515 - £80,400 Plus an additional DDaT allowance up to: £22,885
Full-time (Permanent)
Published on 22 January 2026 | Deadline 8 February 2026


Location

Birmingham, Hastings, Leeds, Swansea


About the job
Job summary

Can you lead secure, production-grade data pipelines on GCP while balancing live operations and innovation?


Do you enjoy mentoring engineers and translating complex data engineering concepts for diverse stakeholders?


If so, we'd love to hear from you!


In recent years DfT’s digital and data teams have implemented a range of advanced data services, making use of the latest cloud technologies to deliver the services and platforms that our users need, with excellent customer satisfaction rates. We are proud of our ability to develop and grow as a team, and we look forward to you sharing that sense of pride!


At DfT, we recognise that everyone has different needs and aspirations. We have created an inclusive and welcoming working environment so you can feel comfortable to be yourself at work. We’ll help you to reach your full potential, offering rewarding opportunities alongside access to the latest training and technologies.


Joining our department comes with many benefits, including:



  • Employer pension contribution of 28.97% of your salary. Read more about Civil Service Pensions in the Benefits section below.
  • 25 days annual leave, increasing by 1 day for each year of service (up to a maximum of 30 days), plus 8 bank holidays and a privilege day for the King’s birthday.
  • Flexible working options that encourage a great work-life balance.

Read more in the Benefits section below!


Working as part of a talented and collaborative team, you will:



  • Lead the build and operation of DfT’s production‑grade data pipelines and platforms, ensuring reliability and security across our Google Cloud Platform environment.
  • Own and manage live data services, triaging and resolving issues at pace to maintain high‑quality data delivery for analysts, policy teams and external commitments.
  • Drive innovation within data engineering, identifying opportunities to modernise tooling, adopt emerging GCP capabilities and introduce new approaches that improve efficiency and data quality.
  • Plan delivery across legacy migration, operational support and new development, ensuring that resources are allocated effectively and that risks, dependencies and priorities are well managed.
  • Work closely with technical and non-technical stakeholders, translating technical concepts, shaping data‑related decisions, and responding to business need.
  • Line manage and develop engineers at varying levels, providing technical guidance, coaching and oversight, and fostering a culture of continuous improvement, collaboration and knowledge‑sharing.
  • Drive adoption of Infrastructure as Code (IaC), establishing repeatable patterns for environments, access, and data services.
  • Lead the development of our metadata catalogue, curating business and technical metadata so users can effectively discover and use data.

In return, we can offer you:



  • access to new and emerging technologies,
  • varied projects developed in a cloud-first environment,
  • support and investment to further your training and development,
  • flexible and hybrid working supporting a healthy work-life balance,
  • industry-leading pension and employee benefits package.

For further information on the role, please read the role profile. Please note that the role profile is for information purposes only - whilst all elements are relevant to the role, they may not all be assessed during the recruitment process. This job advert will detail exactly what will be assessed during the recruitment process.


About Us

At the heart of data innovation and evolution in DfT, you will join a talented, experienced data engineering team imagining and shaping the delivery of the next wave of data services. The team is embedded within the wider data directorate and works alongside analysts, data scientists, architects and other engineers to deliver some of the most impactful data projects within DfT. You will support and shape various areas of the business that deliver an innovative transport policy agenda. As DfT is a cloud-only enterprise, you will develop the latest cloud solutions to meet complex digital, identity and data needs.


This role will give you the opportunity to share your experience and further develop your skills every day as you work on new and exciting projects with advanced technologies. We provide a supportive and constructive learning environment where your career growth is important.


Person specification

You will be an experienced data engineer with deep technical foundations and expertise in both Python and SQL. You will also be highly proficient in Google Cloud Platform, or an expert user of AWS or Azure with a willingness to apply your skills to a new cloud platform. You combine hands-on engineering excellence with the ability to communicate complex ideas simply, engaging effectively with a wide range of technical and non-technical stakeholders. You are comfortable balancing the demands of operating reliable, production-grade data services with delivering innovation: shaping new approaches, modernising legacy systems, and driving improvements in data quality and tooling. Alongside this, you bring thoughtful planning and change management skills, helping the organisation evolve its data capabilities while ensuring continuity, stability, and high-quality outcomes across DfT.


You will need to demonstrate the following experience:



  • Enterprise-scale delivery of robust, maintainable data pipelines in Google Cloud Platform, including building reusable components and optimising performance and cost on cloud data platforms while meeting security, privacy and governance controls.
  • Expert user of data engineering tools, including relevant languages (e.g. Python/SQL), IaC tools (e.g. Terraform), GCP or equivalent cloud tooling (e.g. BigQuery, Cloud Functions), CI/CD (e.g. GitHub Actions), logging and testing.
  • Setting and leading engineering standards, ensuring high‑quality coding practices, maintainable solutions, and consistent technical approaches across the team.
  • Leading Agile delivery of data engineering work, managing and operating live services at pace to maintain continuity and high‑quality data delivery into DfT, while also driving innovation by developing new data engineering approaches and patterns.
  • Stakeholder leadership and technical translation, partnering with architecture and data teams to align designs with strategy and standards, and managing change and innovation within DfT’s data landscape.



