Data Engineer

Cerberus Capital Management
Slough
1 day ago

Data Engineer [Associate/Senior Associate]


About the job

We are expanding our Data Engineering team to build modern, scalable data platforms for our internal investment desks and portfolio companies. You will deliver fast, reliable data solutions that unlock value across Cerberus desks, portfolio companies, and other businesses, designing and implementing robust data architectures, pipelines, and workflows that enable advanced analytics and AI applications. You may also support initiatives such as due diligence and pricing analyses by ensuring high-quality, timely data is available.


What you will do

  • Design, build, and maintain scalable, cloud-based data pipelines and architectures to support advanced analytics and machine learning initiatives.
  • Develop robust ELT workflows using tools like dbt, Airflow, and SQL (PostgreSQL, MySQL) to transform raw data into high-quality, analytics-ready datasets (a sketch of such a workflow follows this list).
  • Collaborate with data scientists, analysts, and software engineers to ensure seamless data integration and availability for predictive modeling and business intelligence.
  • Optimize data storage and processing in Azure environments for performance, reliability, and cost-efficiency.
  • Implement best practices for data modeling, governance, and security across all platforms.
  • Troubleshoot and enhance existing pipelines to improve scalability and resilience.
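
As an illustration of the kind of ELT orchestration described above, here is a minimal sketch of an Airflow DAG that runs an extract/load script and then builds and tests dbt models. The schedule, file paths, and dbt project location are assumptions made for the example, not details taken from this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # assumed daily 06:00 run
    catchup=False,
    default_args=default_args,
) as dag:
    # Load raw source data into the warehouse (script path is hypothetical).
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="python /opt/pipelines/extract_load.py",
    )

    # Build analytics-ready models from the raw tables with dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    # Validate the models before downstream consumers pick them up.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    extract_load >> dbt_run >> dbt_test
```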

Sample Projects You’ll Work On

  • Financial Asset Management Pipeline: Build and manage data ingestion from third-party APIs, model data using dbt, and support machine learning workflows for asset pricing and prediction using Azure ML Studio. This includes ELT processes, data modeling, running predictions, and storing outputs for downstream analytics (a simplified sketch follows).
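
The sketch below shows roughly what the ingestion and prediction stages of such a pipeline could look like: pull prices from a third-party API, send features to an Azure ML online endpoint for scoring, and persist the output for downstream analytics. The vendor URL, endpoint payload shape, and environment variable names are illustrative assumptions; in practice the dbt transformations would sit between ingestion and scoring.

```python
import os

import pandas as pd
import requests

# Hypothetical endpoint and credentials; real values would come from
# configuration or a secrets store.
VENDOR_API_URL = "https://api.example-vendor.com/v1/asset-prices"  # placeholder
AZURE_ML_SCORING_URL = os.environ["AZURE_ML_SCORING_URL"]
AZURE_ML_API_KEY = os.environ["AZURE_ML_API_KEY"]


def ingest_raw_prices() -> pd.DataFrame:
    """Pull the latest asset prices from a third-party API into a DataFrame."""
    resp = requests.get(VENDOR_API_URL, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json()["prices"])  # assumed response shape


def score_assets(features: pd.DataFrame) -> pd.DataFrame:
    """Send feature rows to an Azure ML online endpoint and attach predictions."""
    payload = {"input_data": features.to_dict(orient="records")}  # assumed schema
    resp = requests.post(
        AZURE_ML_SCORING_URL,
        json=payload,
        headers={"Authorization": f"Bearer {AZURE_ML_API_KEY}"},
        timeout=60,
    )
    resp.raise_for_status()
    scored = features.copy()
    scored["predicted_price"] = resp.json()["predictions"]
    return scored


if __name__ == "__main__":
    raw = ingest_raw_prices()
    # In the full pipeline, dbt models would transform raw data before scoring;
    # the raw frame is passed straight through here for brevity.
    results = score_assets(raw)
    results.to_parquet("scored_assets.parquet", index=False)  # stored for analytics
```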


Your Experience

We’re a small, high-impact team with a broad remit and diverse technical backgrounds. We don’t expect any single candidate to check every box below. If your experience overlaps strongly with what we do and you’re excited to apply your skills in a fast-moving, real-world environment, we’d love to hear from you.

  • Strong technical foundation: Degree in a STEM field (or equivalent experience) with hands-on experience in production environments, emphasizing performance optimization and code quality.
  • Python expertise: Advanced proficiency in Python for data engineering, data wrangling and pipeline development.
  • Cloud Platforms: Hands-on experience working with Azure. AWS experience is considered; however, Azure exposure is essential.
  • Data Warehousing: Proven expertise with Snowflake – schema design, performance tuning, data ingestion, and security.
  • Workflow Orchestration: Production experience with Apache Airflow (or a similar orchestrator such as Prefect or Dagster), including authoring DAGs, scheduling workloads, and monitoring pipeline execution.
  • Data Modeling: Strong skills in dbt, including writing modular SQL transformations, building data models, and maintaining dbt projects.
  • SQL Databases: Extensive experience with PostgreSQL, MySQL, or similar, including schema design, optimization, and complex query development (see the sketch after this list).
  • Infrastructure as Code: Production experience with declarative infrastructure definition – e.g. Terraform, Pulumi or similar.
  • Version Control and CI/CD: Familiarity with Git-based workflows and continuous integration/deployment practices (e.g. Azure DevOps or GitHub Actions).
  • Communication and problem-solving skills: Ability to articulate complex technical concepts to technical and non-technical stakeholders alike, with strong analytical and problem-solving skills.
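
As a small illustration of the SQL database work described above, here is a minimal sketch of an idempotent load into PostgreSQL using psycopg2. The table definition, connection string, and column names are assumptions made for the example.

```python
import psycopg2
from psycopg2.extras import execute_values

# Connection details are placeholders; in practice they would come from a
# secrets manager or environment configuration.
DSN = "dbname=analytics user=etl host=localhost"

DDL = """
CREATE TABLE IF NOT EXISTS asset_prices (
    asset_id    TEXT        NOT NULL,
    price_date  DATE        NOT NULL,
    close_price NUMERIC(18, 6),
    loaded_at   TIMESTAMPTZ DEFAULT now(),
    PRIMARY KEY (asset_id, price_date)
);
"""

UPSERT = """
INSERT INTO asset_prices (asset_id, price_date, close_price)
VALUES %s
ON CONFLICT (asset_id, price_date)
DO UPDATE SET close_price = EXCLUDED.close_price, loaded_at = now();
"""


def load_prices(rows: list[tuple]) -> None:
    """Idempotently upsert (asset_id, price_date, close_price) rows."""
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)
            # execute_values expands the %s placeholder into a batched VALUES list.
            execute_values(cur, UPSERT, rows)


if __name__ == "__main__":
    load_prices([("AAPL", "2024-06-28", 210.25)])
```

Re-running the load with the same rows leaves the table unchanged apart from the load timestamp, which is what makes a pipeline like this safe to retry.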


About Us

We are a new but growing team of AI specialists – data scientists, software engineers, and technology strategists – working to transform how an alternative investment firm with $65B in assets under management leverages technology and data. Our remit is broad, spanning investment operations, portfolio companies, and internal systems, giving the team the opportunity to shape the way the firm approaches analytics, automation, and decision-making.

We operate with the creativity and agility of a small team, tackling diverse, high-impact challenges across the firm. While we are embedded within a global investment platform, we maintain a collaborative, innovative culture where our AI talent can experiment, learn, and have real influence on business outcomes.


If you are applying for data science jobs in the UK, the maths can feel like a moving target. Job descriptions say “strong statistical knowledge” or “solid ML fundamentals” but they rarely tell you which topics you will actually use day to day. Here’s the truth: most UK data science roles do not require advanced pure maths. What they do require is confidence with a tight set of practical topics that come up repeatedly in modelling, experimentation, forecasting, evaluation, stakeholder comms & decision-making. This guide focuses on the only maths most data scientists keep using: Statistics for decision making (confidence intervals, hypothesis tests, power, uncertainty) Probability for real-world data (base rates, noise, sampling, Bayesian intuition) Linear algebra essentials (vectors, matrices, projections, PCA intuition) Calculus & gradients (enough to understand optimisation & backprop) Optimisation & model evaluation (loss functions, cross-validation, metrics, thresholds) You’ll also get a 6-week plan, portfolio projects & a resources section you can follow without getting pulled into unnecessary theory.