GCP Data Engineer

Bodhi
Guildford
1 month ago
Applications closed


REMOTE: with occasional visits to the Surrey office


12 Month Contract


SUBSIDIARY / DEPARTMENT OVERVIEW

The organisation is a globally recognised leader in technology and innovation, delivering advanced digital products and solutions used by millions of people worldwide. With a strong focus on cutting-edge technologies and continuous improvement, the company drives digital transformation across multiple markets.


The global software solutions and IT services division plays a key role in delivering enterprise-scale digital capabilities. This position sits within a newly established CDM Operations Team, supporting marketing activities across more than 20 European countries.


PURPOSE OF THE JOB

The organisation is seeking a skilled Google Cloud Data Engineer to design, implement, and optimise data solutions within the Google Cloud Platform (GCP) ecosystem. The successful candidate will collaborate with cross-functional teams to ensure effective data integration, governance, and analytics capabilities.


KEY ACCOUNTABILITIES

  • Design scalable, efficient data solutions using BigQuery and other GCP tools to support business intelligence and analytics requirements.
  • Work closely with stakeholders to gather data requirements and translate them into technical designs.
  • Build, maintain, and optimise ETL/ELT pipelines using tools such as Dataflow, Apache Beam, and Cloud Composer.
  • Integrate multiple data sources, including APIs, relational databases, and streaming platforms, into BigQuery.
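As an illustration of the extract–transform–load pattern the role involves, here is a minimal, self-contained Python sketch. All field names and the in-memory "sink" are hypothetical; in a production pipeline of the kind described above, these stages would typically be Apache Beam PTransforms running on Dataflow, loading into BigQuery rather than a list.

```python
import json

# Hypothetical raw events, standing in for an API or Pub/Sub source.
RAW_EVENTS = [
    '{"country": "DE", "campaign": "spring", "clicks": 3}',
    '{"country": "FR", "campaign": "spring", "clicks": 5}',
    '{"country": "DE", "campaign": "summer"}',  # missing "clicks" field
]

def extract(lines):
    """Parse each JSON line; skip records that fail to parse."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def transform(records):
    """Normalise records: default a missing click count to zero."""
    for rec in records:
        rec.setdefault("clicks", 0)
        yield rec

def load(records, sink):
    """Append cleaned records to the sink (a stand-in for a BigQuery table)."""
    for rec in records:
        sink.append(rec)
    return sink

sink = load(transform(extract(RAW_EVENTS)), [])
print(len(sink))  # 3 records loaded; the malformed-field record is repaired
```

The same extract/transform/load separation carries over directly to Beam, where each stage becomes a `ParDo` or built-in transform and the sink becomes a BigQuery write.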

BigQuery Optimisation & Performance Tuning

  • Optimise BigQuery queries and storage structures to ensure high performance and cost efficiency.
  • Implement partitioning and clustering strategies to enhance query performance.
  • Configure and manage GCP services such as Cloud Storage, Pub/Sub, and IAM to ensure secure and reliable data operations.
  • Apply best practices in cloud security and compliance.
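To make the partitioning and clustering strategies above concrete, the sketch below assembles an illustrative BigQuery DDL statement. The project, dataset, table, and column names are hypothetical examples, not part of the role description.

```python
# Illustrative DDL for a date-partitioned, clustered BigQuery table.
# All identifiers below are hypothetical.
ddl = """
CREATE TABLE IF NOT EXISTS `example_project.marketing.events`
(
  event_ts TIMESTAMP NOT NULL,
  country  STRING,
  campaign STRING,
  clicks   INT64
)
PARTITION BY DATE(event_ts)   -- prunes scans to the dates a query touches
CLUSTER BY country, campaign  -- co-locates rows on common filter columns
OPTIONS (partition_expiration_days = 90)
""".strip()

print(ddl.splitlines()[0])
```

In practice such a statement would be submitted via the BigQuery console, the `bq` CLI, or a client library; partition pruning and clustering together reduce both bytes scanned (cost) and query latency.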

Data Governance & Quality

  • Implement data quality and governance frameworks to ensure accuracy, consistency, and availability of data.
  • Establish monitoring and alerting mechanisms for pipelines and systems to proactively prevent and resolve issues.
  • Partner with data analysts, engineers, and business stakeholders to enable efficient data processing.
  • Provide technical guidance and support to team members.
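The monitoring and data-quality duties above often begin with simple batch checks such as null-rate thresholds on required fields. Below is a minimal Python sketch; the field names, sample batch, and threshold are illustrative only.

```python
def quality_report(rows, required_fields, max_null_rate=0.1):
    """Return per-field null rates and a pass/fail flag for a batch of rows.

    rows: list of dicts (e.g. a sample pulled from a staging table).
    required_fields: fields that must be mostly populated.
    max_null_rate: highest tolerable fraction of missing values per field.
    """
    total = len(rows)
    if total == 0:
        return {"passed": False, "null_rates": {}}
    null_rates = {
        field: sum(1 for r in rows if r.get(field) is None) / total
        for field in required_fields
    }
    passed = all(rate <= max_null_rate for rate in null_rates.values())
    return {"passed": passed, "null_rates": null_rates}

batch = [
    {"country": "DE", "clicks": 3},
    {"country": None, "clicks": 5},
    {"country": "FR", "clicks": 1},
    {"country": "ES", "clicks": None},
]
report = quality_report(batch, ["country", "clicks"], max_null_rate=0.3)
print(report["passed"])  # True: each field's null rate is 0.25 <= 0.3
```

A check like this would typically run inside the pipeline (for example as a Cloud Composer task) and feed an alerting channel when `passed` is false.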

KEY LIAISONS

  • European Regional Office

DIMENSIONS

  • Maintain strong working relationships with all key stakeholders.
  • Support and align activities with both marketing and operations teams.

SKILLS AND EXPERIENCE

Language Skills

  • Exceptional English communication skills, as the role involves collaboration with global teams.

Technical Expertise

  • Strong proficiency in BigQuery and SQL, including data modelling and query optimisation.
  • Hands-on experience with GCP services such as Cloud Storage, Cloud Composer, Dataflow, and Pub/Sub.
  • Familiarity with data pipeline frameworks such as Apache Beam and Airflow.
  • Strong programming skills in Python or Java for data processing and scripting.
  • Knowledge of shell scripting and cloud automation.
  • Proven experience designing and managing cloud-based data solutions.
  • Strong background in developing and maintaining ETL/ELT pipelines.
  • Demonstrated ability to optimise BigQuery performance and manage cloud costs effectively.
  • Experience implementing partitioning, clustering, and materialised views.

Soft Skills

  • Excellent analytical and problem-solving abilities.
  • Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Ability to work collaboratively in a fast-paced, evolving environment.

Desirable Multi-Cloud Experience

  • Experience with Amazon Redshift for managing and optimising data warehouse solutions across multi-cloud environments.
  • Experience with Microsoft Azure tools, particularly Azure Data Factory (ADF).

CHALLENGE

The organisation operates within a fast-paced and evolving environment where processes and procedures frequently change. The successful candidate must stay up to date with technological developments and assess their potential business impact.


Note

This job description outlines the primary responsibilities of the role but does not represent an exhaustive list of duties. It is intended to clarify expectations between the Manager and the employee and may be amended in line with evolving business requirements.


