Airtime
At Airtime we are all about innovation, because this is how we stay on top. Every one of us has a hunger to succeed and will settle for nothing less than excellence. Crucially, our ethos is underpinned by a culture of teamwork and shared humility, because all that we achieve, we achieve together.
Empowering
We keep the experience fresh with an innovative, original approach, marked by the continuous introduction of unique features and benefits. We are about fresh, adaptable, and impactful change that sets new standards and differentiates us from competitors.
Magnetic
Genuinely engaging and deeply trustworthy. We connect easily, making every experience with us naturally appealing and memorable. Even the way we transform data into engaging, personalised insights is fun and visually appealing.
Uplifting
Bright and optimistic, we offer a positive escape from the mundane. We bring joy to everyday life, transforming routine into moments of happiness and satisfaction. Feel good with every interaction.
The Opportunity
We are looking to recruit a Data Engineer to join and strengthen our existing team and contribute to the growth and development of our data services.
This is an exciting, highly skilled technical role that offers great opportunities for learning and personal development.
The ideal candidate will be passionate about data, with a technical mindset and excellent problem-solving skills. The role will suit someone who is curious, detail-oriented, and looking to rapidly develop their skills in a dynamic environment.
In this role, you will:
- Design, implement, and maintain robust, scalable data pipelines to ingest data from internal platforms into our data warehouse.
- Optimise data pipelines to enhance performance, reduce costs, and ensure data quality.
- Understand, gather, and document detailed business requirements.
- Take ownership of data projects from planning to delivery, collaborating with other departments as needed.
- Innovate and automate current processes, driving continuous improvement.
- Demonstrate a commitment to operational excellence, secure coding standards, and best practices.
Requirements
- Bachelor’s degree in Computer Science, a relevant technical field, or equivalent experience.
- Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake).
- Strong knowledge of SQL, Python, Docker, and Terraform (or similar IaC tools).
- Experience with Dataform or dbt.
- Strong knowledge of security best practices, data privacy, and GDPR compliance.
- Proven track record optimising data pipelines & query performance.
- Experience in building CI/CD pipelines for automated deployments.
Desirable Skills
- Familiarity with GCP products, especially BigQuery, Composer, and Cloud Functions.
- Knowledge of data science fundamentals (supervised/unsupervised learning, hyperparameter optimisation, model performance evaluation).
- Experience in building AI/ML models and ML pipelines.
- Hands-on experience with data visualisation and BI tools.
- Prior experience in the fintech sector working with open banking data.
Colleague Benefits
- Share options.
- 23 days annual leave, plus one for each year served (capped at 28).
- Birthday leave.
- Learning & development budget and time allocation.
- Flexible start and finish hours (start between 06:30 and 10:30 am).
- Life assurance at 5x salary.
- Health cash plan.
- Virtual GP appointments for you and your family.
- 24/7 helpline for physical and mental health support, counselling, and other wellbeing resources.
- Private medical insurance.
- Hybrid working between home and office.
- City centre location with a brand-new fit-out (when in the office).
- Buy-a-holiday scheme.
- Charity day.
- Charity contribution.
- Professional accreditation funding.
- Enhanced maternity, paternity & adoption leave pay.