Data Engineer
Position: Data Engineer
Location: Cambridge / Luton, UK (Hybrid, 2-3 days onsite per week)
Duration: Long Term B2B Contract
Job Description:
The ideal candidate has a minimum of 5 years of experience and a strong background in Snowflake, DBT, Python, and AWS, delivering ETL/ELT pipelines across a range of data sources.
Proficiency in Snowflake data warehouse architecture; design, build, and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.
Experience with DBT for data transformation and modelling; implement data transformation workflows using DBT (Core/Cloud). A minimal invocation sketch appears after this list.
Strong Python programming skills for automation and data processing; leverage Python to create automation scripts and optimize data processing tasks (see the connector sketch after this list).
Proficiency in SQL performance tuning and query optimization techniques in Snowflake (see the query-plan sketch after this list).
Troubleshoot and optimize DBT models and Snowflake performance.
Knowledge of CI/CD and version control (Git) tools; experience with orchestration tools such as Airflow (a sample DAG sketch follows this list).
Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment.
Ensure data quality, reliability, and consistency across different environments.
Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
Certification in AWS, Snowflake, or DBT is a plus.
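For illustration, a minimal sketch of invoking a DBT run programmatically from Python, assuming dbt-core 1.5+ and a project already configured with a Snowflake profile; the model name stg_orders is a placeholder:

```python
# Minimal sketch: programmatic dbt invocation (dbt-core >= 1.5).
# Assumes a dbt project with a Snowflake profile already set up;
# "stg_orders" is a hypothetical model name.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Equivalent to `dbt run --select stg_orders` on the command line.
res: dbtRunnerResult = dbt.invoke(["run", "--select", "stg_orders"])

if not res.success:
    raise RuntimeError(f"dbt run failed: {res.exception}")

# Report the status of each executed model node.
for r in res.result:
    print(f"{r.node.name}: {r.status}")
```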
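Similarly, a minimal sketch of the kind of Python automation involved, using the snowflake-connector-python package; the warehouse, database, stage, and table names are placeholders, and credentials are read from the environment:

```python
# Minimal sketch: loading staged files into a Snowflake table.
# All object names below are placeholders; credentials come from
# environment variables rather than being hard-coded.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="RAW",            # hypothetical database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # COPY INTO returns one result row per loaded file.
    cur.execute(
        "COPY INTO raw.orders FROM @raw_stage/orders/ "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```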
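For SQL tuning, a common first step is reading the query plan before rewriting anything; a sketch along the same lines, where the query and its tables are again placeholders:

```python
# Minimal sketch: pulling a Snowflake query plan with EXPLAIN.
# The query below is hypothetical; the point is to check pruning
# and join order before rewriting SQL or resizing warehouses.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
query = """
    SELECT o.customer_id, SUM(o.amount) AS total_amount
    FROM raw.orders o
    WHERE o.order_date >= '2024-01-01'  -- filter on a pruned column
    GROUP BY o.customer_id
"""
cur = conn.cursor()
cur.execute(f"EXPLAIN USING TEXT {query}")
for row in cur.fetchall():
    print(row[0])
conn.close()
```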
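Finally, a minimal sketch of orchestrating the DBT steps with Airflow (2.4+ assumed for the schedule argument; the DAG id and project path are placeholders), running models first and then validating them with tests:

```python
# Minimal sketch: a daily Airflow DAG that runs, then tests, a
# dbt project. Paths, DAG id, and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test",
    )

    # Run models first, then validate them with tests.
    dbt_run >> dbt_test
```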
Key Skills: Snowflake, DBT, Python, ETL