Job Description
Position: Data Engineer
Location: Coventry, UK (Hybrid Work Arrangement)
Start Date: First or Second Week of September 2024
Experience: Minimum of 7 years
Employment Type: Permanent
About the Role:
Scrumconnect Consulting is seeking a highly skilled Data Engineer to join our innovative team. In this role, you will be responsible for building and optimizing our data pipelines and supporting the data needs of our business intelligence and analytics teams. If you have a passion for data and a knack for solving complex challenges, we invite you to apply.
Key Responsibilities:
- Implement data flows to connect operational systems with data for analytics and business intelligence (BI) systems.
- Document source-to-target mappings and re-engineer manual data flows to enable scalability and repeatable use.
- Support the build of data streaming systems to handle large volumes of data in real time.
- Write ETL (Extract, Transform, Load) scripts and code to ensure optimal performance of the ETL processes.
- Develop reusable business intelligence reports and build accessible data for analysis.
- Collaborate with Data Scientists to build analytics tools that utilize the data pipeline to provide actionable insights for management, operational efficiency, and other key business performance metrics.
Essential Skills and Experience:
- Hands-on coding experience in R or Python, with a solid understanding of programming paradigms and concepts needed in Data Engineering.
- Expertise in NumPy, SciPy, pandas, and data visualization tools in Python, or equivalent packages in R.
- Strong knowledge of SQL, with experience working with both traditional RDBMS and distributed datasets.
- Deep understanding of cloud-based data pipelines, architectures, and data sets.
- Experience in writing complex queries against relational and non-relational data stores.
- Good understanding of Data Warehousing and Data Lakehouse concepts.
- Proven experience working with large data sets, data pipelines, and cloud services.
- Experience in building reliable, efficient data applications, systems, services, and platforms.
- Familiarity with big data tools such as Hadoop and Spark.
- Background in programming with open-source technologies, along with Python, Java, C++, or .NET.
- Good written and verbal communication skills with a strong desire to work in cross-functional teams.
- Awareness of data security best practices and GDPR compliance in handling personal data.
- Experience working in an agile environment or organizations with an agile culture.
Desirable Skills:
- Knowledge of Machine Learning and traditional Data Science concepts is beneficial but not mandatory.
- Understanding of design choices for data storage and data processing, particularly with a focus on cloud data services.
- Familiarity with technologies such as Apache Spark or Airflow.
- Experience using parallel computing to process large datasets and optimize computationally intensive tasks is advantageous but not mandatory.
Qualifications:
- Degree or equivalent in a relevant subject, such as Computer Science, Information Systems, or a related technical discipline.
Why Join Us:
- Competitive salary and benefits package.
- Opportunity to work with cutting-edge technologies in a dynamic environment.
- Collaborative and inclusive culture that values professional growth and continuous learning.
- Work-life balance with flexible working arrangements.
- Skilled Worker visa sponsorship is available for candidates outside the United Kingdom.
Requirements
What we are looking for: The interview will consist of behavioural questions following the STAR methodology, based on your past experience (see interview tips below), as well as a detailed technical interview across our main technology stack, described below.
Must have:
- Proven hands-on experience managing large cloud-based infrastructure deployments on Azure via IaC with Terraform.
- Proven hands-on experience managing VM deployments on Azure via Ansible.
- Proven hands-on experience implementing CI/CD pipelines to build, package, deploy, test, and promote both infrastructure and applications to higher environments, preferably via Azure DevOps (ADO).
- Detailed understanding of the main infrastructure building blocks and their design, how they all fit together, and how to manage them via code, including Virtual Networks, Routing, DNS, NSGs, Identity Management, Firewall Setup, Security Policies, and Secrets and Certificates management.
- Experience configuring SecOps tools, preferably SonarQube, Checkmarx, and PMD, and using pre-commit hooks across Git-based repositories to support a shift-left approach to the organisation's deliverables.
- Ability to resolve complex problems and work collaboratively with multiple technical and non-technical teams with different skill sets and backgrounds.
- Last but certainly not least: excellent verbal and written communication skills.
Nice to have:
- Experience with Linux and Windows system administration, including OS patching, permissions management, security settings, SSH/RDP enablement, and using debugging and tracing tools to triage common problems.
- Experience with at least one market-leading monitoring tool, including an understanding of how it operates and how it is configured and managed.