DataOps Engineer


Job details
  • Causaly
  • London

About us

Founded in 2018, Causaly accelerates how humans acquire knowledge and develop insights in Biomedicine. Our production-grade generative AI platform for research insights and knowledge automation enables thousands of scientists to discover evidence from millions of academic publications, clinical trials, regulatory documents, patents and other data sources… in minutes. 

We work with some of the world's largest biopharma companies and institutions on use cases spanning Drug Discovery, Safety and Competitive Intelligence. You can read more about how we accelerate knowledge acquisition and improve decision making in our blog posts here: Blog - Causaly 

We are backed by top VCs including ICONIQ, Index Ventures, Pentech and Marathon. 

Who we are looking for

We are looking for talented Data Engineers with a passion for DataOps and a demonstrable background in SQL and Python-based automation. You will join our Data & Semantic Technologies team, which is responsible for delivering the scalable, highly flexible data fabric that underpins Causaly's product suite and which enables new product development and AI innovation that create real business value. You will unlock the value of data for our customers by building and operating automated data pipelines, feeding our constantly growing data warehouse and knowledge graph, and evolving our data architectures. 

We are a multi-disciplinary team working in a fast-paced, collaborative environment, and we value honest opinions and open debate. Do you have a problem-solving mindset and a hands-on attitude? Are you keen to design and build innovative solutions that leverage the value of data? Are you passionate and creative in your work, happy to share ideas with your team, and able to pick the right tool for the job? Then you should become part of our journey! 

What you can expect to work on: 

  • Gather and understand data based on business requirements 
  • Regularly import and transform big data (millions of records) from various formats (e.g. CSV, SQL, JSON) into data stores like BigQuery and Neo4j (two illustrative sketches follow this list) 
  • Process data further using SQL and/or Python, e.g., to sanitise fields, aggregate records, or combine with external data sources 
  • Work with other engineers on highly performant data pipelines and efficient data operations, adhering to the industry’s best practices and technologies for scalability, fault tolerance and reliability 
  • Export data in well-defined target formats and schemata, ensure and validate data output and quality, produce corresponding reports and dashboards 
  • Manage and improve (legacy) data pipelines in the cloud, and enable other engineers to run them efficiently 
  • Innovate on our data warehouse architecture and usage 
  • Work directly with a multitude of technical, product and business stakeholders 
  • Mentor and guide junior members, shape our technology strategy and innovate on our data backbone 
  • Collaborate with the DevOps team to help manage our infrastructure 
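
To give a flavour of the import-and-transform work above, here is a minimal sketch of one such step: loading a CSV into BigQuery, then cleaning and aggregating it with SQL. It is illustrative only, not Causaly's actual pipeline; the project, bucket, dataset and table names are all hypothetical, and it assumes the google-cloud-bigquery Python client.

    from google.cloud import bigquery

    # Hypothetical project; in practice credentials come from the environment.
    client = bigquery.Client(project="example-project")

    # 1. Import: load a CSV from Cloud Storage into a staging table,
    #    letting BigQuery infer the schema from the header row.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/publications.csv",      # hypothetical source file
        "example-project.staging.publications_raw",  # hypothetical staging table
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
        ),
    )
    load_job.result()  # block until the load job completes

    # 2. Transform: sanitise a field and aggregate records in SQL,
    #    materialising the result as a curated table.
    client.query("""
        CREATE OR REPLACE TABLE `example-project.curated.publications_by_journal` AS
        SELECT
          TRIM(LOWER(journal)) AS journal,           -- sanitise a text field
          COUNT(*)             AS publication_count  -- aggregate records
        FROM `example-project.staging.publications_raw`
        GROUP BY journal
    """).result()

In a production pipeline these steps would typically be orchestrated and validated (row counts, schema checks, quality reports) rather than run ad hoc.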
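On the knowledge-graph side, here is a similarly minimal sketch of pushing cleaned records into Neo4j with the official Python driver. Again, everything here (the connection URI, credentials, the Publication label and its properties) is a hypothetical stand-in, not Causaly's actual schema.

    from neo4j import GraphDatabase

    # Hypothetical connection details.
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

    # Rows as produced by an upstream transform step (hypothetical shape).
    rows = [
        {"id": "pub-001", "journal": "nature biotechnology"},
        {"id": "pub-002", "journal": "the lancet"},
    ]

    # Batch the writes with UNWIND; MERGE keeps the load idempotent,
    # so re-running the pipeline does not create duplicate nodes.
    with driver.session() as session:
        session.run(
            "UNWIND $rows AS row "
            "MERGE (p:Publication {id: row.id}) "
            "SET p.journal = row.journal",
            rows=rows,
        )
    driver.close()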

Requirements

Minimum Requirements 

  • Significant industry experience working with SQL, automation, ETL, Linux 
  • Proven database skills and experience with traditional RDBMS like MySQL as well as modern systems like BigQuery 
  • Experience with data versioning, data-backup and data-recovery strategies 
  • Solid understanding of modern software-development practices (testing, version control, documentation, etc.) and hands-on coding experience in Python 
  • Experience with cloud computing providers like GCP/AWS 
  • Strong engineering background enabling rapid progression from ideation to proof-of-concept  
  • A product and user-centric mindset 
  • Excellent problem solving, ownership, organizational skills, with high attention to detail and quality 

Preferred Qualifications 

  • Experience with additional data-storage and retrieval technologies, such as Elasticsearch, data warehouses, NoSQL stores and Neo4j 
  • Command-line and Linux scripting skills in production 
  • Experience using DevOps tools and practices to build and deploy software 
  • Knowledge of Terraform, Kubernetes and/or Docker containers 
  • Programming skills and experience in other languages, such as Node.js 

Benefits

  • Competitive compensation package 
  • Private medical insurance (underwritten on a medical history disregarded basis) 
  • Life insurance (4 x salary) 
  • Individual training/development budget through Learnerbly 
  • Individual wellbeing budget through Juno 
  • 25 days holiday plus public holidays and 1 day birthday leave per year 
  • Hybrid working (home + office) 
  • Potential to have real impact and accelerated career growth as an early member of a multinational team that's building a transformative knowledge product 

Be yourself at Causaly... Difference is valued. Everyone belongs. 

Diversity. Equity. Inclusion. They are more than words at Causaly. It's how we work together. It's how we build teams. It's how we grow leaders. It's what we nurture and celebrate. It's what helps us innovate. It's what helps us connect with the customers and communities we serve. 

We are on a mission to accelerate scientific breakthroughs for ALL humankind, and we are proud to be an equal opportunity employer. We welcome applications from all backgrounds and fairly consider qualified candidates without regard to race, ethnic or national origin, gender, gender identity or expression, sexual orientation, disability, neurodiversity, genetics, age, religion or belief, marital/civil partnership status, domestic/family status, veteran status or any other difference. 

