EC Markets is building a state-of-the-art in-house data centre to power our trading, operational, and marketing intelligence.
We are seeking a Senior Data Engineer to lead the technical build-out of this initiative, from designing ETL pipelines to creating a secure, scalable data warehouse that underpins business-critical reporting and analytics. This is a high-impact role at the intersection of technology and financial insight, ideal for a senior professional with deep data engineering capabilities, proven experience supporting financial or trading-driven organisations, and the ambition to grow into a Head of Data role.
Design, develop, and deploy a robust data infrastructure leveraging cloud-based services (AWS).
Build and maintain ETL pipelines feeding data from multiple internal systems (trading, CRM, RUM, finance) into a central Data Lakehouse.
Implement best practices for data ingestion, validation, transformation, and storage using modern cloud tools (e.g., Databricks, S3 Data Lakes, Spark, Redshift, AWS Glue); an illustrative sketch follows this list.
Work closely with data analysts to enable operational/regulatory reporting and data insights/visualisation.
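By way of illustration only, a minimal sketch of the ingestion–validation–transformation pattern such a pipeline might follow, written in PySpark; the bucket paths, column names, and schema below are hypothetical placeholders rather than EC Markets systems, and the actual toolset is for this role to define.

```python
# Minimal illustrative sketch only: source paths, columns, and output location
# are hypothetical placeholders, not EC Markets systems.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades_ingest_sketch").getOrCreate()

# Ingest: read raw trade exports landed in an S3 bucket (path is hypothetical).
raw = spark.read.json("s3://example-raw-bucket/trading/trades/2024-01-01/")

# Validate: keep only rows with the fields downstream reporting relies on.
valid = raw.filter(
    F.col("trade_id").isNotNull()
    & F.col("account_id").isNotNull()
    & (F.col("quantity") > 0)
)

# Transform: normalise types and add a load timestamp for lineage.
curated = (
    valid
    .withColumn("trade_date", F.to_date("executed_at"))
    .withColumn("notional", F.col("quantity") * F.col("price"))
    .withColumn("_ingested_at", F.current_timestamp())
)

# Store: write partitioned Parquet into the curated zone of the lakehouse
# (Delta on Databricks would be a drop-in alternative).
curated.write.mode("append").partitionBy("trade_date").parquet(
    "s3://example-curated-bucket/trading/trades/"
)
```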
Data Governance & Quality
Define and enforce data standards, quality assurance processes, and documentation across systems (an example automated check is sketched below).
Ensure system scalability, performance, and data privacy/security align with compliance and business requirements.
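As a rough sketch of the kind of automated quality gate referred to above, again with a hypothetical dataset, columns, and rules, a pipeline step might fail fast when checks do not pass:

```python
# Illustrative quality gate only: dataset, columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("quality_checks_sketch").getOrCreate()
df = spark.read.parquet("s3://example-curated-bucket/trading/trades/")

total = df.count()
checks = {
    "no_null_trade_ids": df.filter(F.col("trade_id").isNull()).count() == 0,
    "no_duplicate_trade_ids": df.select("trade_id").distinct().count() == total,
    "non_negative_notional": df.filter(F.col("notional") < 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Fail the pipeline run loudly so bad data never reaches reporting.
    raise ValueError(f"Data quality checks failed: {failed}")
```

In practice such rules would live alongside the pipeline definitions and be versioned, documented, and reviewed like any other code.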
Partner closely with trading, finance, marketing, and management teams to map data requirements and deliver analytics-ready structures.
Lead the technical implementation while coordinating with external vendors or internal IT teams as needed.
Define the core data architecture and choice of toolset.
Define data models.
Establish a secure and automated data development cycle.
Project Ownership
Drive the full lifecycle of the data centre project, establishing foundations for downstream reporting and BI functions.
Support future integration of a dedicated Data & Reporting Analyst role, planned for the following year.
Own data privacy and security aspects from a technology standpoint.
Requirements
Degree in Computer Science, Data Engineering, or a related quantitative field.
5–8 years of experience in data engineering or infrastructure development.
Experience in the financial services or fintech sector is highly desirable.
Hands-on experience designing and implementing ETL pipelines and data lake/warehouses in cloud environments.
Advanced knowledge of AWS or Azure data services (Databricks, Glue, Redshift, S3 Data Lakes, Spark, or equivalent).
Strong programming and data manipulation skills (Python, SQL).
Solid background in financial reporting, trading data, or analytics within financial markets.
Proven ability to manage complex, end-to-end data projects and collaborate cross-functionally across business units.