Principal Reporting and Data Engineer

The Aztec Group
Southampton
3 days ago

At Aztec, we provide talented and dynamic individuals with the opportunity to build a rewarding career. We’re an ambitious company, committed to building long-term partnerships with our clients and delivering operational excellence at every stage of their fund’s lifecycle. Our culture is what makes us a standout place to work. Our people are at the heart of our business: putting clients first; delivering service excellence; working as one team; building lasting connections; and embodying our values and purpose. Join our journey and discover what makes us the bright alternative.

About the role:

The Senior Manager – Principal Reporting & Data Engineer is the senior technical authority for complex client reporting and structured data architecture derived from eFront Invest. This role defines the standards, patterns, and technical approach for extracting, modelling, and delivering high-complexity reporting outputs and two-way client data exchanges between Aztec and client platforms. It ensures solutions are scalable, controlled, auditable, and aligned to Aztec’s data governance and ISAE control environment. Operating on the technical career track, the role provides domain technical leadership without direct line management responsibility, influencing delivery quality through design oversight, standards, lifecycle governance, and mentoring.

Key Responsibilities:

1. Reporting & Data Architecture Ownership
* Define and maintain design standards and architectural principles for complex reporting and structured data extraction from eFront Invest.
* Act as design authority for high-complexity and high-risk reporting requirements, ensuring solutions are robust, performant, and reusable.
* Establish reusable reporting frameworks, extraction patterns, and transformation approaches to reduce bespoke build and improve scalability.
* Drive performance optimisation of queries and reporting pipelines, ensuring predictable and efficient execution in production.

2. Full Delivery Lifecycle Oversight (Dev → SIT → UAT → Prod)
* Provide technical oversight of the full reporting development lifecycle, including design, build, validation and controlled promotion through Development → SIT → UAT → Production.
* Define test strategies, test data requirements, and evidence standards for SIT/UAT cycles, ensuring audit-ready documentation is retained.
* Coordinate with operational stakeholders to ensure release readiness, cutover plans, and controlled deployment/rollback procedures where applicable.

3. eFront Invest Data Model Authority
* Serve as subject-matter expert in eFront Invest backend data structures, including fund/entity hierarchies, capital activity, allocations, valuation and performance.
* Define structured approaches to accessing and modelling eFront data for reporting, downstream datasets, and client consumption.
* Provide technical guidance during new client onboarding, complex fund launches, and major reporting changes where data integrity is critical.
* Influence platform configuration decisions where they materially impact reporting outcomes, data integrity, or auditability.

4. Two-Way Client Data Exchanges & Integration Design
* Define technical patterns for two-way integrations between Aztec and client platforms, including:
  + Outbound reporting/data delivery (e.g., scheduled extracts, datasets, dashboards)
  + Inbound client data ingestion (e.g., client reference data, portfolio data, enrichment datasets)
* Support secure data exchange approaches via:
  + SFTP feeds and scheduled extracts/uploads
  + API-based delivery and ingestion where applicable
  + DDS / data platform outputs for structured client consumption
* Establish reconciliation, completeness checks, validation controls, and exception-handling standards for both inbound and outbound data flows.
* Work closely with Integration Engineers to ensure interfaces are resilient, monitored, and aligned to enterprise integration standards.

5. Stakeholder & Client Collaboration
* Work in close partnership with Client Facing Teams (CFT) to define reporting requirements, prioritise deliverables, and support client-facing commitments.
* Engage directly with key clients (as required) to validate requirements, explain data structures, confirm mapping decisions, and ensure mutual understanding of deliverables.
* Collaborate across MTS with solution architects, application SMEs/platform owners (eFront and adjacent systems), data engineers, and reporting teams to ensure end-to-end alignment.

6. Governance, Controls & Audit Readiness
* Ensure reporting designs and outputs comply with Aztec’s ISAE 3402 controls, change governance, and documentation standards.
* Define and enforce version control, peer review, testing evidence, and approval practices for complex reporting logic and data interfaces.
* Act as technical escalation point during audits or control reviews relating to reporting outputs, data extracts, reconciliations, and client exchanges.
* Contribute to control design and remediation where reporting logic, data sourcing, or interfaces present operational risk.

7. Technical Leadership & Capability Uplift
* Set technical standards for SQL development, naming conventions, and data transformation approaches across the reporting community.
* Mentor and coach reporting engineers and analysts through technical design reviews and best-practice guidance.
* Reduce key-person dependency by improving documentation, reusable components, and repeatable delivery approaches.
* Identify opportunities to standardise recurring client reporting requirements into scalable templates and assets.

Skills, Knowledge & Expertise:
* Deep expertise in eFront Invest data structures, reporting logic, and private markets data semantics.
* Advanced SQL and structured data modelling capability, including performance tuning and scalable query design.
* Strong understanding of private markets fund accounting concepts (capital activity, allocations, valuation/performance, investor reporting).
* Experience designing and supporting two-way client data exchanges and controlled reporting pipelines.
* Strong documentation discipline and ability to operate effectively within audit-controlled environments.
* Strong stakeholder engagement capability, including working with CFTs and directly with clients.

Qualifications & Experience:
* Typically 7–10+ years’ experience in reporting, data engineering, or financial systems roles within private markets, fund administration, or financial services.
* Proven track record of delivering complex client reporting and structured data outputs, including SIT/UAT/Prod release discipline.
* Degree in Finance, Data, Information Systems, Engineering, Mathematics, or equivalent professional experience.

Career Development & Opportunity:

This role is a key technical leadership position within Markets Technology Services. It offers progression into broader platform architecture or technology leadership pathways, with scope to expand technical ownership across additional data domains and enterprise client delivery services as Aztec scales its reporting automation and data-as-a-service capabilities.

Aztec will provide training, both in-house for relevant technical knowledge and through professional qualifications to enhance your professional development. You will need to be quick to learn new systems and great with people, as close working relationships between our colleagues and clients are at the heart of what we do.
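To make the reconciliation and completeness checks described under the two-way data exchange responsibilities more concrete, here is a minimal illustrative sketch in Python. It is not Aztec's actual tooling; the function name `reconcile_extract` and the `amount` field are assumptions chosen for the example. It compares row counts and a monetary control total between a source dataset and a delivered extract, collecting exceptions for follow-up.

```python
# Illustrative sketch (hypothetical, not Aztec's production tooling):
# a minimal completeness/reconciliation check for an outbound data extract.
from dataclasses import dataclass, field

@dataclass
class ReconciliationResult:
    row_count_match: bool
    control_total_match: bool
    exceptions: list = field(default_factory=list)

    @property
    def passed(self) -> bool:
        # The extract is accepted only if every check passes.
        return self.row_count_match and self.control_total_match

def reconcile_extract(source_rows, delivered_rows, amount_key="amount"):
    """Compare a source dataset against a delivered extract using
    a row count and a rounded control total on `amount_key`."""
    exceptions = []

    # Completeness check: every source row should be delivered.
    row_count_match = len(source_rows) == len(delivered_rows)
    if not row_count_match:
        exceptions.append(
            f"Row count mismatch: source={len(source_rows)}, "
            f"delivered={len(delivered_rows)}"
        )

    # Control total check: monetary totals must agree to 2 d.p.
    source_total = round(sum(r[amount_key] for r in source_rows), 2)
    delivered_total = round(sum(r[amount_key] for r in delivered_rows), 2)
    control_total_match = source_total == delivered_total
    if not control_total_match:
        exceptions.append(
            f"Control total mismatch: source={source_total}, "
            f"delivered={delivered_total}"
        )

    return ReconciliationResult(row_count_match, control_total_match, exceptions)
```

In practice a check like this would sit at the end of an SFTP or API delivery step, with any exceptions routed into the exception-handling process rather than silently discarded.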

