
FNZ Group

Data Engineer

Posted Yesterday
In-Office
Pune, Maharashtra, IND
Mid level
The Data Engineer will develop and maintain the Analytical Warehouse, focusing on data ingestion, transformation pipelines, and ensuring data quality for analytics in wealth management.

Job Title: Data Engineer — Analytical Warehouse (FNZ)

About FNZ:

FNZ is a global fintech firm transforming the way financial institutions serve their clients. By combining cutting-edge technology, infrastructure, and investment operations, FNZ enables wealth management firms to deliver personalized investment solutions at scale. Operating across multiple regions and supporting over $1.5 trillion in assets under administration, FNZ partners with leading banks, insurers, and asset managers to create seamless and innovative wealth platforms that empower millions of investors worldwide.

Job Summary:

We are seeking a hands-on Data Engineer to build and maintain the Analytical Warehouse on Microsoft Fabric. This role focuses on developing data pipelines that ingest enriched Gold-layer data from the NRT-ODS streaming platform into OneLake, building transformation layers using SQL-based transformation frameworks or Fabric notebooks, and delivering analytical datasets to wealth management clients. You will work at the intersection of the real-time ODS and the analytical lakehouse, enabling historical analytics, business intelligence, and client-facing reporting.

Key Responsibilities:

• Kafka-to-Fabric Ingestion: Build and maintain the Kafka Connect sink connectors that write Gold topics from the NRT-ODS into Fabric OneLake in Delta/Parquet format. Ensure near real-time ingestion with automatic schema evolution via Avro-to-Delta mapping.
• Data Pipeline Development: Develop data transformation pipelines within Microsoft Fabric using Fabric notebooks (PySpark/Spark SQL), Dataflows, and Data Factory pipelines. Implement Bronze/Silver/Gold layering within the Analytical Warehouse.
• Data Transformations: Build and maintain SQL-based transformation models that convert raw ingested data into analytical datasets. Implement incremental models, snapshot tables, and materializations optimized for analytical query patterns.
• OneLake Storage Management: Design and manage OneLake storage structures — partition strategies (by date, entity type, client), file compaction, retention policies, and storage optimization for cost and query performance.
• Batch Extract Modernization: Migrate existing batch extract processes from SQL-driven CSV to Kafka-sourced Parquet via Fabric pipelines. Retain metadata-driven configuration from CentralHub while outputting to OneLake in Parquet/Delta format.
• Semantic Layer Development: Build semantic layer definitions for business-friendly metrics — AUM, NAV, trade volumes, fee breakdowns, client counts — ensuring consistent metric definitions across all consumption channels.
• Data Sharing: Implement Fabric Data Sharing using OneLake shortcuts or Delta Sharing for clients who consume analytics in their own Fabric tenant. Ensure governed access where clients see only their own data.
• Data Quality: Implement data quality checks within the Analytical Warehouse using Great Expectations or Soda. Validate row counts, null rates, referential integrity, and freshness against defined data contracts.
• Performance Optimization: Tune query performance across Fabric SQL endpoints, optimize Delta table layouts (Z-ordering, partitioning, file sizing), and manage compute resource allocation.
• CI/CD & DevOps: Implement CI/CD pipelines for Analytical Warehouse artifacts (transformation models, Fabric notebooks, pipeline definitions) using GitHub Actions. Follow GitOps practices for deployment.
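As one way to picture the partition strategy named above (by date, entity type, and client), here is a minimal Python sketch. The Hive-style path layout and the field names are illustrative assumptions, not FNZ's actual OneLake conventions:

```python
from datetime import date

def partition_path(table: str, business_date: date,
                   entity_type: str, client_id: str) -> str:
    """Build a Hive-style partition path for a Delta/Parquet table.

    The ordering (date -> entity type -> client) is one plausible
    choice; the right layout depends on the dominant query filters.
    """
    return (
        f"{table}/"
        f"business_date={business_date.isoformat()}/"
        f"entity_type={entity_type}/"
        f"client_id={client_id}"
    )

path = partition_path("gold_positions", date(2024, 6, 30),
                      "position", "client_001")
print(path)
# gold_positions/business_date=2024-06-30/entity_type=position/client_id=client_001
```

Partitioning on low-cardinality, frequently filtered columns like these keeps file counts manageable while letting engines prune partitions at query time.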
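The posting names Great Expectations or Soda for the data-quality checks above; as a framework-free sketch of what a row-count and null-rate check against a data contract amounts to, with thresholds and column names invented for illustration:

```python
def check_contract(rows: list[dict], *, min_rows: int,
                   max_null_rate: dict[str, float]) -> list[str]:
    """Validate a batch against a simple data contract.

    Returns human-readable violations (empty list means the batch
    passes). min_rows and max_null_rate stand in for the thresholds
    a real contract (a Great Expectations suite or Soda check) defines.
    """
    violations = []
    if len(rows) < min_rows:
        violations.append(f"row count {len(rows)} below minimum {min_rows}")
    for column, threshold in max_null_rate.items():
        nulls = sum(1 for r in rows if r.get(column) is None)
        rate = nulls / len(rows) if rows else 1.0
        if rate > threshold:
            violations.append(
                f"null rate {rate:.2%} in '{column}' exceeds {threshold:.2%}"
            )
    return violations

batch = [
    {"account_id": "A1", "nav": 100.0},
    {"account_id": "A2", "nav": None},
]
print(check_contract(batch, min_rows=2, max_null_rate={"nav": 0.25}))
# ["null rate 50.00% in 'nav' exceeds 25.00%"]
```

A real suite would add referential-integrity and freshness checks and publish results to monitoring, but the shape is the same: measured value versus contracted threshold.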

Qualifications:

• Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related technical field.
• Experience: 4+ years of hands-on experience in data engineering with a focus on analytical/warehouse workloads.
• Microsoft Fabric / Azure: Demonstrated experience with Microsoft Fabric, Azure Synapse Analytics, or Azure Data Factory. Familiarity with OneLake, Fabric notebooks, and Fabric SQL endpoints.
• SQL Expertise: Strong SQL skills including complex analytical queries, window functions, CTEs, and query performance tuning.
• Spark / PySpark: Proficiency in PySpark or Spark SQL for large-scale data transformations.
• Data Transformation Frameworks: Experience with SQL-based transformation frameworks for managing transformation layers — models, tests, documentation, and incremental materializations.
• Delta Lake / Parquet: Understanding of the Delta Lake table format — ACID transactions, time travel, schema evolution, partition management, and file compaction.
• Kafka Fundamentals: Working knowledge of Apache Kafka — consumer concepts, Kafka Connect, Avro serialization — sufficient to build and troubleshoot the ingestion layer from ODS Gold topics.
• CI/CD: Experience with CI/CD pipelines (GitHub Actions preferred) for data pipeline deployments.
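To make the window-function and CTE requirement above concrete, here is a small self-contained example using Python's built-in sqlite3 module (window functions need SQLite 3.25+); the positions table and its columns are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE positions (account_id TEXT, as_of_date TEXT, market_value REAL);
    INSERT INTO positions VALUES
        ('A1', '2024-06-28', 100.0),
        ('A1', '2024-06-30', 110.0),
        ('A2', '2024-06-30', 250.0);
""")

# Latest snapshot per account: rank each account's rows by date inside
# a CTE with a window function, then keep only the top-ranked row.
latest = conn.execute("""
    WITH ranked AS (
        SELECT account_id, as_of_date, market_value,
               ROW_NUMBER() OVER (
                   PARTITION BY account_id
                   ORDER BY as_of_date DESC
               ) AS rn
        FROM positions
    )
    SELECT account_id, market_value FROM ranked WHERE rn = 1
    ORDER BY account_id
""").fetchall()
print(latest)  # [('A1', 110.0), ('A2', 250.0)]
```

The same pattern (rank within a partition, filter to rank 1) is a common way to build latest-state or snapshot tables in an analytical warehouse, whatever the engine.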

Preferred Qualifications:

• Experience working in the Wealth Management or Financial Services industry with an understanding of investment data domains (accounts, portfolios, transactions, positions).
• Experience with the Apache Iceberg table format for time-travel queries and multi-engine access.
• Familiarity with data quality frameworks such as Great Expectations or Soda integrated into data pipelines.
• Experience with semantic layer tools for defining governed business metrics.
• Exposure to data catalog and lineage tools (Purview, Atlan, or similar).
• Microsoft Fabric certifications or the Azure Data Engineer certification (DP-203) are a plus.

About FNZ

FNZ is committed to opening up wealth so that everyone, everywhere can invest in their future on their terms. We know the foundation to do that already exists in the wealth management industry, but complexity holds firms back. 

We created wealth’s growth platform to help. We provide a global, end-to-end wealth management platform that integrates modern technology with business and investment operations. All in a regulated financial institution. 

We partner with the world’s leading financial institutions, with over US$2.4 trillion in assets on platform (AoP).
Together with our clients, we empower nearly 30 million people across all wealth segments to invest in their future.

Top Skills

Apache Kafka
Azure Data Factory
Azure Synapse Analytics
Delta Lake
GitHub Actions
Microsoft Fabric
Parquet
PySpark
Spark SQL
SQL


