As a Lead Data Engineer, you will design and maintain scalable data pipelines, ensure data quality, and partner with teams to implement data strategies. You will lead innovations and architect data models to enhance data accessibility and compliance across the organization.
Job Description
Embrace this pivotal role as an essential member of a high-performing team dedicated to reaching new heights in data engineering. Your contributions will be instrumental in shaping the future of one of the world's largest and most influential companies.
As a Lead Data Engineer at JPMorgan Chase on the Consumer & Community Banking Connected Commerce Travel Technology Team, you are an integral part of an agile team that works to enhance, build, and deliver large-scale data collection, storage, access, and analytics in a secure, stable, and scalable way. Leverage your deep technical expertise and problem-solving capabilities to drive significant business impact and tackle a diverse array of challenges spanning multiple data pipelines, data architectures, and other data consumers.
Job responsibilities
- Lead the design, development, and maintenance of robust, scalable cloud-based data processing pipelines and infrastructure, ensuring adherence to engineering standards, governance frameworks, and industry best practices.
- Architect and refine data models for large-scale datasets, optimizing for efficient storage, high-performance retrieval, and advanced analytics while upholding data integrity and quality.
- Partner with cross-functional teams to translate complex business requirements into effective, scalable data engineering solutions that drive organizational value.
- Champion a culture of innovation and continuous improvement, proactively identifying and implementing enhancements to data infrastructure, processing workflows, and analytics capabilities.
- Define and execute data strategy, including the development of enterprise data models and the management of end-to-end data infrastructure-from design and construction to installation and ongoing maintenance of large-scale processing systems.
- Drive data quality initiatives, ensure seamless data accessibility for analysts and data scientists, and maintain strict compliance with data governance and regulatory requirements.
- Align data engineering practices with business objectives, ensuring solutions are both technically sound and strategically relevant.
- Author, review, and approve technical requirements and architectural designs, and lead process re-engineering efforts to deliver cost-effective, high-impact business solutions.
Required qualifications, capabilities, and skills
- Expert in at least two programming languages (Python and Java)
- Expert in at least one distributed data processing framework (Spark)
- Expert in at least one cloud data lakehouse platform (AWS data lake services or Databricks, or alternatively Hadoop)
- Expert in at least one scheduling/orchestration tool (Airflow, or alternatively AWS Step Functions or similar)
- Expert with relational and NoSQL databases.
- Expert in data structures, data serialization formats (JSON, AVRO, Protobuf, or similar), and big-data storage formats (Parquet, Iceberg, or similar)
- Proficiency in microservices architecture, serverless computing, and container orchestration tools such as Docker and Kubernetes
- Proficiency in one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.)
- Experience with test-driven development (TDD) or behavior-driven development (BDD) practices, as well as working with continuous integration and continuous deployment (CI/CD) tools.
- Experience organizing and leading design workshops, coding sessions, and hackathons to promote a culture of excellence and innovation in data engineering.
- Expertise in architecting reusable, future-ready design patterns that address diverse use cases across the organization.
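As a purely illustrative sketch (not part of the role description), the data-quality and test-driven practices named above might look like the following, using a hypothetical record schema and field names chosen for this example:

```python
import json

def normalize_record(raw: str) -> dict:
    """Parse a JSON record and enforce basic data-quality rules.

    Hypothetical schema for illustration: every record must carry a
    non-empty 'id' and a numeric 'amount'; 'currency' defaults to 'USD'.
    """
    record = json.loads(raw)
    if not record.get("id"):
        raise ValueError("missing id")
    # Coerce amount to float; a non-numeric value fails loudly rather
    # than silently passing bad data downstream.
    record["amount"] = float(record["amount"])
    record.setdefault("currency", "USD")
    return record

# TDD-style checks: assertions written alongside the implementation.
assert normalize_record('{"id": "t1", "amount": "19.99"}') == {
    "id": "t1", "amount": 19.99, "currency": "USD"
}
```

In a real pipeline the same validation logic would typically run inside a distributed framework such as Spark and be exercised by a CI/CD test suite, but the shape of the checks is the same.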
Preferred qualifications, capabilities, and skills
- Hands-on experience with Infrastructure as Code (IaC) tools, preferably Terraform; experience with AWS CloudFormation is also valued.
- Proficiency in cloud-based data pipeline technologies such as Fivetran, dbt, Prophecy.io, or similar platforms.
- Experience with front-end development frameworks, ideally React; familiarity with Angular is also beneficial.
- Strong working knowledge of the Snowflake data platform.
- Experience in budgeting and resource allocation for data engineering projects.
- Proven ability to manage vendor relationships effectively.
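For illustration only (not part of the posting), the Infrastructure-as-Code style mentioned above might be sketched as a minimal Terraform fragment; the bucket name and region here are hypothetical:

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1" # assumption: region chosen for illustration
}

# Hypothetical S3 bucket serving as data-lake storage.
resource "aws_s3_bucket" "data_lake" {
  bucket = "example-data-lake-bucket" # hypothetical name
}

# Versioning enabled so object history is retained.
resource "aws_s3_bucket_versioning" "data_lake" {
  bucket = aws_s3_bucket.data_lake.id
  versioning_configuration {
    status = "Enabled"
  }
}
```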
Top Skills
Airflow
Avro
AWS CloudFormation
AWS Data Lake Services
AWS Step Functions
Databricks
dbt
Docker
Fivetran
Iceberg
Java
JSON
Kubernetes
Parquet
Protobuf
Python
Snowflake
Spark
Terraform