
Capco

Platform Engineer

Hybrid
Pune, Maharashtra
Senior level

Job Title: Sr. Platform Engineer

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We have been awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspectives gives us a competitive advantage.


JOB SUMMARY:

  • Position: Sr Consultant
  • Location: Pune / Bangalore / Mumbai / Chennai / Hyderabad / Gurugram
  • Band: M3/M4 (7 to 14 years)

Role Description:

  • Design, provision, and manage secure, scalable, and high-performance Azure Databricks platforms tailored to support enterprise-wide data transformation for insurance data workloads.
  • Collaborate with architects, engineers, and security teams to define and implement robust infrastructure standards, ensuring reliable connectivity and integration with legacy systems, cloud data sources, and third-party platforms.
  • Implement Infrastructure as Code solutions (e.g., Terraform) to streamline provisioning and configuration of Azure and Databricks resources, supporting DevOps best practices (see the illustrative sketch after this list).
  • Automate environment deployment, monitoring, and incident response workflows using GitHub Actions to increase consistency, traceability, and efficiency.
  • Monitor platform health, resource utilization, and performance; anticipate scaling needs and conduct regular tuning to maintain optimal operation for data pipelines and analytics workloads.
  • Enforce security and compliance with enterprise and regulatory standards, including RBAC, managed identities, encryption, PDPO, GDPR, and other insurance-specific requirements.
  • Oversee integration of Informatica tools to support data governance, including cataloguing, data lineage, and compliance checks across the platform.
  • Document infrastructure architecture, network topology, security configurations, and operational runbooks to support ongoing governance, audit, and handover.
  • Troubleshoot infrastructure issues, perform root cause analysis, and drive resolution and continuous improvement for platform reliability.
  • Stay current with new Azure, Databricks, and DevOps features, continuously recommending and implementing enhancements to platform capabilities, cost-effectiveness, and overall effectiveness for Hong Kong Life and General Insurance business priorities.
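For orientation only, here is a minimal Terraform sketch of the kind of Azure Databricks provisioning the Infrastructure as Code responsibility above refers to. The resource group, region, workspace name, and SKU are assumptions made for the example, not details taken from this posting.

    terraform {
      required_providers {
        azurerm = {
          source  = "hashicorp/azurerm"
          version = "~> 3.0"
        }
      }
    }

    provider "azurerm" {
      features {}
    }

    # Hypothetical resource group for the sketch.
    resource "azurerm_resource_group" "data_platform" {
      name     = "rg-databricks-example"
      location = "centralindia"
    }

    # Premium SKU is assumed here because features such as role-based
    # access control are typically tied to it.
    resource "azurerm_databricks_workspace" "example" {
      name                = "dbw-insurance-example"
      resource_group_name = azurerm_resource_group.data_platform.name
      location            = azurerm_resource_group.data_platform.location
      sku                 = "premium"
    }

In practice, a definition like this would usually be run from a GitHub Actions workflow (terraform plan on pull request, terraform apply on merge), which is how the automation and traceability points above typically fit together.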


Requirements:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience designing, deploying, and administering Azure cloud platforms, with a focus on supporting Azure Databricks for insurance data and analytics workloads.
  • Deep expertise in provisioning, configuring, and managing Azure Databricks clusters and workspaces, including supporting the processing and storage of structured, semi-structured, and unstructured insurance data.
  • Skilled in integrating Azure Data Factory and Azure Data Lake Storage Gen2 with Databricks for seamless, automated data flows.
  • Proficient in using infrastructure-as-code tools (Terraform) for automated deployment and configuration of Azure and Databricks services (see the sketch after this list).
  • Experience deploying and integrating Informatica solutions for comprehensive metadata management, cataloguing, and governance.
  • Strong understanding of platform security (RBAC, NSG, managed identities, Key Vault), monitoring, alerting, and cost optimization in regulated insurance environments.
  • Hands-on with GitHub Actions for CI/CD pipeline automation related to platform and pipeline deployments.
  • Experience with platform incident response, troubleshooting, and performance optimization for mission-critical insurance data workloads.
  • Excellent documentation, collaboration, and communication skills to support technical and business users in the insurance domain.
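As a companion sketch for the Terraform requirement above, cluster configuration can also be expressed through the Databricks Terraform provider. The cluster name, runtime version, node type, and autoscale bounds below are assumptions for illustration, not values from this posting.

    terraform {
      required_providers {
        databricks = {
          source = "databricks/databricks"
        }
      }
    }

    # The provider would normally be pointed at the workspace provisioned via
    # azurerm (for example its workspace URL) together with Azure authentication.
    provider "databricks" {}

    resource "databricks_cluster" "etl" {
      cluster_name            = "insurance-etl-example" # hypothetical name
      spark_version           = "14.3.x-scala2.12"      # assumed LTS runtime
      node_type_id            = "Standard_DS3_v2"       # assumed Azure VM size
      autotermination_minutes = 30

      autoscale {
        min_workers = 2
        max_workers = 8
      }
    }

Auto-termination and autoscaling are included because they are the usual levers for the cost-optimization point listed above.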

WHY JOIN CAPCO? 

You will work on engaging projects with some of the largest banks in the world, on projects that will transform the financial services industry.

We offer:

  • A work culture focused on innovation and creating lasting value for our clients and employees
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A diverse, inclusive, meritocratic culture

#LI-Hybrid

Top Skills

Azure
Azure Data Factory
Azure Data Lake Storage Gen2
Databricks
GitHub Actions
Informatica
Terraform
