
Capco

DBT Developer

Reposted 20 Days Ago
Hybrid
Pune, Maharashtra
Mid level

Job Title:  DBT Developer - Pune

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.



Location: Pune

Work Mode: Hybrid (3 days WFO - Tues, Wed, Thurs)

Shift Time: 12:30 PM to 9:30 PM


Job Summary

We are seeking a skilled and detail-oriented DBT Engineer to join our cross-functional Agile team. In this role, you will be responsible for designing, building, and maintaining modular, reliable data transformation pipelines using dbt (Data Build Tool) in a Snowflake environment. You will collaborate closely with backend and frontend engineers, product managers, and analysts to create analytics-ready data models that power application features, reporting, and strategic insights. This is an exciting opportunity for someone who values clean data design, modern tooling, and working at the intersection of engineering and business.

Key Responsibilities

  • Design, build, and maintain scalable, modular dbt models and transformation pipelines using dbt Core; dbt Cloud experience is a plus.
  • Demonstrate a thorough understanding of dbt architecture, with experience writing Python models within dbt flows and strong experience writing Jinja code, macros, seeds, etc.
  • Write SQL to transform raw data into curated, tested datasets in Snowflake.
  • Knowledge of data modeling techniques like data vault and dimensional modeling (Kimball/Inmon).
  • Collaborate with full-stack developers and UI/UX engineers to support application features that rely on transformed datasets.
  • Work closely with analysts and stakeholders to gather data requirements and translate them into reliable data models.
  • Enforce data quality through rigorous testing, documentation, and version control in dbt.
  • Participate in Agile ceremonies (e.g., stand-ups, sprint planning) and manage tasks using Jira.
  • Integrate dbt into CI/CD pipelines and support automated deployment practices.
  • Monitor data performance and pipeline reliability, and proactively resolve issues.
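As a concrete illustration of the dbt work described above, a model combining SQL, Jinja, and an incremental materialization might look like the following sketch. The model name, the `stg_orders` reference, and the columns are hypothetical, not part of any actual Capco codebase:

```sql
-- models/marts/fct_orders.sql (illustrative sketch)
{{ config(materialized='incremental', unique_key='order_id') }}

with orders as (
    -- ref() resolves to the upstream staging model and records lineage
    select * from {{ ref('stg_orders') }}
)

select
    order_id,
    customer_id,
    order_date,
    amount
from orders
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what already exists
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

Tests and documentation for such a model would typically live alongside it in a `schema.yml` file, enforced with `dbt test` or `dbt build`.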

Mandatory Qualifications & Skills

  • 3–5 years of experience in data engineering or analytics engineering, with a focus on SQL-based data transformation.
  • Hands-on production experience using dbt Core or dbt Cloud as a primary development tool.
  • Strong command of SQL and solid understanding of data modeling best practices (e.g., star/snowflake schema).
  • Proven experience with Snowflake as a cloud data warehouse.
  • Python skills for data pipeline integration or ingestion.
  • Familiarity with Git-based version control workflows.
  • Strong communication and collaboration skills, with the ability to work across engineering and business teams.
  • Experience working in Agile/Scrum environments and managing work using Jira.
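To illustrate the star-schema modeling called out above, here is a minimal, self-contained Python sketch that splits a denormalized feed into a dimension and a fact table, using SQLite as a stand-in for Snowflake. All table and column names are hypothetical:

```python
import sqlite3

# Hypothetical denormalized "staging" feed, loaded into an in-memory database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE stg_orders (
        order_id INTEGER, customer_name TEXT, order_date TEXT, amount REAL
    )
""")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?, ?)",
    [
        (1, "Asha", "2024-01-05", 120.0),
        (2, "Asha", "2024-01-07", 80.0),
        (3, "Ravi", "2024-01-07", 200.0),
    ],
)

# Dimension table: one row per customer, with a surrogate key.
cur.execute("""
    CREATE TABLE dim_customer AS
    SELECT ROW_NUMBER() OVER (ORDER BY customer_name) AS customer_key,
           customer_name
    FROM (SELECT DISTINCT customer_name FROM stg_orders)
""")

# Fact table: one row per order, keyed to the dimension.
cur.execute("""
    CREATE TABLE fct_orders AS
    SELECT o.order_id, d.customer_key, o.order_date, o.amount
    FROM stg_orders o
    JOIN dim_customer d USING (customer_name)
""")

# Analytical query over the star schema: total spend for one customer.
total = cur.execute(
    "SELECT SUM(amount) FROM fct_orders f "
    "JOIN dim_customer d USING (customer_key) "
    "WHERE d.customer_name = 'Asha'"
).fetchone()[0]
print(total)  # prints 200.0
```

In a real dbt project, each `CREATE TABLE ... AS SELECT` would instead be its own dbt model, with the joins expressed through `ref()` calls.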

Nice-to-Have Skills

  • Knowledge of data orchestration tools (e.g., Apache Airflow) is a big plus.
  • Exposure to CI/CD pipelines and integrating dbt into automated workflows.
  • Experience with cloud platforms such as AWS.
  • Familiarity with Docker and container-based development.
  • Understanding of how data is consumed in downstream analytics tools (e.g., Looker, Tableau, Power BI).
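As a sketch of the CI/CD integration mentioned above, a dbt project might run its models and tests on every pull request with a workflow like the one below. The workflow name, target, and secret names are assumptions, and the exact setup varies by project:

```yaml
# .github/workflows/dbt.yml (illustrative sketch)
name: dbt CI
on: [pull_request]
jobs:
  dbt-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      - run: dbt deps
      # `dbt build` runs models, tests, seeds, and snapshots in DAG order
      - run: dbt build --target ci
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
```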

Preferred Experience

  • A track record of building and maintaining scalable dbt projects in a production setting.
  • Experience working in cross-functional teams involving developers, analysts, and product managers.
  • A strong sense of ownership, documentation habits, and attention to data quality and performance.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Top Skills

Agile
AWS
dbt
Docker
Git
JIRA
Snowflake
SQL
