The GCP Data Engineer designs and maintains data pipelines using Python and SQL on GCP, ensures efficient batch processing, and implements CI/CD practices.
Job Description:
Job Title: GCP Data Engineer, AS
Location: Pune, India
Role Description
- An Engineer is responsible for designing and developing complete engineering solutions that accomplish business goals.
- Key responsibilities include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they integrate successfully into the end-to-end business process flow. The Engineer will have gained significant experience through multiple implementations and will have begun to develop both depth and breadth across several engineering competencies.
- They have extensive knowledge of design and architectural patterns.
- They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching of less experienced engineers.
What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for employees aged 35 and above
Your key responsibilities
- Design, develop, and maintain data pipelines using Python and SQL on GCP.
- Apply Agile methodologies and ETL, ELT, data movement, and data processing skills.
- Work with Cloud Composer to manage and process batch data jobs efficiently.
- Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Implement CI/CD pipelines using GitHub Actions.
- Consume and host REST APIs using Python.
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Quickly learn new and existing technologies; apply strong problem-solving skills.
- Write advanced SQL and Python scripts.
- Google Cloud Professional Data Engineer certification is an added advantage.
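As a sketch of the "consume and host REST APIs using Python" responsibility above, the following stdlib-only example hosts a tiny JSON endpoint and consumes it with `urllib`. The `/status` endpoint, pipeline name, and payload are illustrative assumptions, not details from this posting; a real role would likely use a framework such as Flask or FastAPI instead.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PipelineStatusHandler(BaseHTTPRequestHandler):
    """Serves a hypothetical /status endpoint for a data pipeline."""

    def do_GET(self):
        # Illustrative payload; a real service would query pipeline metadata.
        body = json.dumps({"pipeline": "daily_batch", "state": "SUCCESS"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

def fetch_status(port: int) -> dict:
    """Consume the REST endpoint and parse the JSON response."""
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/status") as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Port 0 asks the OS for any free port; serve in a background thread.
    server = HTTPServer(("127.0.0.1", 0), PipelineStatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    status = fetch_status(server.server_address[1])
    print(status["state"])  # prints SUCCESS
    server.shutdown()
```

The same hosting/consuming split applies whether the server is a local sketch like this or a service deployed on Cloud Run.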
Your skills and experience
- 6+ years of IT experience, as a hands-on technologist.
- Proficient in Python for data engineering.
- Proficient in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have.
- Hands-on experience in REST API hosting and consumption.
- Proficient in HashiCorp Terraform.
- Experienced in GitHub and GitHub Actions.
- Experienced in CI/CD.
- Experience in automating ETL testing using Python and SQL.
- Apigee (good to have).
- Bitbucket (good to have).
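The "automating ETL testing using Python and SQL" skill above can be sketched as SQL data-quality checks driven from Python. Here `sqlite3` stands in for BigQuery, and the `orders` table and its checks are illustrative assumptions, not requirements from this posting:

```python
import sqlite3

def run_quality_checks(conn: sqlite3.Connection, table: str) -> dict:
    """Run simple SQL data-quality checks against a loaded table.

    Returns one metric per named check; a test harness can then
    assert on the expected values.
    """
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "null_ids": f"SELECT COUNT(*) FROM {table} WHERE id IS NULL",
        "duplicate_ids": (
            f"SELECT COUNT(*) FROM ("
            f"SELECT id FROM {table} GROUP BY id HAVING COUNT(*) > 1)"
        ),
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

if __name__ == "__main__":
    # Build a tiny in-memory table with one NULL id and one duplicate id.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [(1, 9.5), (2, 12.0), (2, 12.0), (None, 3.0)],
    )
    results = run_quality_checks(conn, "orders")
    print(results)  # {'row_count': 4, 'null_ids': 1, 'duplicate_ids': 1}
```

In practice the same check dictionary could be pointed at BigQuery via its Python client and wired into a CI job so every pipeline change is validated automatically.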
How we’ll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
About us and our teams
Please visit our company website for further information:
https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.
Top Skills
Apigee
BigQuery
Bitbucket
CI/CD
Cloud Composer
Cloud Functions
Cloud Run
Dataflow
GCP
GitHub Actions
Git
Python
REST API
SQL
Terraform

