
Deutsche Bank

Associate - Data Engineer

Reposted 25 Days Ago
Magarpatta, Hadapsar, Pune, Maharashtra
Junior
Develop scalable data engineering solutions and optimize data processing systems. Collaborate with team members on data analytics and integration tasks, ensuring data reliability and availability.
Job Description:

Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud

Corporate Title: Associate

Location: Pune, India

Role Description

  • As a member of one of the internationally staffed agile teams of the Private Bank One Data Platform, you will join the "TDI PB Germany Enterprise & Data" division. The focus here is on the design, development, and provision of solutions for data warehousing, reporting, and analytics for the Private Bank, ensuring that the necessary data is available for operational and analytical purposes.
  • The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as the basis. With Google as a close partner, we are following Deutsche Bank’s cloud strategy with the aim of transferring or rebuilding a significant share of today’s on-prem applications to the Google Cloud Platform.

What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:

  • Best-in-class leave policy
  • Gender-neutral parental leave
  • 100% reimbursement under the childcare assistance benefit (gender neutral)
  • Sponsorship for industry-relevant certifications and education
  • Employee Assistance Program for you and your family members
  • Comprehensive hospitalization insurance for you and your dependents
  • Accident and term life insurance
  • Complimentary health screening for employees aged 35 and above

Your key responsibilities

  • Work as a Data Engineer within agile software development teams to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence.
  • Partner with Service/Backend Engineers to integrate data provided by legacy IT solutions into the databases you design and make it accessible to the services consuming that data.
  • Focus on the design and setup of databases, data models, and data transformations (ETL) for critical online banking business processes in the context of Customer Intelligence, financial reporting, and performance controlling.
  • Contribute to data harmonization as well as data cleansing.
  • Constantly learn and apply new technologies and programming languages in an evolving environment.
  • Build solutions that are highly scalable and can operate flawlessly under high-load scenarios.
  • Together with your team, run and develop your application self-sufficiently.
  • Collaborate with Product Owners and team members on the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
  • When you see a process running with high manual effort, automate it, optimizing not only our operating model but also freeing up more of your time for development.
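The automation responsibility above can be sketched in miniature. The snippet below replaces a hypothetical manual step (cleansing a raw CSV customer extract and loading it into a reporting table) with a script. The table name, columns, and data are invented for illustration, and SQLite stands in for a real warehouse database.

```python
import csv
import io
import sqlite3

def load_customer_extract(csv_text: str, conn: sqlite3.Connection) -> int:
    """Cleanse a raw CSV extract and load it into a reporting table.

    Rows with a missing customer_id are dropped; balances are parsed
    as floats. Table and column names are illustrative only.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer_balance ("
        "customer_id TEXT PRIMARY KEY, balance REAL)"
    )
    loaded = 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        customer_id = (row.get("customer_id") or "").strip()
        if not customer_id:
            continue  # cleansing: skip rows without a key
        conn.execute(
            "INSERT OR REPLACE INTO customer_balance VALUES (?, ?)",
            (customer_id, float(row["balance"])),
        )
        loaded += 1
    conn.commit()
    return loaded

raw = "customer_id,balance\nC1,100.50\n,12.00\nC2,7.25\n"
conn = sqlite3.connect(":memory:")
print(load_customer_extract(raw, conn))  # 2 rows survive cleansing
```

In practice a step like this would run on a schedule (for example via Airflow/Composer, mentioned below) rather than by hand.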

Your skills and experience

Mandatory Skills

  • Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
  • Excellent knowledge of SQL and NoSQL databases.
  • Experience working in a fast-paced and Agile work environment.
  • Working knowledge of a public cloud environment.
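As a small illustration of the SQL fluency asked for above, the following self-contained snippet runs a typical reporting aggregation (monthly net flow per customer) against a hypothetical transactions table. SQLite via Python's standard library keeps it runnable anywhere; all names and figures are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (
        customer_id TEXT,
        amount REAL,
        booked_on TEXT
    );
    INSERT INTO transactions VALUES
        ('C1', 100.0, '2024-01-05'),
        ('C1', -25.0, '2024-01-20'),
        ('C2',  40.0, '2024-02-02');
""")

# Monthly net flow per customer: a typical reporting aggregation.
rows = conn.execute("""
    SELECT customer_id,
           substr(booked_on, 1, 7) AS month,
           SUM(amount)             AS net_flow
    FROM transactions
    GROUP BY customer_id, month
    ORDER BY customer_id, month
""").fetchall()
print(rows)  # [('C1', '2024-01', 75.0), ('C2', '2024-02', 40.0)]
```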

Preferred Skills

  • Experience in Dataflow (Apache Beam)/Cloud Functions/Cloud Run
  • Knowledge of workflow management tools such as Apache Airflow/Composer.
  • Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
  • Knowledge of GCS buckets, Google Pub/Sub, and BigQuery.
  • Knowledge of ETL processes in Data Warehouse/Data Lake environments and how to automate them.
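Dataflow jobs written with Apache Beam compose a pipeline of transforms such as ParDo, GroupByKey, and Combine. As a library-free sketch of that shape (this is plain Python, not the Beam API, and the event fields are invented), a keyed aggregation might look like:

```python
from collections import defaultdict
from typing import Iterable, List, Tuple

# A toy "pipeline" mirroring the Map -> GroupByKey -> Combine shape of
# an Apache Beam job, written with plain Python for illustration only.
def run_pipeline(events: Iterable[dict]) -> List[Tuple[str, float]]:
    # Map step: emit (key, value) pairs, filtering malformed events.
    keyed = (
        (e["country"], e["amount"])
        for e in events
        if "country" in e and isinstance(e.get("amount"), (int, float))
    )
    # GroupByKey step: collect values per key.
    groups = defaultdict(list)
    for key, value in keyed:
        groups[key].append(value)
    # Combine step: reduce each group to a single aggregate.
    return sorted((k, sum(v)) for k, v in groups.items())

events = [
    {"country": "DE", "amount": 10.0},
    {"country": "IN", "amount": 5.0},
    {"country": "DE", "amount": 2.5},
    {"amount": 1.0},  # malformed: no key, filtered out
]
print(run_pipeline(events))  # [('DE', 12.5), ('IN', 5.0)]
```

In an actual Beam pipeline the same three steps would become transforms applied to a PCollection, letting Dataflow distribute the work.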

Nice to have

  • Knowledge of provisioning cloud resources using Terraform.
  • Knowledge of Shell Scripting.
  • Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
  • Knowledge of Google Cloud Monitoring & Alerting.
  • Knowledge of Cloud Run, Dataform, and Cloud Spanner.
  • Knowledge of Data Warehouse solutions such as Data Vault 2.0.
  • Knowledge of New Relic.
  • Excellent analytical and conceptual thinking.
  • Excellent communication skills, strong independence and initiative, ability to work in agile delivery teams.
  • Experience working with distributed teams (especially across Germany and India).

How we’ll support you

  • Training and development to help you excel in your career.
  • Coaching and support from experts in your team.
  • A culture of continuous learning to aid progression.
  • A range of flexible benefits that you can tailor to suit your needs.

About us and our teams

Please visit our company website for further information:

https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.

Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.

We welcome applications from all people and promote a positive, fair and inclusive work environment.

Top Skills

Apache Airflow
Apache Beam
BigQuery
Cloud Functions
Data Vault 2.0
Docker
Git
Google Cloud Platform
Java
Kubernetes
NoSQL
Python
SQL
Terraform

Similar Jobs

14 Days Ago
Mumbai, Maharashtra, IND
Mid level
Mid level
Big Data • Cloud • Fintech • Financial Services • Conversational AI
The role involves developing data products, managing data pipelines, collaborating with teams for data-driven insights, and contributing to data operations.
Top Skills: APIsBigQueryEltETLFastapiGoogle Cloud PlatformPythonPython FlaskReactSnowflakeSQL
25 Days Ago
Mumbai, Maharashtra, IND
Senior level
Senior level
Big Data • Cloud • Fintech • Financial Services • Conversational AI
The Enterprise Data Platform Engineer engages in refining requirements, developing and maintaining data platforms, automating data pipelines, and enhancing performance.
Top Skills: AirflowAzureDbtKafkaPythonSnowflakeSQL
25 Days Ago
Mumbai, Maharashtra, IND
Senior level
Senior level
Fintech • Financial Services
Lead and manage a team of Observability Analytics Engineers, ensuring timely project delivery, high-quality data solutions, and effective stakeholder communication.
Top Skills: FlinkJavaJavaScriptPower BIPythonSnowflakeSparkSplunkSQLTableau

