
dentsu

Technical Lead

Posted 25 Days Ago
3 Locations
Senior level
We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team. The role is critical to the development of cutting-edge enterprise data products and solutions.
The GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Job Description:

Key Responsibilities:

Data Engineering & Development:

  • Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
  • Implement enterprise-level data solutions using GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer (an illustrative pipeline is sketched after this list).
  • Develop and optimize data architectures that support real-time and batch data processing.
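
For illustration only (not part of the posting): a minimal sketch of the kind of batch pipeline this role covers, written as an Airflow DAG of the sort deployed on Cloud Composer. It loads a daily file drop from Cloud Storage into BigQuery and then runs a SQL transform into a reporting table. The project, bucket, dataset, and table names are placeholders.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: land a daily CSV drop from
# Cloud Storage in a raw BigQuery table, then build a reporting summary.
# All project/bucket/dataset/table names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the partition for the execution date ({{ ds }}) from GCS into BigQuery.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.raw.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the raw rows into a reporting table with a standard SQL query.
    transform = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": """
                    SELECT store_id, SUM(amount) AS total_amount
                    FROM `example-project.raw.sales`
                    WHERE sale_date = '{{ ds }}'
                    GROUP BY store_id
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "reporting",
                    "tableId": "daily_sales_summary",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```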

Cloud Infrastructure Management:

  • Manage and deploy GCP infrastructure components to enable seamless data workflows.
  • Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Collaboration and Stakeholder Engagement:

  • Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
  • Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization:

  • Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
  • Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines (an example check is sketched after this list).
  • Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
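
For illustration only (not part of the posting): a minimal sketch of the kind of post-load validation check described above, using the BigQuery Python client. The table, date column, and key column names are hypothetical.

```python
# Post-load validation sketch: verify that a day's partition landed and
# contains no NULL business keys before downstream tables are refreshed.
from google.cloud import bigquery


def validate_daily_load(project: str, table: str, date_col: str, key_col: str, ds: str) -> None:
    client = bigquery.Client(project=project)
    sql = f"""
        SELECT
          COUNT(*) AS row_count,
          COUNTIF({key_col} IS NULL) AS null_keys
        FROM `{table}`
        WHERE {date_col} = @ds
    """
    job = client.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("ds", "DATE", ds)]
        ),
    )
    row = list(job.result())[0]
    if row.row_count == 0:
        raise ValueError(f"No rows loaded for {ds}")
    if row.null_keys > 0:
        raise ValueError(f"{row.null_keys} rows with NULL {key_col} for {ds}")


# Example (placeholder names):
# validate_daily_load("example-project", "example-project.raw.sales",
#                     "sale_date", "store_id", "2024-01-01")
```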

Qualifications and Certifications:

  • Education:
    • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience:
    • Minimum of 5 years of experience in data engineering, with at least 3 years working on GCP.
    • Proven experience designing and implementing data workflows using GCP services like BigQuery, Cloud Dataflow, Dataform, Cloud Pub/Sub, and Cloud Composer.
  • Certifications:
    • Google Cloud Professional Data Engineer certification preferred.

Key Skills:

  • Mandatory Skills:
    • Advanced proficiency in Python for data pipelines and automation (see the pipeline sketch after this list).
    • Strong SQL skills for querying, transforming, and analyzing large datasets.
    • Expertise in GCP services such as BigQuery, Cloud Functions, Cloud Storage, Dataflow, and Kubernetes (GKE), as well as dbt.
    • Hands-on experience with CI/CD tools such as Jenkins, Git, or Bitbucket.
    • Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer.
  • Nice-to-Have Skills:
    • Experience with other cloud platforms like AWS or Azure.
    • Knowledge of data visualization tools (e.g., Looker, Tableau).
    • Understanding of machine learning workflows and their integration with data pipelines.
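
For illustration only (not part of the posting): a minimal Apache Beam pipeline sketch of the sort that could run on Dataflow, combining the Python, SQL/BigQuery, and GCP skills listed above. The project, bucket, and table names are placeholders, and the destination table is assumed to already exist.

```python
# Minimal Apache Beam pipeline sketch: read JSON events from Cloud Storage,
# drop records without a user_id, and append the rest to a BigQuery table.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    # Placeholder project/bucket/table names; use "DataflowRunner" to run on Dataflow.
    options = PipelineOptions(
        runner="DirectRunner",
        project="example-project",
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "KeepValid" >> beam.Filter(lambda e: e.get("user_id") is not None)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```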

Soft Skills:

  • Strong problem-solving and critical-thinking abilities.
  • Excellent communication skills to collaborate with technical and non-technical stakeholders.
  • Proactive attitude towards innovation and learning.
  • Ability to work independently and as part of a collaborative team.

Location:

Mumbai

Brand:

Merkle

Time Type:

Full time

Contract Type:

Permanent

Top Skills

Apache Airflow
BigQuery
Bitbucket
CI/CD
Cloud Composer
Cloud Functions
Cloud Pub/Sub
Cloud Storage
Dataflow
GCP
Git
Jenkins
Python
SQL


