
IPG Mediabrands

Data Engineer

Reposted 24 Days Ago
In-Office
Pune, Mahārāshtra
Mid level

Business Overview 

IPG Mediabrands is the media and marketing solutions division of Interpublic Group (NYSE: IPG). IPG Mediabrands manages over $47 billion in marketing investment globally on behalf of its clients across its full-service agency networks UM, Initiative and Mediahub and through its award-winning specialty business units Healix, Kinesso, MAGNA, Mediabrands Content Studio, Orion Holdings, Rapport, and the IPG Media Lab. IPG Mediabrands clients include many of the world’s most recognizable and iconic brands from a broad portfolio of industry sectors including automotive, personal finance, consumer packaged goods (CPG), pharma, health and wellness, entertainment, financial services, energy, toys and gaming, direct-to-consumer and e-commerce, retail, hospitality, food and beverage, fashion and beauty. The company employs more than 18,000 diverse marketing communication professionals in more than 130 countries. Learn more at www.ipgmediabrands.com.

Position Summary

We are looking for an experienced Data Engineer to manage in-progress and upcoming data infrastructure projects. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who works in Python and enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Responsibilities
  • Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements using Python and SQL / AWS / Snowflake.
  • Identify, design, and implement internal process improvements: automating manual processes with Python, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL / AWS / Snowflake technologies (a minimal illustrative sketch follows this list).
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. 
  • Work with data and analytics experts to strive for greater functionality in our data systems.
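
As a rough illustration of the kind of pipeline work described above, and not part of the role's requirements, the Python sketch below pulls an extract from S3 and loads it into a Snowflake table. The bucket, file, table, warehouse, and credential names are hypothetical placeholders, and it assumes the boto3 and snowflake-connector-python packages are available.

    import os

    import boto3
    import snowflake.connector

    # Hypothetical bucket, file, and table names for illustration only
    BUCKET = "example-data-bucket"
    KEY = "exports/daily_orders.csv"
    LOCAL_PATH = "/tmp/daily_orders.csv"

    # Download the source extract from S3
    s3 = boto3.client("s3")
    s3.download_file(BUCKET, KEY, LOCAL_PATH)

    # Connect to Snowflake (credentials are read from environment variables)
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # PUT uploads the local file to the table's internal stage,
        # then COPY INTO loads it into the DAILY_ORDERS table
        cur.execute(f"PUT file://{LOCAL_PATH} @%DAILY_ORDERS OVERWRITE = TRUE")
        cur.execute(
            "COPY INTO DAILY_ORDERS "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()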

Required Skills & Experience
  • Advanced working knowledge of SQL, experience with relational databases and query authoring, and working familiarity with a variety of database technologies.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. 
  • Strong analytic skills related to working with unstructured datasets. 
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
  • A successful history of manipulating, processing and extracting value from large, disconnected datasets.

Desired Skills & Experience
  • 3+ years of experience in a Python scripting and data-focused role, along with a bachelor's degree.
  • Experience with data processing and cleaning libraries (e.g., pandas, NumPy), with web scraping / web crawling for process automation, and with APIs and how they work (see the sketch after this list).
  • Ability to debug failing code and identify solutions; basic knowledge of SQL Server job activity monitoring and of Snowflake.
  • Experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra. 
  • Experience with most or all of the following cloud services: AWS, Azure, Snowflake, Google Cloud.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
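
Purely as an illustration of the data-cleaning and API work mentioned in the list above, the sketch below fetches records from a hypothetical JSON endpoint with requests and tidies them with pandas; the URL and column names are invented for the example.

    import pandas as pd
    import requests

    # Hypothetical API endpoint and field names for illustration only
    URL = "https://api.example.com/v1/campaign-metrics"

    # Pull raw records from the API and normalize them into a DataFrame
    resp = requests.get(URL, params={"date": "2024-01-01"}, timeout=30)
    resp.raise_for_status()
    df = pd.DataFrame(resp.json()["results"])

    # Typical cleaning steps: coerce types, drop duplicates, handle missing keys
    df["spend"] = pd.to_numeric(df["spend"], errors="coerce")
    df["date"] = pd.to_datetime(df["date"], errors="coerce")
    df = df.drop_duplicates(subset=["campaign_id", "date"])
    df = df.dropna(subset=["campaign_id"])

    print(df.head())
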
We See You  

At IPG Mediabrands, we are unified behind a commitment to fostering a culture of inclusion and belonging. Together, we shine through a set of shared values and behaviors. We take pride in our responsibility to our clients, communities, and to each other. We embrace differences and recognize the unique value that each of us brings to our community.  

We encourage you to apply, as unique backgrounds, perspectives, and lived experiences are welcomed.   

We See You at IPG Mediabrands. 

Top Skills

AWS
Cassandra
Numpy
Pandas
Postgres
Python
Snowflake
SQL
