
Nielsen

Senior Audience & Consumer Insights Specialist I [Sr Data Analyst | Python | Git]

Posted Yesterday
Hybrid
Bangalore, Bengaluru, Karnataka
Senior level
The Senior Audience & Consumer Insights Specialist manages data projects, ensures data integrity, troubleshoots pipeline issues, and collaborates with Product and Engineering teams to optimize data workflows.
Company Description

At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Job Description

Role Overview

In this position, you contribute to the successful implementation of our planning software solutions for advertisers, media owners and media agencies. Our solutions help these companies make decisions about their marketing communication budgets. Specifically, this role works on Nielsen Media Impact (NMI) and Nielsen One Planning (N1P).

As a Technical Data Operations Senior Analyst, you will serve as the technical lead for data integrity and the delivery of new data projects. This role requires a high degree of autonomy and a detective mindset; you are expected to go beyond surface-level monitoring and execution by digging into the data, running independent queries, and diagnosing root causes before escalating to engineering teams. In addition to delivering new data projects, you will troubleshoot the end-to-end lifecycle of data loading, quality assurance, and software releases to ensure our clients receive timely, accurate insights.

Qualifications

Key Responsibilities

  • Take full ownership of new data projects, implementing the addition of new data sources into our software.
  • Partner with Product and Engineering teams to support the implementation of new features while identifying opportunities to automate manual update processes to increase overall operational efficiency.
  • Act as a primary technical consultant to Product and cross-functional partners, demystifying existing pipelines and designing and recommending scalable engineering solutions for commercial use cases in line with our long-term goals.
  • Contribute to the design and implementation of pipeline automations, ensuring high quality and on-time deliveries of data.
  • Manage the comprehensive end-to-end monthly update process, from initial scheduling and monitoring data intake from upstream teams to ensuring final delivery meets all client requirements and deadlines.
  • Supervise automated data processing via Airflow, independently running DAGs and resolving pipeline failures to ensure continuous operation and data deliveries that meet rigorous QA standards.
  • Take full ownership of data quality processes by assessing and designing tests and performing deep-dive investigations into any test or QA process where gaps are identified or anticipated.
  • Independently diagnose technical issues by running manual queries in tools like Athena, replicating API requests and responses in Google Cloud Platform, verifying data paths in configuration files, and validating downstream services to determine whether issues stem from source data, pipeline logic, or respondent-level discrepancies such as weights and fusion data.
  • Monitor bi-weekly production releases by managing the push of release candidates and independently troubleshooting any red test results to ensure total software stability post-update.
  • Serve as one of the primary technical points of contact for downstream teams, assessing criticality and troubleshooting software bugs or crashes before coordinating with expert teams for complex escalations.
  • Assess Severity 1 incidents, ensuring all stakeholders are provided with clear root-cause analysis, impact assessments, and resolution timelines.
  • Align with QA and Project Leaders to design new test parameters for product launches, ensuring rigorous verification between NMI data and check-files.
  • Contribute to the maintenance of accurate project and technical documentation to ensure historical knowledge is preserved and reporting is consistent across the product.
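The independent data validation described above (running your own queries against a fresh load before escalating to engineering) might look like the following minimal sketch. It uses SQLite in place of Athena, and the table and column names (`monthly_load`, `respondent_id`, `weight`) are hypothetical, not Nielsen's schema:

```python
import sqlite3

def validate_load(conn):
    """Run basic QA checks on a loaded table and return the results."""
    cur = conn.cursor()
    # Check 1: respondent IDs appearing more than once in the load
    dupes = cur.execute(
        "SELECT COUNT(*) FROM ("
        "  SELECT respondent_id FROM monthly_load"
        "  GROUP BY respondent_id HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    # Check 2: respondents with a missing weight
    null_weights = cur.execute(
        "SELECT COUNT(*) FROM monthly_load WHERE weight IS NULL"
    ).fetchone()[0]
    return {"duplicate_ids": dupes, "null_weights": null_weights}

# In-memory database standing in for the real query engine:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE monthly_load (respondent_id INTEGER, weight REAL)")
conn.executemany(
    "INSERT INTO monthly_load VALUES (?, ?)",
    [(1, 0.9), (2, 1.1), (2, 1.1), (3, None)],
)
print(validate_load(conn))  # → {'duplicate_ids': 1, 'null_weights': 1}
```

In practice the same queries would run against Athena (or another engine) and feed the formal QA sign-off rather than a print statement.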

Additional Information

Behavioral Skills:

  • Demonstrate a disciplined approach to data validation by proactively investigating anomalies and inconsistencies beyond surface-level reporting to ensure absolute data integrity.
  • Exhibit the technical proficiency required to conduct comprehensive root-cause analysis and formulate strategic solutions independently prior to seeking escalation to senior engineering resources.
  • Assume full responsibility for operational delivery milestones, ensuring that any disruption to data feeds is identified immediately and communicated with appropriate urgency via formal severity protocols.
  • Possess the ability to synthesize complex technical information and operational blockers into concise, actionable status updates for both internal leadership and external client stakeholders.
  • Maintain a steadfast commitment to accuracy when managing market-specific configurations and reviewing metrics to mitigate operational risk and ensure platform stability.
  • Actively monitor the health of the end-to-end data lifecycle, demonstrating the initiative to follow up on upstream dependencies and identify process improvements without external prompting.

Technical Skills

  • Bachelor’s degree in Computer Science, Information Technology, or Data Analytics (B.E. / B.Tech / BCA).
  • AWS Certified Cloud Practitioner, or certifications in Airflow/Git, a plus.
  • 4+ years in a technical role specifically handling data pipelines, ETL monitoring, or Production Support.
  • Proficiency in monitoring and managing data workflows using Apache Airflow.
  • Strong version control skills using Git/GitLab to manage configurations and repositories.
  • Ability to write and execute SQL queries or Python scripts to independently validate data and troubleshoot issues.
  • Experience with automated testing frameworks and the ability to interpret technical logs.
  • Familiarity with software release cycles, deployment environments, and production protocols (CI/CD).
  • Demonstrable experience working with and within a SaaS (Software as a Service) environment.
  • Proficiency with modern collaboration tools, including Jira, Confluence, and Google Suite.
  • A track record of working effectively and collaboratively in geographically distributed teams.

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Top Skills

Airflow
AWS
Confluence
Git
Google Cloud Platform
JIRA
Python
SQL
