
McAfee

Data Curation Engineer - Remote

Posted 4 Days Ago
Remote
Hiring Remotely in India
Mid level
The Data Curation Engineer will develop high-quality curated data products, collaborate with data stewards, ensure compliance with governance, and handle data documentation. Responsibilities include implementing data curation specifications, utilizing SQL/SparkSQL, and supporting Agile methodologies.

Role Overview:

Job Description Summary
We are seeking a seasoned data engineer to join our Enterprise Analytics team as a Data Curation Engineer, reporting to the Capability Lead – BI & Data Curation. As a key member of McAfee’s enterprise data curation strategy, you will be responsible for developing high-quality, certified, and curated data on the Databricks platform, employing complex analytical modeling.
This is a Remote position located in India. We are only considering candidates located in India and are not providing relocation at this time.

About the role:

  • Develop high-quality, curated data products, including tables, schemas, and aggregates, via SQL/SparkSQL (see the sketch after this list)
  • Understand data product requirements and liaise with Technical Data Stewards and business end-users to ensure alignment with business needs
  • Ensure data products meet curation specifications, guidelines, governance standards, and business requirements, in line with the Lakehouse medallion architecture
  • Work with data catalogs such as Unity Catalog and Collibra
  • Follow Agile sprint methodology
  • Implement change requests to production data curations
  • Ensure effective documentation and cataloging of data products
  • Ensure clear handoffs of data products to technical stewards
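
To make the curation work concrete, here is a minimal sketch, assuming a hypothetical Databricks workspace where a silver-layer table is already registered in the catalog, of how a curated gold-layer aggregate might be built and documented with SparkSQL. All catalog, table, and column names are illustrative assumptions, not McAfee's actual schemas.

# Minimal sketch: build a curated gold-layer aggregate from a silver-layer table.
# Every catalog, table, and column name here is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("curate_daily_detection_summary").getOrCreate()

# Create (or refresh) the curated gold table from cleansed silver data.
spark.sql("""
    CREATE OR REPLACE TABLE gold.daily_detection_summary AS
    SELECT
        event_date,
        product_line,
        COUNT(*)                  AS detection_count,
        COUNT(DISTINCT device_id) AS affected_devices
    FROM silver.detection_events
    WHERE event_date >= date_sub(current_date(), 30)
    GROUP BY event_date, product_line
""")

# Record a description so the product is discoverable in the data catalog.
spark.sql("""
    ALTER TABLE gold.daily_detection_summary
    SET TBLPROPERTIES ('comment' = 'Curated 30-day rollup of detections by product line (gold layer)')
""")

In a medallion layout, the bronze layer holds raw ingested data, silver holds cleansed and conformed tables, and gold holds certified aggregates like the one above that business users query directly.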

About you: 

  • We prefer a Bachelor’s degree in computer science, engineering, statistics, or a related field
  • Bring 3+ years of experience in data engineering, data modeling, and ETL/ELT
  • Have hands-on experience with and understanding of the following:
    • SQL/SparkSQL
    • Python or another programming language
    • Databricks or another cloud data management platform
    • Data aggregation, relational database modeling, and common data structures
    • Design and implementation of logical and physical data models
    • ETL/ELT tools such as Apache Airflow (a small orchestration sketch follows this list)
    • Data quality guidelines
    • Data governance and data curation models
    • Master data management
    • Data profiling
  • Bring proven experience in development within a Lakehouse data platform
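
As a hedged illustration of the orchestration skills above, the sketch below shows a small Apache Airflow DAG (2.4+ syntax) that refreshes a curated table and then runs a simple data quality gate. The DAG id, task callables, and the table it refers to are assumptions for illustration only, not an actual McAfee pipeline; in practice the refresh step would typically submit a Databricks job rather than print a message.

# Minimal Airflow sketch: refresh a curated table, then gate on data quality.
# DAG id, schedule, and the placeholder logic below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_gold_table():
    # Placeholder: in a real pipeline this would trigger the SparkSQL job
    # that rebuilds gold.daily_detection_summary (e.g. via a Databricks job run).
    print("refreshing gold.daily_detection_summary")


def check_row_count():
    # Placeholder quality gate: fail the run if the curated table came back empty.
    # A real check would query the table instead of using a stand-in value.
    row_count = 1  # stand-in; assume the refresh produced rows
    if row_count == 0:
        raise ValueError("curated table is empty; failing the DAG run")


with DAG(
    dag_id="curate_daily_detection_summary",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    refresh = PythonOperator(task_id="refresh_gold_table", python_callable=refresh_gold_table)
    quality_gate = PythonOperator(task_id="check_row_count", python_callable=check_row_count)

    # Run the quality gate only after the refresh succeeds.
    refresh >> quality_gate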

#LI-Remote


Company Overview

McAfee is a leader in personal security for consumers. Focused on protecting people, not just devices, McAfee consumer solutions adapt to users’ needs in an always-online world, empowering them to live securely through integrated, intuitive solutions that protect their families and communities with the right security at the right moment.

Company Benefits and Perks:

We work hard to embrace diversity and inclusion and encourage everyone at McAfee to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.

  • Bonus Program
  • Pension and Retirement Plans
  • Medical, Dental and Vision Coverage
  • Paid Time Off
  • Paid Parental Leave
  • Support for Community Involvement

We're serious about our commitment to diversity, which is why McAfee prohibits discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.

Top Skills

SQL, SparkSQL, Python
