Develop and maintain scalable data pipelines, perform data analysis, collaborate with teams to support data-driven decisions, and mentor junior engineers.
About the Company:
Netomi is the leading agentic AI platform for enterprise customer experience. We work with some of the largest global brands, including Delta Airlines, MetLife, MGM, and United, to enable agentic automation at scale across the entire customer journey. Our no-code platform delivers the fastest time to market, the lowest total cost of ownership, and simple, scalable management of AI agents for any CX use case. Backed by WndrCo, Y Combinator, and Index Ventures, we help enterprises drive efficiency, lower costs, and deliver higher-quality customer experiences.
Want to be part of the AI revolution and transform how the world’s largest global brands do business? Join us!
About the Role:
We are looking for a Senior Data Engineer with a passion for using data to discover and solve real-world problems. You will enjoy working with rich data sets and modern business intelligence technology, and seeing your insights drive features for our customers. You will also have the opportunity to contribute to the development of policies, processes, and tools that address product quality challenges, in collaboration with teams across the company.
Responsibilities
- Architect and implement scalable, secure, and reliable data pipelines using modern data platforms such as Spark, Databricks, Airflow, and Snowflake (a minimal illustrative sketch follows this list).
- Develop ETL/ELT processes to ingest data from various structured and unstructured sources.
- Perform Exploratory Data Analysis (EDA) to uncover trends, validate data integrity, and derive insights that inform data product development and business decisions.
- Collaborate closely with data scientists, analysts, and software engineers to design data models that support high-quality analytics and real-time insights.
- Lead data infrastructure projects, including data management on cloud platforms (AWS/Azure), data lake/warehouse implementations, and data quality frameworks.
- Ensure data governance, security, and compliance best practices are followed.
- Monitor and optimize the performance of data systems, addressing any issues proactively.
- Mentor junior data engineers and contribute to establishing best practices in data engineering standards, tooling, and development workflows.
- Stay current with emerging technologies and trends in data engineering and recommend improvements as needed.
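To make the pipeline work above concrete, here is a minimal sketch of an orchestrated daily ETL job written against Apache Airflow (2.4+), one of the tools named in this posting. Everything in it is a hypothetical illustration: the DAG id, task ids, and the empty extract/transform/load callables are assumptions for this sketch, not Netomi's actual code or stack.

```python
# Minimal, hypothetical daily ETL DAG (assumes Apache Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull raw records from a source system."""


def transform():
    """Placeholder: clean and reshape the extracted records."""


def load():
    """Placeholder: write the transformed records to the warehouse."""


with DAG(
    dag_id="daily_customer_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # run once per day
    catchup=False,                # skip backfilling past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```

In a real pipeline, the placeholder callables would be replaced with connectors to the structured and unstructured sources and the warehouse platforms listed in this posting.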
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 8+ years of hands-on experience in data engineering or backend software development roles.
- Proficiency with Python, SQL, and at least one data pipeline orchestration tool (e.g., Apache Airflow, Luigi, Prefect).
- Strong experience with cloud-based data platforms (e.g., AWS Redshift, GCP BigQuery, Snowflake, Databricks).
- Deep understanding of data modeling, data warehousing, and distributed systems.
- Experience with big data technologies such as Apache Spark, Kafka, and Hadoop.
- Familiarity with DevOps practices (CI/CD, infrastructure as code, containerization with Docker/Kubernetes).
Preferred Qualifications
- Experience working with real-time data processing and streaming data architectures.
- Knowledge of data security and privacy regulations (e.g., GDPR, HIPAA).
- Exposure to machine learning pipelines or supporting data science workflows.
- Familiarity with prompt engineering and how LLM-based systems interact with data.
- Experience working in cross-functional teams and with stakeholders from non-technical domains.
Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, or other protected characteristics.
Top Skills
Apache Airflow
AWS Redshift
Databricks
Docker
GCP BigQuery
Hadoop
Kafka
Kubernetes
Python
Snowflake
Spark
SQL