
Cognyte

Data Engineer

Posted 2 Days Ago
Remote or Hybrid
Hiring Remotely in Pune, Maharashtra, IND
Senior level
As a Data Engineer, you will design, build, and maintain data infrastructure and pipelines, support machine learning workflows, and optimize database systems.
Description

Today’s world is crime-riddled. Criminals are everywhere, invisible, virtual and sophisticated. Traditional ways to prevent and investigate crime and terror are no longer enough… 

Technology is changing incredibly fast. The criminals know it, and they are taking advantage. We know it too.  

For nearly 30 years, the incredible minds at Cognyte around the world have worked closely together and put their expertise to work, to keep up with constantly evolving technological and criminal trends, and help make the world a safer place with leading investigative analytics software solutions. 

We are defined by our dedication to doing good and this translates to business success, meaningful work friendships, a can-do attitude, and deep curiosity.

If you want to be part of a powerful team, we are looking for a talented Data Engineer to join our research group at Cognyte. As a Data Engineer, you will be responsible for designing, building, and maintaining data infrastructure and pipelines to support the development and deployment of advanced machine learning models. This is an excellent opportunity for professionals who are passionate about data engineering and want to contribute to cutting-edge AI technologies.

As a Cognyter you will:

  • Collaborate with senior data engineers, data scientists and research scientists to design, develop, and maintain data pipelines and infrastructure to support machine learning workflows.
  • Implement data storage solutions and optimize database systems for efficient data retrieval and storage.
  • Work closely within the research team to integrate machine learning models into data processing pipelines, ensuring data quality and consistency.
  • Contribute to the development and maintenance of data documentation, including data dictionaries and metadata.
  • Support data-related needs of the research team, such as data analysis, troubleshooting, and performance optimization.
  • Stay up to date with the latest data engineering technologies and practices, and propose innovative solutions to improve data processes and workflows.
Requirements

Your toolbox

  • BSc or MSc degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.
  • 5+ years in infrastructure / data engineering / platform engineering roles (or equivalent experience).
  • Strong hands-on experience with Linux (administration, troubleshooting, performance analysis).
  • Strong coding skills in Python and SQL (automation, tooling, pipelines).
  • Experience with big data and distributed processing technologies such as Spark, the Hadoop ecosystem, and Kafka.
  • Experience in designing and maintaining data storage solutions: relational and NoSQL systems, data modeling, schemas, metadata, ETL processes, and performance considerations.
  • Strong collaboration skills with demonstrated ability to work effectively across teams.
  • Strong problem-solving skills and the ability to handle and analyze data efficiently.
  • Excellent communication skills.
  • Passion for data engineering and a desire to develop expertise in machine learning and AI technologies.
  • Willingness to travel abroad (10%).

Top Skills

Hadoop
Kafka
Linux
Python
Spark
SQL


