
Kroll

Lead, Data Engineer

Posted 5 Hours Ago
Remote or Hybrid
Hiring Remotely in India
Senior level
Design and build data infrastructure and architecture, optimize data processes, implement ELT applications, and develop high-quality data pipelines for clients.

Our professionals balance analytical skills, deep market insight and independence to deliver solid, defensible analysis and practical advice to our clients. As an organization, we think globally. We create transparency in an opaque world, and we encourage our people to do the same. That means when you take your place on our team, you’ll discover a supportive and collaborative work environment that empowers you to excel. If you’re ready to share your perspective with the world, then you can make a real impact here. This is the Kroll difference. 

Kroll is building a strong Data practice spanning artificial intelligence, machine learning and analytics, and we're looking for you to join our growing team. You will be involved in designing, building and integrating data from various sources, working with an advanced engineering team and professionals from the world's largest financial institutions, law enforcement, and government agencies.

At Kroll, your work will help protect, restore and maximize value for our clients. Join us and together we’ll maximize the value of your career.

RESPONSIBILITIES:

  • Design and build organizational data infrastructure and architecture
  • Identify, design and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual data-delivery processes
  • Choose the best tools, services and resources to build robust data pipelines for data ingestion, connection, transformation and distribution
  • Design, develop and manage ELT applications
  • Work with global teams to deliver fault-tolerant, high-quality data pipelines
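As a concrete illustration of the ELT pattern these responsibilities describe, the sketch below lands raw records first and transforms them afterwards (load before transform, which is what distinguishes ELT from ETL). This is a minimal, self-contained example: the in-memory list stands in for a raw landing table, and all names (`RAW_CSV`, `extract`, `load`, `transform`) are hypothetical, not part of Kroll's actual stack:

```python
import csv
import io

# Hypothetical raw feed; in practice this would come from cloud storage
# or an ingestion service, not a string literal.
RAW_CSV = """client_id,amount,currency
C001,1250.50,USD
C002,,EUR
C003,87.25,usd
"""

def extract(raw: str) -> list[dict]:
    """E: pull source records as-is (schema-on-read)."""
    return list(csv.DictReader(io.StringIO(raw)))

def load(records: list[dict], table: list[dict]) -> None:
    """L: land untransformed records in a 'raw' table first."""
    table.extend(records)

def transform(table: list[dict]) -> list[dict]:
    """T: clean downstream of the raw table: drop incomplete rows,
    cast amounts to float, normalize currency codes."""
    return [
        {**row, "amount": float(row["amount"]), "currency": row["currency"].upper()}
        for row in table
        if row["amount"]  # skip rows with a missing amount
    ]

raw_table: list[dict] = []
load(extract(RAW_CSV), raw_table)
clean = transform(raw_table)
```

The raw table deliberately keeps all three rows, including the incomplete one, so the transform logic can be re-run or revised without re-extracting from the source.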

REQUIREMENTS:

  • Advanced experience writing ETL/ELT jobs
  • Advanced experience with Azure, AWS and the Databricks platform (primarily data-related services)
  • Advanced experience with Python, the Spark ecosystem (PySpark + Spark SQL) and SQL databases
  • Ability to develop REST APIs, Python SDKs or libraries, Spark jobs, etc.
  • Proficiency with open-source tools, frameworks and Python libraries such as FastAPI, Pydantic, Polars, Pandas, PySpark, Delta Lake tables, Docker and Kubernetes
  • Experience with Lakehouse and Medallion architecture, data governance, and data pipeline orchestration
  • Excellent communication skills
  • Ability to conduct data profiling, cataloging and mapping for technical data flows
  • Ability to work with an international team
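The data-profiling requirement above can be sketched with a small stdlib-only example: for a given column, count nulls, count distinct values and find the most frequent value. In practice this would run over a warehouse or Lakehouse table; the `profile` function and the sample rows here are hypothetical illustrations:

```python
from collections import Counter

# Hypothetical extract of a client table to be profiled before it is
# mapped into a downstream pipeline.
rows = [
    {"country": "IN", "status": "active"},
    {"country": "IN", "status": "closed"},
    {"country": "US", "status": "active"},
    {"country": None, "status": "active"},
]

def profile(rows: list[dict], column: str) -> dict:
    """Basic column profile: null count, distinct count, top value."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    counts = Counter(non_null)
    return {
        "nulls": len(values) - len(non_null),
        "distinct": len(counts),
        "top": counts.most_common(1)[0][0] if counts else None,
    }

country_profile = profile(rows, "country")
```

Profiles like this feed directly into cataloging and source-to-target mapping: they flag which columns need null handling or normalization before the pipeline is wired up.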

DESIRED SKILLS:

  • Strong grasp of cloud architecture principles: compute, storage, networking, security, cost optimization, etc.
  • Advanced SQL and Spark query/data-pipeline performance tuning skills
  • Experience building a Lakehouse using technologies such as Azure Databricks, Azure Data Lake, SQL and PySpark
  • Programming paradigms such as OOP, async programming and batch processing
  • Knowledge of CI/CD and Git
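The async and batch-processing paradigms listed above combine naturally: process records in fixed-size batches, running the calls inside each batch concurrently. A minimal `asyncio` sketch, where `process_record` is a hypothetical stand-in for an I/O-bound call such as an API request or a database write:

```python
import asyncio

async def process_record(record: int) -> int:
    # Placeholder for real I/O-bound work (API call, database write).
    await asyncio.sleep(0)  # yield control to the event loop
    return record * 2

async def process_batch(records: list[int], batch_size: int = 3) -> list[int]:
    """Process records in fixed-size batches; records within a batch
    run concurrently via asyncio.gather, batches run sequentially."""
    results: list[int] = []
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        results.extend(await asyncio.gather(*(process_record(r) for r in batch)))
    return results

doubled = asyncio.run(process_batch(list(range(7))))
```

Bounding concurrency per batch keeps memory and connection counts predictable, which matters when the downstream system throttles clients.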


Top Skills

AWS
Azure
Databricks
Docker
FastAPI
Kubernetes
Pandas
Polars
Pydantic
PySpark
Python
Spark
Spark SQL
SQL


