Design and maintain scalable data solutions in Azure, optimize data pipelines and ensure data integrity, while mentoring junior engineers.
Experience Required: 8+ years
Mode of work: Remote
Skills Required: Azure Databricks, Azure Event Hubs, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark
Notice Period: Immediate joiners (permanent or contract role; must be able to join by September 15, 2025)
- Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
- Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights.
- Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks.
- Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
- Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
- Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
- Provide technical guidance and expertise to junior data engineers and developers.
- Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering.
- Contribute to the continuous improvement of data engineering processes, tools, and best practices.
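As a purely illustrative sketch of the kind of pipeline work described above (not part of the role description), the following minimal PySpark Structured Streaming job reads events from an Azure Event Hubs Kafka-compatible endpoint and appends them to a Delta table on Databricks; the namespace, event hub name, schema, table name, and checkpoint path are placeholder assumptions.

# Illustrative only: ingest JSON events from Event Hubs (Kafka endpoint) into a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("eventhub-ingest").getOrCreate()

# Assumed schema of the incoming JSON events.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_time", TimestampType()),
])

# Event Hubs exposes a Kafka-compatible endpoint on port 9093 (SASL_SSL / PLAIN).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "<event-hub-name>")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config",
            'org.apache.kafka.common.security.plain.PlainLoginModule required '
            'username="$ConnectionString" password="<event-hubs-connection-string>";')
    .load()
)

# Parse the Kafka value bytes as JSON and keep only the typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

# Append to a Delta table, with checkpointing so the stream can recover after failures.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/eventhub_ingest")
    .outputMode("append")
    .toTable("bronze.events")
)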
Requirements
- Bachelor’s or Master’s degree in computer science, engineering, or a related field.
- 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solutions.
- Strong knowledge and experience with the Azure cloud platform, Databricks, Event Hubs, architecture, Spark, Kafka, ETL pipelines, Python/PySpark, SQL, and Copilot Studio.
- Strong experience with cloud platforms such as Azure.
- Experience with big data systems, including Apache Spark and Kafka.
- Experience contributing to the architecture and design of large-scale distributed systems.
- Expertise in Databricks Lakehouse Platform, its architecture, and its capabilities.
- Experience building production pipelines using Databricks and Azure services.
- Experience with multiple programming languages, such as Python and SQL.
Top Skills
Azure Data Factory
Azure Databricks
Event Hubs
Kafka
PySpark
Python
Spark
SQL