
Fractal

Data Engineer/Senior Data Engineer- Azure DigiOps

Reposted 5 Days Ago
In-Office
Pune, Mahārāshtra, IND
Mid level
Responsible for L2 technical support for data platforms, advanced troubleshooting, code-level fixes, and ensuring SLA compliance. Requires strong skills in Azure Data Factory, Databricks, SQL, and Python for performance tuning and data reliability.

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

Job Description: 

We need someone with a strong data engineering skill set to ensure production (operations/support) activities are delivered per SLA. The role involves working on issues/requests, bug fixes, and minor changes, coordinating with the development team when issues arise, and working on enhancements.

 

Role Details 
You will be part of the operations team providing L2 support to a client, working either specified business hours or in a 24x7 support model.

Provide Level-2 (L2) technical support for data platforms and pipelines built on Azure Data Factory (ADF), Databricks, SQL, and Python. The role involves advanced troubleshooting, root cause analysis, code-level fixes, performance tuning, and collaboration with engineering teams to ensure data reliability and SLA compliance, while adhering to ITIL processes for Incident, Problem, and Change management.

 
 

Key Responsibilities 

Advanced Troubleshooting & RCA 

  • Investigate complex failures in ADF pipelines, Databricks jobs, and SQL processes beyond L1 scope. 

  • Perform root cause analysis for recurring issues, document findings, and propose permanent fixes. 

  • Debug Python scripts, SQL queries, and Databricks notebooks to resolve data ingestion and transformation errors. 

  • Analyze logs, metrics, and telemetry using Azure Monitor, Log Analytics, and Databricks cluster logs. 
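The log-triage step in root cause analysis often starts by grouping failures into recurring signatures. A minimal Python sketch of that idea; the log format, activity names, and error strings below are hypothetical, not taken from any real pipeline or from Fractal's tooling:

```python
from collections import Counter
import re

def top_error_signatures(log_lines, n=3):
    """Group ERROR log lines by a normalized signature.

    GUID-like tokens and numbers are masked so recurring errors that
    differ only by run ID or timestamp collapse into one signature.
    """
    signatures = []
    for line in log_lines:
        if "ERROR" not in line:
            continue
        sig = re.sub(r"[0-9a-fA-F-]{8,}", "<id>", line)  # mask GUID-like run IDs
        sig = re.sub(r"\d+", "<n>", sig)                 # mask counts/timestamps
        signatures.append(sig)
    return Counter(signatures).most_common(n)

# Illustrative log lines only
logs = [
    "ERROR copy_activity run 12345 failed: timeout after 600s",
    "ERROR copy_activity run 67890 failed: timeout after 600s",
    "ERROR notebook_job run 24680 failed: schema mismatch",
]
for sig, count in top_error_signatures(logs):
    print(count, sig)
```

In practice the input would come from Azure Monitor / Log Analytics exports rather than an in-memory list, but the grouping idea is the same.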

Code-Level Fixes & Enhancements 

  • Apply hotfixes for broken pipelines, scripts, or queries in non-production and coordinate controlled deployment to production. 

  • Optimize ADF activities, Databricks jobs, and SQL queries for performance and cost efficiency. 

  • Implement data quality checks, schema validation, and error handling improvements. 
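A data quality gate of the kind described above can be as simple as a row-level schema check before load. A minimal sketch, assuming rows arrive as Python dicts; the `{column: type}` schema shape is an illustrative assumption, not any specific framework's API:

```python
def validate_rows(rows, schema):
    """Check each row dict against a {column: type} schema.

    Returns (valid_rows, errors) so bad rows can be quarantined
    instead of failing the whole load. Hypothetical shape only.
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in schema if c not in row]
        bad_type = [c for c, t in schema.items()
                    if c in row and not isinstance(row[c], t)]
        if missing or bad_type:
            errors.append({"row": i, "missing": missing, "bad_type": bad_type})
        else:
            valid.append(row)
    return valid, errors

schema = {"id": int, "amount": float}
rows = [
    {"id": 1, "amount": 9.99},   # valid
    {"id": "2", "amount": 5.0},  # wrong type for id
    {"amount": 1.0},             # missing id
]
valid, errors = validate_rows(rows, schema)
```

In Databricks the equivalent check would typically run in PySpark over a DataFrame, but the quarantine-vs-fail design choice is the same.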

Incident & Problem Management 

  • Handle escalated incidents from L1; ensure resolution within SLA. 

  • Create and maintain Known Error Database (KEDB) and contribute to Problem Records. 

  • Participate in Major Incident calls, provide technical insights, and lead recovery efforts when required. 
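Matching an escalated incident against a Known Error Database can be sketched as a keyword lookup. The KEDB record shape, IDs, and workarounds below are hypothetical examples, not a real ITSM integration:

```python
def match_known_errors(description, kedb):
    """Return KEDB entries whose keywords all appear in the
    incident description (case-insensitive)."""
    desc = description.lower()
    return [e for e in kedb if all(k in desc for k in e["keywords"])]

# Illustrative KEDB entries only
kedb = [
    {"id": "KE-001", "keywords": ["timeout", "copy"],
     "workaround": "rerun with a larger integration runtime"},
    {"id": "KE-002", "keywords": ["schema"],
     "workaround": "refresh the dataset schema and rerun"},
]

matches = match_known_errors("Copy activity timeout on pipeline X", kedb)
```

A real KEDB would live in ServiceNow or similar; the point of the sketch is that known errors become machine-matchable the moment they carry structured keywords.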

Monitoring & Automation 

  • Enhance monitoring dashboards, alerts, and auto-recovery scripts for proactive issue detection. 

  • Develop Python utilities or Databricks notebooks for automated validation and troubleshooting. 

  • Suggest improvements in observability and alert thresholds. 
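Alert-threshold tuning of the kind suggested above can be sketched as a simple statistical check on recent run durations: alert only when the latest run deviates well beyond the recent baseline. The 3-sigma cutoff is an illustrative assumption, not a recommended production default:

```python
import statistics

def needs_alert(durations, latest, sigma=3.0):
    """Flag a run whose duration deviates more than `sigma` standard
    deviations from the recent baseline. Illustrative logic only."""
    mean = statistics.mean(durations)
    stdev = statistics.stdev(durations)
    return abs(latest - mean) > sigma * stdev
```

Compared with a fixed "alert if over N minutes" rule, a baseline-relative threshold adapts as pipelines grow, which tends to reduce false alarms.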

Governance & Compliance 

  • Ensure all changes follow ITIL Change Management process and are properly documented. 

  • Maintain secure coding practices, manage secrets via Key Vault, and comply with data privacy regulations. 

 

 

Technical skills 
 

  • Azure Data Factory (ADF): Deep understanding of pipeline orchestration, linked services, triggers, and custom activities. 

  • Databricks: Proficient in Spark, cluster management, job optimization, and notebook debugging. 

  • SQL: Advanced query tuning, stored procedures, schema evolution, and troubleshooting. 

  • Python: Strong scripting skills for data processing, error handling, and automation. 

  • Azure Services: ADLS, Key Vault, Synapse, Log Analytics, Monitor. 

  • Familiarity with CI/CD pipelines (Azure DevOps/GitHub Actions) for data workflows. 

Non-technical skills 

  • Strong knowledge of ITIL (Incident, Problem, Change). 

  • Ability to lead technical bridges, communicate RCA, and propose permanent fixes. 

  • Excellent documentation and stakeholder communication skills. 

  • Drive incident/problem resolution by assisting the operations team with key operational activities: delivery, fixes, and supportability. 

  • Experience working in ServiceNow is preferred. 

  • Attention to detail is a must, with a focus on quality and accuracy. 

  • Able to handle multiple tasks with appropriate priority and strong time management skills. 

  • Flexible about work content and enthusiastic to learn. 


  • Strong relationship skills to work with multiple stakeholders across organizational and business boundaries at all levels. 

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?  Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!

Top Skills

Azure Data Factory
Azure Services
CI/CD Pipelines
Databricks
Python
SQL


