Design and develop cloud-native ETL data pipelines on Azure and Databricks, optimizing for performance and data quality while collaborating with teams to meet complex data requirements.
Requisition Number: 2355732
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Contribute hands-on to the design, ETL development, and operation of cloud-native data pipelines on Azure and Databricks
- Design and optimize Spark-based ETL frameworks for large-scale batch and incremental processing
- Implement complex transformations and performance tuning using Python and Spark SQL
- Modernize on-prem ETL workloads to Databricks/Snowflake, ensuring enterprise standards for scalability, reliability, and governance
- Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions
- Ensure data quality and integrity by implementing robust data validation and monitoring processes
- Develop comprehensive documentation for data engineering processes and systems
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
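As a hedged illustration of the data-validation responsibility listed above, the sketch below shows the kind of row-level quality checks an ETL pipeline might apply before loading. All field names and rules here are hypothetical examples for illustration, not requirements taken from this posting:

```python
# Minimal sketch of row-level data-quality checks of the kind an ETL
# pipeline might run before loading records into a warehouse.
# Field names ("member_id", "claim_amount") and rules are hypothetical.

def validate_row(row):
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    if not row.get("member_id"):
        errors.append("missing member_id")
    if row.get("claim_amount", 0) < 0:
        errors.append("negative claim_amount")
    return errors

def validate_batch(rows):
    """Split a batch into clean rows and (row, errors) rejects for monitoring."""
    clean, rejects = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejects.append((row, errors))
        else:
            clean.append(row)
    return clean, rejects
```

In practice the same rule functions could be applied at scale as Spark UDFs or expressed as Spark SQL filter expressions, with reject counts fed into a monitoring dashboard.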
Required Qualifications:
- Bachelor's degree (B.E./B.Tech) in any engineering or related discipline
- Hands-on experience with Databricks and cloud-native data platforms, preferably on Azure
- Solid experience building ETL pipelines using Python, PySpark, and Spark SQL
- Experience with ETL tools (e.g., DataStage) and large-scale data warehouse implementations
- Experience migrating or modernizing on-prem ETL workloads to cloud architectures
- Solid understanding of data warehousing concepts, including dimensional modeling, data quality, reconciliation, and performance tuning
- Familiarity with cloud ecosystems (Azure/AWS/GCP) and workflow orchestration tools such as Airflow
- Proficient in SQL and working with relational and distributed database systems
- Proven solid analytical and problem-solving skills with the ability to deliver scalable, reliable data solutions
Preferred Qualifications:
- Demonstrated ability to collaborate effectively across teams; leadership or mentoring experience
Top Skills
Airflow
Azure
Databricks
DataStage
ETL
PySpark
Python
Spark SQL
SQL
Optum Pune, Maharashtra, IND Office
Pune, Maharashtra, India

