Design and build automated data pipelines, process data using Databricks, optimize performance, and ensure data governance in a data engineering role.
Requisition Number: 2350301
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health that are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Design and build automated data pipelines to move data from various source systems, using tools like Azure Data Factory (ADF) for orchestration and Databricks with PySpark for large-scale data processing and transformation
- Use Databricks (leveraging Apache Spark) to cleanse, transform, and enrich large, complex, and unstructured datasets
- Load finalized, structured data into Snowflake for high-performance analytics, business intelligence (BI), and reporting
- Performance Optimization: Continuously monitor and tune system performance, optimize Spark jobs and SQL queries in Snowflake, and manage compute resources efficiently to balance performance and cost
- Data Modeling and Architecture: Design logical and physical data models (e.g., star/snowflake schemas) to ensure data is structured for efficient querying and analysis
- Data Governance and Security: Implement data security measures (encryption, role-based access control), ensure data quality through validation rules, and track data lineage
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
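The data-quality piece of the governance responsibility above can be sketched in plain Python; the column names and rules here are hypothetical stand-ins for whatever checks a real pipeline would enforce:

```python
# Hypothetical per-column validation rules for an incoming dataset.
# Each rule returns True when the value passes.
RULES = {
    "member_id": lambda v: v is not None and str(v).strip() != "",
    "claim_amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows, rules=RULES):
    """Split rows into (valid, rejected); rejected rows carry their failed columns."""
    valid, rejected = [], []
    for row in rows:
        failures = [col for col, check in rules.items() if not check(row.get(col))]
        if failures:
            rejected.append((row, failures))
        else:
            valid.append(row)
    return valid, rejected

good, bad = validate([
    {"member_id": "M1", "claim_amount": 120.5},
    {"member_id": "", "claim_amount": -3},
])
```

In practice the same rule shape maps naturally onto PySpark filters or Delta Live Tables expectations, with rejected rows routed to a quarantine table for lineage tracking.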
Required Qualifications:
- Undergraduate degree or equivalent experience
- Working knowledge of Power BI
- Working knowledge of Azure/AWS containerized services and solutions
- Working knowledge of building automated quality solutions
- Working knowledge of developing and integrating Gen AI and Agentic AI data solutions
- Implementation knowledge of AI capabilities in Databricks / Snowflake
- Proficiency in the following services:
- Azure Services: Azure Data Factory (ADF), Azure Data Lake Storage (ADLS)
- Databricks: Apache Spark (PySpark), Delta Lake, Unity Catalog, Delta Live Tables, MLflow, and Databricks Workflows/Jobs for orchestration
- Snowflake: Snowflake SQL, Snowpipe, Streams, Tasks, data modeling (star/snowflake schemas), and performance optimization features (clustering keys)
- Programming Languages: Solid proficiency in Scala/PySpark and SQL
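The star/snowflake-schema modeling called out in the qualifications can be illustrated with a toy fact/dimension pair; the table, column, and attribute names are invented for the example:

```python
# Toy star schema: one dimension table keyed by a surrogate key (hypothetical names).
dim_member = {
    101: {"member_id": 101, "plan": "HMO", "state": "MN"},
    102: {"member_id": 102, "plan": "PPO", "state": "TX"},
}

# Fact rows reference the dimension via a foreign key.
fact_claims = [
    {"claim_id": "C1", "member_key": 101, "amount": 250.0},
    {"claim_id": "C2", "member_key": 102, "amount": 80.0},
    {"claim_id": "C3", "member_key": 101, "amount": 40.0},
]

def total_by(attr):
    """Aggregate fact amounts by a dimension attribute (the typical star-schema query shape)."""
    totals = {}
    for row in fact_claims:
        key = dim_member[row["member_key"]][attr]
        totals[key] = totals.get(key, 0.0) + row["amount"]
    return totals
```

In Snowflake SQL the same query shape is a fact-to-dimension join with a GROUP BY on the dimension attribute; clustering keys on the fact table's join/filter columns are one of the performance features mentioned above.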
Top Skills
AI Capabilities
Azure Data Factory
Azure Services
Databricks
Delta Lake
MLflow
Power BI
PySpark
Scala
Snowflake
Snowpipe
SQL
Streams
Tasks
Unity Catalog

