Lead design and implementation of scalable ETL/ELT pipelines and data warehousing on cloud platforms. Optimize performance, enforce data quality and governance, enable analytics/ML, mentor engineers, automate deployments, and troubleshoot complex data systems.
Requisition Number: 2345462
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Design and Build Data Infrastructure: Develop and maintain scalable data pipelines (ETL/ELT) for large-scale data processing
- Data Architecture: Lead architectural decisions for data warehousing and big data solutions
- Optimize Performance: Improve database performance, query efficiency, and data workflows
- Data Quality and Governance: Implement frameworks for data validation, monitoring, and compliance
- Collaboration: Work with data scientists, analysts, and business teams to enable analytics and machine learning
- Mentorship: Guide junior engineers and promote best practices in data engineering
- Automation and Deployment: Create automated testing and deployment processes for data systems
- Troubleshooting: Resolve complex data issues and ensure system reliability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
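The pipeline-building and data-quality responsibilities above can be sketched as a minimal ETL flow. This is a toy illustration in plain Python, not the team's actual stack: a production pipeline here would run on Databricks/ADF with Spark, and the record fields (`member_id`, `claim_amount`) are hypothetical.

```python
# Minimal ETL sketch: extract raw records, apply a data-quality gate in
# the transform step, and load the survivors. All field names and record
# shapes are hypothetical.

def extract():
    # Stand-in for reading from a source system (API, files, a database).
    return [
        {"member_id": "M1", "claim_amount": "120.50"},
        {"member_id": "M2", "claim_amount": "abc"},    # bad numeric value
        {"member_id": None, "claim_amount": "75.00"},  # missing key
    ]

def transform(rows):
    # Data-quality gate: cast types and reject rows that fail validation,
    # keeping the rejects (with a reason) for monitoring.
    clean, rejected = [], []
    for row in rows:
        try:
            if not row["member_id"]:
                raise ValueError("missing member_id")
            clean.append({"member_id": row["member_id"],
                          "claim_amount": float(row["claim_amount"])})
        except (ValueError, KeyError) as exc:
            rejected.append((row, str(exc)))
    return clean, rejected

def load(rows, warehouse):
    # Idempotent upsert keyed on member_id, so reruns do not duplicate.
    for row in rows:
        warehouse[row["member_id"]] = row
    return warehouse

warehouse = {}
clean, rejected = transform(extract())
load(clean, warehouse)
print(len(warehouse), len(rejected))  # 1 2
```

Keeping the rejects alongside the clean rows is the seed of the validation-and-monitoring framework the quality bullet describes: rejected records can be routed to a quarantine table and alerted on rather than silently dropped.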
Required Qualifications:
- Undergraduate degree or equivalent experience
- 6+ years of experience
- Programming: Solid experience with Azure Databricks, Python/Scala, Azure Data Factory (ADF), Airflow, and SQL
- ETL Tools & Big Data: Hands-on experience with Spark, Hadoop, or similar frameworks
- Cloud Platforms: Experience with Azure (or GCP)
- Containerization & Orchestration: Docker, Kubernetes
- Data Modeling & Warehousing: Expertise in relational and NoSQL databases
- Distributed Systems: Knowledge of data partitioning and parallel processing
- Data Security & Compliance: Familiarity with GDPR, HIPAA, or similar standards
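The distributed-systems requirement above (data partitioning and parallel processing) can be illustrated with a small hash-partitioning sketch. The partition count, keys, and hash function below are arbitrary toy choices; Spark and Kafka apply the same idea with murmur-style hashes.

```python
# Hash partitioning: route each record to a partition by key, so every
# record for a given key lands in the same partition and per-key work
# can run in parallel with no cross-partition shuffle.

NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    # Deterministic toy hash: stable across runs, adequate for a demo.
    return sum(key.encode()) % NUM_PARTITIONS

records = [("user_a", 1), ("user_b", 2), ("user_a", 3), ("user_c", 4)]
partitions = {i: [] for i in range(NUM_PARTITIONS)}
for key, value in records:
    partitions[partition_for(key)].append((key, value))

# Per-partition aggregation, as each parallel worker would do locally;
# because a key never spans partitions, the local sums are final.
totals = {}
for part in partitions.values():
    for key, value in part:
        totals[key] = totals.get(key, 0) + value
print(totals["user_a"])  # 4
```

The design point the bullet is after: a partitioning scheme that keeps related records together is what lets aggregation scale out without an expensive shuffle step.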
Preferred Qualifications:
- Data Platforms: Experience with Snowflake
- Streaming Technologies: Kafka or similar message queuing systems
- AI and Machine Learning Deployment: Experience implementing AI solutions (RAG development, agent creation); exposure to ML model integration
- CI/CD for Data: Experience with Jenkins or other DevOps tools for data workflows
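The streaming qualification above (Kafka or similar message queuing) follows a producer/consumer pattern that can be sketched in-process. This is a toy stand-in only: Kafka itself adds durable, partitioned topics and consumer groups, and the names here (`topic`, `offsets`) are illustrative, not Kafka's API.

```python
# Toy in-process sketch of the Kafka-style streaming pattern: a producer
# appends events to a topic, a consumer drains them in arrival order and
# "commits" an offset so processing can resume after a restart.
from collections import deque

topic = deque()              # stand-in for a durable, partitioned topic
offsets = {"consumer-1": 0}  # committed offset per consumer

def produce(event):
    topic.append(event)

def consume(consumer_id, handler):
    # Drain the topic in order, then commit the advanced offset.
    processed = 0
    while topic:
        handler(topic.popleft())
        processed += 1
    offsets[consumer_id] += processed
    return processed

seen = []
for i in range(3):
    produce({"event_id": i})
processed = consume("consumer-1", seen.append)
print(processed, offsets["consumer-1"])  # 3 3
```

Committing the offset only after the handler succeeds is the choice that gives at-least-once delivery; committing before handling trades that for at-most-once.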
Top Skills
Azure Databricks, Python, Scala, Azure Data Factory (ADF), Airflow, SQL, Spark, Hadoop, Azure, GCP, Docker, Kubernetes, Relational Databases, NoSQL, ETL, ELT, GDPR, HIPAA

