Lead design and implementation of scalable ETL/ELT pipelines and data warehousing on cloud platforms. Optimize performance, enforce data quality and governance, enable analytics/ML, mentor engineers, automate deployments, and troubleshoot complex data systems.
Requisition Number: 2345462
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- Design and Build Data Infrastructure: Develop and maintain scalable data pipelines (ETL/ELT) for large-scale data processing (a minimal orchestration sketch follows this list)
- Data Architecture: Lead architectural decisions for data warehousing and big data solutions
- Optimize Performance: Improve database performance, query efficiency, and data workflows
- Data Quality and Governance: Implement frameworks for data validation, monitoring, and compliance
- Collaboration: Work with data scientists, analysts, and business teams to enable analytics and machine learning
- Mentorship: Guide junior engineers and promote best practices in data engineering
- Automation and Deployment: Create automated testing and deployment processes for data systems
- Troubleshooting: Resolve complex data issues and ensure system reliability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
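As an illustration of the pipeline and automation responsibilities listed above, the following is a minimal orchestration sketch assuming Apache Airflow 2.x (one of the tools named in the qualifications); the DAG id, task logic, and sample records are hypothetical placeholders rather than details from this role.

```python
# A minimal ETL orchestration sketch, assuming Apache Airflow 2.x.
# The DAG id, task logic, and sample records are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system (API, database, files).
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]


def transform(ti, **context):
    # Placeholder data-quality gate: drop rows with missing amounts.
    rows = ti.xcom_pull(task_ids="extract")
    return [row for row in rows if row["amount"] is not None]


def load(ti, **context):
    # Placeholder: write the cleaned rows to the warehouse layer.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In practice, the placeholder tasks would typically trigger external services such as Azure Data Factory pipelines or Databricks jobs rather than run the work in-process.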
Required Qualifications:
- Undergraduate degree or equivalent experience
- 6+ years of experience
- Programming: Solid experience with Azure Databricks, Python/Scala, Azure Data Factory (ADF), Airflow, and SQL
- ETL Tools & Big Data: Hands-on experience with Spark, Hadoop, or similar frameworks (a minimal Spark sketch follows this list)
- Cloud Platforms: Experience with Azure (or GCP)
- Containerization & Orchestration: Docker, Kubernetes
- Data Modeling & Warehousing: Expertise in relational and NoSQL databases
- Distributed Systems: Knowledge of data partitioning and parallel processing
- Data Security & Compliance: Familiarity with GDPR, HIPAA, or similar standards
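To make the Spark, data modeling, and distributed-processing requirements above concrete, here is a minimal PySpark sketch; the paths, column names, and quality rules are hypothetical placeholders, not details from this posting.

```python
# A minimal PySpark curation sketch; input/output paths, column names, and the
# quality rules are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curation_sketch").getOrCreate()

# Read raw records from a landing zone (path is a placeholder).
raw = spark.read.parquet("/mnt/raw/claims")

# Simple data-quality gate: require a member id and a non-negative amount,
# and stamp each row with its ingestion date.
clean = (
    raw.filter(F.col("member_id").isNotNull())
       .filter(F.col("claim_amount") >= 0)
       .withColumn("ingest_date", F.current_date())
)

# Partition the curated output by date so downstream queries can prune files.
clean.write.mode("overwrite").partitionBy("ingest_date").parquet("/mnt/curated/claims")
```

Partitioning the curated output by a date column is one common way to support the partition pruning and parallel processing implied by the distributed-systems requirement.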
Preferred Qualifications:
- Data Warehousing: Experience with Snowflake
- Streaming Technologies: Kafka or similar message queuing systems (a minimal consumer sketch follows this list)
- AI and Machine Learning Deployment: Experience implementing AI solutions (RAG development, agent creation); exposure to ML model integration
- CI/CD for Data: Experience with Jenkins or other DevOps tools for data workflows
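For the streaming qualification above, a consumer loop might look roughly like this minimal sketch, assuming the confluent-kafka Python client; the broker address, consumer group, and topic name are hypothetical placeholders.

```python
# A minimal streaming-consumer sketch, assuming the confluent-kafka client;
# the broker address, group id, and topic name are hypothetical placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "example-consumer-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["example-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue  # no message within the poll window
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Hand the payload to a downstream transform/load step.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```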
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Top Skills
Azure Databricks, Python, Scala, Azure Data Factory (ADF), Airflow, SQL, Spark, Hadoop, Azure, GCP, Docker, Kubernetes, Relational Databases, NoSQL, ETL, ELT, GDPR, HIPAA
Similar Jobs at Optum
Artificial Intelligence • Big Data • Healthtech • Information Technology • Machine Learning • Software • Analytics
Lead Azure infrastructure and architecture initiatives for accounting and finance platforms. Build scalable cloud/container environments using Terraform and GitHub Actions, implement security/compliance, set up source control/CI, maintain networking perimeter, advise on cost management, and design data environments using Microsoft Fabric, Databricks, Synapse and ADF.
Top Skills:
Azure, Terraform, GitHub Actions, Azure CLI, Python, Microsoft Fabric, Databricks, Azure Synapse, Azure Data Factory, GCP, AWS, Containers, On-Prem Servers, Infrastructure as Code, Networking, Cybersecurity, Vulnerability Scanning
Artificial Intelligence • Big Data • Healthtech • Information Technology • Machine Learning • Software • Analytics
Manage internal audit and risk consulting for assigned areas, lead financial and operational audits (including SOX), oversee audit teams, develop risk assessments, ensure audit quality and compliance, and support special projects.
Top Skills:
GRC Tools, Data Analytics, Tableau, Power BI, AI, RPA, Risk Surveillance
Artificial Intelligence • Big Data • Healthtech • Information Technology • Machine Learning • Software • Analytics
Prepare month-end close, journal entries, account reconciliations, bank reconciliations, variance analysis, fixed asset accounting, and support preparation of financial statements and notes per local GAAP and regulations.
Top Skills:
ERP, Excel, PeopleSoft
What you need to know about the Pune Tech Scene
Once a far-out concept, AI is now a tangible force reshaping industries and economies worldwide. While its adoption will automate some roles, AI has created more jobs than it has displaced, with an expected 97 million new roles to be created in the coming years. This is especially true in cities like Pune, which is emerging as a hub for companies eager to leverage this technology to develop solutions that simplify and improve lives in sectors such as education, healthcare, finance, e-commerce and more.

