The Senior Data Engineering Consultant will manage the data engineering lifecycle, design scalable data solutions, optimize processes, and contribute to data governance. Requires expertise in cloud data warehousing and various engineering methodologies.
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Primary Responsibilities:
- Accountable for data engineering lifecycle including research, proof of concepts, architecture, design, development, test, deployment and maintenance
- Design, develop, implement and run cross-domain, modular, flexible, scalable, secure, reliable and quality data solutions that transform data for meaningful analyses and analytics while ensuring operability
- Design, develop, implement and run data solutions that improve data efficiency, reliability and quality, and are performant by design
- Layer instrumentation into the development process so that data pipelines can be monitored; measurements are used to detect internal problems before they result in user-visible outages or data quality issues
- Build processes and diagnostic tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues
- Embrace continuous learning of engineering practices to stay current with industry best practices and technology adoption, including DevOps, cloud and Agile thinking
- Drive tech debt reduction and tech transformation, including open-source adoption, cloud adoption, and HCP assessment and adoption
- Contribute to Optum Inner Source and industry communities, and strive to reuse and share components across the organization wherever possible
- Maintain high quality documentation of data definitions, transformations, and processes to ensure data governance and security
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
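The instrumentation responsibility above can be illustrated with a minimal, tool-agnostic sketch in plain Python (the role's actual stack would emit such metrics from Spark/Databricks jobs into a monitoring platform; the stage name, threshold, and metric names below are invented for illustration):

```python
import time

class StageMetrics:
    """Collects simple counters for one pipeline stage so problems
    surface in monitoring before users see bad data."""

    def __init__(self, stage_name, null_rate_threshold=0.05):
        self.stage_name = stage_name
        self.null_rate_threshold = null_rate_threshold
        self.metrics = {}

    def run(self, rows, transform):
        """Apply `transform` to each row, recording row counts,
        duration, and the share of records that failed to parse."""
        start = time.monotonic()
        out = [transform(r) for r in rows]
        nulls = sum(1 for r in out if r is None)
        self.metrics = {
            "stage": self.stage_name,
            "rows_in": len(rows),
            "rows_out": len(out) - nulls,
            "null_rate": nulls / len(out) if out else 0.0,
            "duration_s": time.monotonic() - start,
        }
        # The alert feeds a dashboard or pager, not the end user,
        # so the issue is caught before it becomes a visible outage.
        self.metrics["alert"] = self.metrics["null_rate"] > self.null_rate_threshold
        return [r for r in out if r is not None]

# Example: a parse stage where one record in ten is malformed.
stage = StageMetrics("parse_claims", null_rate_threshold=0.05)
cleaned = stage.run(
    ["10", "20", "x", "40", "50", "60", "70", "80", "90", "100"],
    lambda s: int(s) if s.isdigit() else None,
)
```

Here the 10% failure rate trips the alert while the pipeline still delivers the nine good rows, which is the "detect internally before it's user-visible" pattern the bullet describes.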
Required Qualifications:
- BTech in computer science or a similar field, with a minimum of 10 years' experience in Data Engineering
- Experience in programming using Scala or Python
- Experience in data ingestion, preparation, integration and operationalization techniques to optimally address data requirements
- Experience with cloud data warehouses such as Azure Synapse and Snowflake
- Experience with orchestration tools, Azure DevOps and GitHub
- Experience in building end to end architecture for Data Lakes, Data Warehouses and Data Marts
- Experience in relational data processing technologies such as MS SQL, Delta Lake and Spark SQL
- Experience owning end-to-end development, including coding, testing, debugging and deployment
- Solid knowledge of data engineering solutions and the ability to take full technical ownership of a program
- Quick to learn and adopt new technologies, leveraging them to execute on given use cases and solve business problems
- Applies knowledge of principles and techniques to solve technical problems
- Clearly demonstrates technical capability, in writing, for the use case scenarios presented during evaluation
- Presents multiple options for a given problem statement
- Proven self-starter with a can-do attitude; collaborative and adaptable
- Hands-on ETL/pipeline development using tools such as Azure Databricks/Apache Spark and Azure Data Factory, with development expertise in batch and real-time data integration
- Extensive knowledge of ETL and Data Warehousing concepts, strategies, methodologies
- Ability to provide solutions that are forward-thinking in data and analytics
- Team oriented with solid collaboration, prioritization, and adaptability skills
- Proven excellent written and verbal communication skills, including presentation skills
- Familiarity with Azure services such as Azure Functions, Azure Data Lake Store and Azure Cosmos DB
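The ETL and data-warehousing concepts the qualifications call for can be sketched end to end with Python's built-in sqlite3 module (illustrative only: in the role's stack the warehouse would be Synapse or Snowflake reached through Spark or Data Factory, and the `staging_claims`/`mart_member_spend` tables and columns are hypothetical names for this example):

```python
import sqlite3

# An in-memory database stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE staging_claims (member_id TEXT, amount REAL, status TEXT)"
)

# Extract: a raw batch as it might arrive from a source system.
raw = [("m1", 120.0, "PAID"), ("m1", 80.0, "PAID"), ("m2", 50.0, "DENIED")]
conn.executemany("INSERT INTO staging_claims VALUES (?, ?, ?)", raw)

# Transform + load: aggregate paid claims per member into a mart table,
# the staging -> warehouse -> mart layering named in the qualifications.
conn.execute("""
    CREATE TABLE mart_member_spend AS
    SELECT member_id, SUM(amount) AS total_paid
    FROM staging_claims
    WHERE status = 'PAID'
    GROUP BY member_id
""")

result = dict(conn.execute("SELECT member_id, total_paid FROM mart_member_spend"))
```

The same filter-aggregate-materialize pattern carries over directly to Spark SQL or Synapse; only the engine and table formats (e.g. Delta Lake) change.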
Top Skills
Spark
Azure Cosmos
Azure Data Lake Store
Azure Databricks
Azure Data Factory
Azure DevOps
Azure Functions
Azure Synapse
Delta Lake
Git
MS SQL
Python
Scala
Snowflake
Spark SQL

