
Optum

Senior Data Engineering Consultant- ADB, ADF, Python and SQL

Posted 4 Hours Ago
In-Office
Bangalore, Bengaluru Urban, Karnataka
Senior level
Design, develop, test, and document complex ETL and Spark jobs on Azure Databricks; work with ADF, Airflow, Kafka, and RDBMS; perform technical analysis, data profiling, modeling, and ensure compliance with data security standards while supporting project management and QA/testing automation.
Requisition Number: 2340717
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
  • Create proper technical documentation for work assignments
  • Understand business needs, design programs and systems that meet complex business requirements, and record all specifications involved in the development and coding process
  • Ensure that all standard requirements are met and participate in technical analysis
  • Understand the QA and test automation process
  • Assist the project manager by compiling information from current systems, analyzing program requirements, and ensuring the work meets specified time requirements
  • Resolve moderate problems associated with the designed programs and provide technical guidance on complex programming
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
  • Undergraduate degree or equivalent experience
  • 10+ years of experience with Spark/Scala, Azure Databricks, Azure Data Factory, Airflow and Kafka
  • Proficient in coding, testing, implementing, debugging, and documenting complex Spark jobs
  • Solid hands-on experience setting up Kafka streams and consuming from them
  • Experience with RDBMS such as Snowflake, SQL Server, DB2, etc.
  • Programming: Solid experience in Azure Databricks, Python/Scala, SQL
  • ETL Tools & Big Data: Hands-on experience with Spark, Hadoop, or similar frameworks
  • Cloud Platforms: Azure (or GCP) experience
  • Containerization & Orchestration: Docker, Kubernetes experience
  • Good understanding of Spark Architecture
  • Good understanding of Data Architecture and Azure Cloud
  • Good knowledge of Unix and shell scripting
  • Good knowledge of data analysis, data profiling, and mapping
  • Data Modeling & Warehousing: Expertise in relational and NoSQL databases
  • Distributed Systems: Knowledge of data partitioning and parallel processing
  • Data Security & Compliance: Familiarity with GDPR, HIPAA, or similar standards

Preferred Qualifications:
  • Experience with Snowflake
  • Streaming Technologies: Kafka or similar message queuing systems experience
  • AI and Machine Learning Deployment: Experience implementing AI solutions (RAG development, agent creation); exposure to ML model integration
  • CI/CD for Data: Jenkins or other DevOps tools for data workflows experience
  • BI Tools: Tableau, Power BI for reporting experience
  • Understanding and knowledge of Agile methodologies

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Top Skills

Spark, Scala, Azure Databricks, Azure Data Factory, Airflow, Kafka, Snowflake, SQL Server, DB2, Python, SQL, Hadoop, Azure, GCP, Docker, Kubernetes, Unix, Shell Scripting, NoSQL, Jenkins, Tableau, Power BI, ML/RAG/Agents


