
Optum

Architect - Java and Azure

In-Office
Hyderabad, Telangana
Expert/Leader
The Architect will design and own technical solutions for data-intensive applications, leveraging Java, Azure, and microservices architecture while leading a team of engineers and ensuring high availability and security compliance.
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
  • Design Ownership: Own and define the end-to-end technical design for the Component/Service, ensuring it aligns with our Domain-Driven Design (DDD) and event-driven principles while working with application architects
  • System Design: Design and document data models, API contracts (REST), and event schemas. Create and maintain architectural diagrams including state machines, component diagrams, and sequence diagrams to guide the development team
  • Code Repository Ownership: Ensure high test coverage is maintained using JUnit tests that cover functional scenarios, and that enterprise code-quality standards for security vulnerabilities, readability, and Sonar coverage are adhered to
  • Ensure the team adheres to the standard branching strategy to allow smooth delivery of multiple features in quick succession
  • Tech Asset Ownership: Own the availability of the different tech assets/microservices required to keep the business process running; respond quickly to any service interruptions and production incidents and get them closed within the SLA
  • Functional Expertise: Given the nature of the products you will be working on, it is highly critical that you are passionate about understanding the business perspective and become a subject matter expert in the functional understanding of the overall product offering
  • Data-Intensive Application Design: Design a solution that efficiently consumes, processes, and evaluates high-volume data streams from Apache Kafka, originating from systems like our Common Data Intake service and other domain services in collaboration with architects
  • Database Strategy: Lead the design and implementation of our database strategy using MongoDB Atlas on Azure. This includes schema design, indexing strategies, and leveraging advanced features like the Aggregation Framework for real-time analytics and Atlas Search for fuzzy matching capabilities (an illustrative sketch of this pattern follows this list)
  • Technical Leadership & Mentorship: Guide and mentor a team of talented engineers on best practices for building scalable microservices using Java, Spring Boot, and Kafka. Provide hands-on guidance where needed
  • Cross-Functional Collaboration: Work closely with other architects, product owners, and business stakeholders to translate complex business requirements into a robust, secure, and maintainable technical solution
  • Ensure Non-Functional Requirements: Design for scalability, high availability, data security (HIPAA compliance), and performance. Define SLOs/SLIs and ensure the system is instrumented for effective monitoring and alerting
  • Legacy Integration: Understand the existing data landscape, including data stores like Hive and processing jobs in Spark, to ensure seamless integration and migration paths
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
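
For context on the Kafka consumption and Atlas Search fuzzy-matching responsibilities above, here is a minimal, illustrative Java/Spring Boot sketch. The topic name "member-events", the "members" collection, the "fullName" field, and the "member_search" index are hypothetical placeholders, not details of Optum's actual services.

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

import java.util.ArrayList;
import java.util.List;

@Component
public class MemberEventListener {

    private final MongoTemplate mongoTemplate;

    public MemberEventListener(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // Consume high-volume events from a hypothetical Kafka topic and persist each
    // payload as a MongoDB document (assumes String deserialization is configured).
    @KafkaListener(topics = "member-events", groupId = "member-service")
    public void onMessage(String payload) {
        mongoTemplate.getCollection("members").insertOne(Document.parse(payload));
    }

    // Fuzzy name matching via a MongoDB Atlas Search $search aggregation stage;
    // requires an Atlas Search index, assumed here to be named "member_search".
    public List<Document> fuzzyFindByName(String name) {
        Document searchStage = new Document("$search",
                new Document("index", "member_search")
                        .append("text", new Document("query", name)
                                .append("path", "fullName")
                                .append("fuzzy", new Document("maxEdits", 2))));
        Document limitStage = new Document("$limit", 10);
        return mongoTemplate.getCollection("members")
                .aggregate(List.of(searchStage, limitStage))
                .into(new ArrayList<>());
    }
}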

Required Qualifications:
  • 10+ years of professional software engineering experience with a focus on building enterprise-scale backend systems
  • 4+ years of experience with a proven track record of designing and delivering complex software projects
  • Event-driven Architecture: Proven experience designing and implementing solutions using Apache Kafka for high-throughput, real-time data streaming
  • Deep expertise in the Java Ecosystem: Mastery of Java and the Spring Boot framework for building RESTful APIs and microservices
  • Expert Database Design Skills:
    • Solid hands-on experience with NoSQL document databases, particularly MongoDB, including schema design and performance tuning
    • Solid foundation in relational databases (SQL) and an understanding of when to use each
  • Data Processing Knowledge: Experience working within a modern data ecosystem that includes technologies like Apache Spark and Hive
  • API Design Mastery: Extensive experience designing, developing, and documenting secure, scalable REST APIs
  • Microsoft Azure Knowledge: Experience working with MS Azure to provide cloud services for infrastructure, compute, and databases

Preferred Qualifications:
  • Search Engine Experience: Hands-on experience with Elasticsearch or, even better, MongoDB Atlas Search for implementing complex search and fuzzy matching logic
  • Domain-Driven Design (DDD): Practical experience applying DDD principles (Bounded Contexts, Aggregates, Events) to build maintainable and business-aligned software
  • Cloud Platform Expertise: Experience architecting and deploying applications on a major cloud provider, with a solid preference for Microsoft Azure
  • CI/CD and DevOps: Experience working in a mature DevOps environment with CI/CD pipelines (e.g., Jenkins, Azure DevOps), containerization (Docker, Kubernetes), and Infrastructure as Code (Terraform)
  • System Observability: Experience with monitoring and observability stacks such as Prometheus, Grafana, or Dynatrace
  • Healthcare Domain Knowledge: Familiarity with the healthcare industry, particularly related to risk adjustment (HCCs), quality measures (HEDIS), and PHI/HIPAA compliance

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Top Skills

Apache Kafka
Azure DevOps
Docker
Java
Jenkins
Kubernetes
Azure
MongoDB
Spring Boot
SQL
