
Uniphore

Senior Software Engineer

In-Office
Bangalore, Bengaluru Urban, Karnataka
Senior level

Uniphore is one of the largest B2B AI-native companies: decades-proven, built for scale, and designed for the enterprise. The company drives business outcomes across multiple industry verticals and enables the largest global deployments.
  
Uniphore infuses AI into every part of the enterprise that impacts the customer. We deliver the only multimodal architecture centered on customers that combines Generative AI, Knowledge AI, Emotion AI, workflow automation and a co-pilot to guide you. We understand better than anyone how to capture voice, video and text and how to analyze all types of data.  
  
As AI becomes more powerful, every part of the enterprise that impacts the customer will be disrupted. We believe the future will run on the connective tissue between people, machines and data: all in the service of creating the most human processes and experiences for customers and employees.   

Job Description:
 

Job Overview 

As a Senior Software Engineer at Uniphore, you’ll help build and evolve the data platform and AI capabilities at the heart of our product offerings. You’ll work closely with engineers, product managers, and AI/ML teams to deliver scalable, reliable, high-performance systems in the cloud—especially for unstructured data, RAG, and agentic AI workflows. 

 

Key Responsibilities 

  • Build and enhance a scalable data engineering platform across cloud providers (AWS/GCP/Azure). 

  • Design and implement distributed data applications using Spark, Databricks, and/or Snowflake. 

  • Develop and ship AI-driven applications, including RAG pipelines and agentic workflows (tool orchestration, multi-step execution). 

  • Implement robust processing for unstructured data (documents, PDFs, transcripts, chat logs), including extraction, enrichment, and indexing for downstream analytics/AI. 

  • Write clean, maintainable, and efficient code aligned with engineering best practices. 

  • Participate in the full SDLC: requirements, design, development, testing, and release. 

  • Troubleshoot, debug, and optimize existing services for performance, reliability, and scalability. 

  • Collaborate with cross-functional teams (AI/ML, product, UX) to translate business needs into technical solutions. 

  • Ensure adherence to security and data privacy standards when working with sensitive customer data. 

  • Contribute to and improve CI/CD pipelines, deployment automation, and code quality processes. 
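To make the RAG responsibilities above concrete, here is a minimal, hedged sketch of the chunk/embed/index/retrieve loop at the core of such pipelines. This is a toy illustration only: it uses bag-of-words vectors and cosine similarity in place of the learned embedding models and vector databases a production system at Uniphore would use; all function names are illustrative, not part of any Uniphore API.

```python
import math
import re
from collections import Counter

def chunk(text: str, size: int = 8) -> list[str]:
    """Split text into fixed word windows (real pipelines use smarter splitters)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(chunk_text: str) -> Counter:
    """Toy bag-of-words 'embedding'; production systems use learned embeddings."""
    return Counter(re.findall(r"[a-z0-9]+", chunk_text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, index: list[tuple[str, Counter]], k: int = 2) -> list[str]:
    """Rank indexed chunks by similarity to the query and return the top k."""
    q = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Build the index from a document, then retrieve context for a query.
doc = ("Spark jobs read transcripts from cloud storage. "
       "Embeddings are stored in a vector index. "
       "Agents orchestrate multi step workflows with tools.")
index = [(c, embed(c)) for c in chunk(doc)]
context = retrieve("How are embeddings stored?", index, k=1)
```

In a real pipeline, the retrieved context is then passed to a generative model; the evaluation and guardrail steps mentioned under Preferred Skills wrap around this retrieval core.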

 

Required Skills and Experience 

  • Bachelor’s/Master’s degree in Computer Science, IT, or equivalent practical experience. 

  • 3–5 years of professional software development experience. 

  • Strong proficiency in Java and Python, with experience building APIs/services. 

  • Experience with frameworks such as Spring Boot or Vert.x. 

  • Working knowledge of databases such as Postgres, MongoDB, or MySQL. 

  • Experience with one or more cloud platforms: AWS, GCP, or Azure. 

  • Strong problem-solving and debugging skills; ability to work independently and drive deliverables. 

  • Familiarity with software engineering best practices: version control, code reviews, testing/TDD. 

  • Strong written and verbal communication skills; comfortable in a fast-paced environment. 

 

Preferred Skills 

  • Hands-on experience with Spark or managed Spark (e.g., Dataproc, Databricks). 

  • Familiarity with orchestration tools such as Airflow. 

  • Experience with cloud data warehouses like Snowflake or BigQuery. 

  • Experience building/operating RAG systems (chunking, embeddings, vector search, evaluation, guardrails). 

  • Familiarity with agentic/agent frameworks for tool use, orchestration, and multi-step workflows. 

  • Experience with unstructured data processing pipelines and search/indexing patterns. 

  • Familiarity with containers and Kubernetes. 

  • Exposure to DevOps tools like Jenkins and CI/CD workflows. 

  • Basic Linux fluency. 

  • Familiarity with JavaScript/TypeScript. 

  • Prior exposure to AI/ML implementation projects. 

 


 


Location preference:

India - Bangalore


Uniphore is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.
 
For more information on how Uniphore uses AI to unify—and humanize—every enterprise experience, please visit www.uniphore.com.

Top Skills

Java, Python, Spark, Databricks, Snowflake, Spring Boot, Vert.x, Postgres, MongoDB, MySQL, AWS, GCP, Azure, Dataproc, Airflow, BigQuery, RAG, Agent Frameworks, Vector Search, Embeddings, Kubernetes, Jenkins, Linux, JavaScript, TypeScript


