Hippocratic AI, led by CEO Munjal Shah, has partnered with Nvidia to develop conversational AI healthcare agents with faster response times. The collaboration aims to build super-low-latency response capabilities into Hippocratic AI’s large language models (LLMs) so that interactions feel more conversational and empathetic. By leveraging Nvidia’s technology stack for LLM inference, speech synthesis, and voice recognition, the company seeks to cut the delay between a patient’s words and the agent’s reply, which is crucial for patient interactions. User tests have shown that reducing response times significantly enhances patients’ emotional engagement with AI agents, with notable improvements in perceived empathy and comfort.
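The latency gains come largely from pipelining: rather than waiting for a complete LLM reply before synthesizing speech, a voice agent can hand partial output to text-to-speech as soon as a sentence is ready. The Python sketch below illustrates this idea only; the streaming LLM and TTS calls are stand-in placeholders, not Hippocratic AI’s or Nvidia’s actual components.

```python
import asyncio
import time
from typing import AsyncIterator

# Placeholder LLM stream: stands in for a real streaming inference endpoint.
async def stream_llm_tokens(prompt: str) -> AsyncIterator[str]:
    for token in ["Hi, ", "this is ", "your care ", "agent. ",
                  "How are ", "you feeling ", "today?"]:
        await asyncio.sleep(0.05)  # simulated per-token generation delay
        yield token

# Placeholder TTS: stands in for a streaming speech-synthesis service.
async def synthesize_and_play(text_chunk: str) -> None:
    await asyncio.sleep(0.02)  # simulated synthesis/playback handoff
    print(f"[speaking] {text_chunk!r}")

async def respond(prompt: str) -> None:
    """Forward LLM output to TTS sentence by sentence instead of waiting
    for the full reply, so audio starts while the model is still generating."""
    start = time.perf_counter()
    buffer = ""
    first_audio_at = None
    async for token in stream_llm_tokens(prompt):
        buffer += token
        # Flush on sentence-like boundaries so TTS can begin early.
        if buffer.rstrip().endswith((".", "?", "!")):
            if first_audio_at is None:
                first_audio_at = time.perf_counter() - start
            await synthesize_and_play(buffer)
            buffer = ""
    if buffer:
        if first_audio_at is None:
            first_audio_at = time.perf_counter() - start
        await synthesize_and_play(buffer)
    print(f"time to first audio: {first_audio_at:.2f}s")

asyncio.run(respond("Check in with the patient about today's medication."))
```

In a pipeline like this, the perceived response time is driven by time-to-first-audio rather than total generation time, which is why streaming every stage matters.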
The partnership also focuses on ensuring the safety and reliability of AI healthcare agents. Hippocratic AI has conducted tests showing that its LLMs outperform other LLMs and human healthcare workers on critical safety tasks. The collaboration uses Nvidia’s low-latency inference stack, H100 Tensor Core GPUs, and NIM microservices to support the development and deployment of AI agents. Hippocratic AI aims to address healthcare industry challenges, such as staffing shortages and uneven access to quality care, while improving patient outcomes. The initiative highlights the potential of voice-based digital agents powered by generative AI to transform healthcare by providing human-like interactions.
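NIM microservices package models behind an OpenAI-compatible API, so an application can stream tokens from a deployed model and feed them to downstream speech synthesis. Below is a minimal, hedged sketch of such a call; the base URL, model name, API key, and prompts are illustrative placeholders and do not describe Hippocratic AI’s actual deployment.

```python
from openai import OpenAI

# Base URL, model id, and API key are illustrative placeholders; a self-hosted
# NIM typically exposes an OpenAI-compatible endpoint under /v1.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

stream = client.chat.completions.create(
    model="meta/llama3-8b-instruct",  # example model id, not the product's actual model
    messages=[
        {"role": "system", "content": "You are a post-discharge follow-up assistant."},
        {"role": "user", "content": "I just got home from the hospital. What should I do first?"},
    ],
    stream=True,  # stream tokens so speech synthesis can start before generation finishes
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

Streaming the completion (rather than waiting for the full response body) keeps the time-to-first-token low, which is the quantity that ultimately shapes how conversational the agent feels.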