
HHEM: expanded language support

Blog post from Vectara

Post Details

Company: Vectara
Date Published: -
Author: Matt Gonzales and Forrest Bao
Word Count: 488
Language: English
Hacker News Points: -
Summary

Vectara's HHEM has expanded its language support from three languages to eight: in addition to English, German, and French, it now covers Portuguese, Spanish, Arabic, Simplified Chinese, and Korean. This enables evaluating hallucinations across a wider range of languages without translation workarounds. The expansion supports Vectara's goal of fostering trust in AI systems: HHEM is designed to identify instances where large language models generate content that is not grounded in their source data. Broader language support also reduces friction for global teams and creates opportunities for collaboration, making HHEM a practical tool for businesses operating in multilingual environments.

The release also expands the context window to 16k tokens and reduces latency, with specific performance metrics given in the post, further improving the model's operational efficiency. These advancements reflect Vectara's commitment to empowering businesses and teams by pushing the boundaries of Retrieval-Augmented Generation applications and building AI systems that inspire confidence across languages and industries.
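
The post describes the hosted HHEM that ships with Vectara's platform. As a rough illustration of what hallucination scoring looks like in practice, the sketch below uses the open-weights HHEM checkpoint that Vectara publishes on Hugging Face (vectara/hallucination_evaluation_model). The example pairs are hypothetical, and the assumption that this open checkpoint mirrors the multilingual behavior described in the post is just that, an assumption; the hosted model's API and exact language coverage may differ.

```python
# Minimal sketch: scoring (source, generated) pairs for factual consistency
# with the open-weights HHEM checkpoint on Hugging Face. Illustration only;
# the hosted HHEM described in the post is accessed through Vectara's
# platform and may behave differently.
from transformers import AutoModelForSequenceClassification

# Each pair is (source passage, model-generated text). Example pairs are
# hypothetical. Scores near 1.0 mean the generated text is consistent with
# the source; scores near 0.0 suggest hallucination.
pairs = [
    # English: faithful restatement of the source
    ("The capital of France is Paris.", "Paris is France's capital."),
    # Spanish: the generated claim contradicts the source
    ("La capital de Francia es París.", "La capital de Francia es Berlín."),
]

model = AutoModelForSequenceClassification.from_pretrained(
    "vectara/hallucination_evaluation_model", trust_remote_code=True
)

scores = model.predict(pairs)  # note: predict(), not model(pairs)
for (source, generated), score in zip(pairs, scores):
    print(f"{float(score):.3f}  source={source!r}  generated={generated!r}")
```

In a RAG pipeline, a consistency score like this can be used to gate or re-rank generated answers before they reach users, which is the use case the post emphasizes.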