How to Work With Long Term Memory In Oobabooga and Text Generation
Blog post from RunPod
Models loaded in Oobabooga's text generation UI are typically limited to a 2048-token context window, so the bot forgets details from earlier in the conversation once they scroll out of that window. This limitation can be mitigated with the Long Term Memory extension by wawawario2, which lets Oobabooga keep a database of memories that are surfaced contextually based on recent inputs, so the bot can recall relevant information from much earlier in the chat.

Installation involves setting up a standard Oobabooga Text Generation UI pod on RunPod, then installing and enabling the extension through the pod's Terminal and the Extensions tab. Once enabled, the interface shows a new panel listing the prior memories in the database, with up to two additional memories loaded into the context at once. Users can also use the Character pane to pin essential details, much like a character sheet in a tabletop RPG, so crucial information is never dropped.

Even with these enhancements, the models running in Oobabooga still have far less working context than a human roleplayer, but with careful management and some manual input, users can significantly improve the bot's recall and overall performance.
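To make the retrieval idea concrete, here is a minimal Python sketch of the general pattern, not the extension's actual code: store past messages outside the prompt, score them against the newest input, and splice the best matches back into the context alongside the character sheet. All names here (MemoryStore, build_prompt, the word-overlap scoring) are hypothetical illustrations, not part of the wawawario2 extension.

```python
from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    """Toy long-term memory: stores past messages and surfaces the ones
    that overlap most with the latest user input."""
    memories: list[str] = field(default_factory=list)

    def add(self, text: str) -> None:
        self.memories.append(text)

    def recall(self, query: str, limit: int = 2) -> list[str]:
        # Score each stored memory by how many words it shares with the query.
        query_words = set(query.lower().split())
        scored = [
            (len(query_words & set(m.lower().split())), m) for m in self.memories
        ]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [m for score, m in scored[:limit] if score > 0]


def build_prompt(character_sheet: str, recalled: list[str], recent_chat: str) -> str:
    """Assemble the prompt: fixed character details, any recalled memories,
    then the recent conversation that still fits in the context window."""
    memory_block = "\n".join(f"[Memory] {m}" for m in recalled)
    return f"{character_sheet}\n{memory_block}\n{recent_chat}"


if __name__ == "__main__":
    store = MemoryStore()
    store.add("The user's cat is named Biscuit and hates thunderstorms.")
    store.add("The user lives in Portland and works as a baker.")

    latest = "My cat hid under the bed during the storm last night."
    print(build_prompt("You are a friendly assistant.", store.recall(latest), latest))
```

The real extension scores and stores memories far more robustly than simple word overlap, but the prompt-assembly step illustrates the key trade-off: recalled memories consume context tokens just like everything else, which is presumably why only a couple are loaded at a time.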