
Meet ChatDB: A Framework that Augments LLMs with Symbolic Memory in the Form of Databases

by Aneesh Tickoo

Large language models such as GPT-4 and PaLM 2 have become a crucial part of contemporary AI systems, revolutionizing natural language processing and transforming a range of sectors. Despite great advances in comprehending input and producing contextually appropriate replies, LLMs still have notable drawbacks. One of the key problems is that multi-turn interactions with a language model generate far more tokens than the model's input limit allows; GPT-4, for instance, is limited to 32,000 tokens. The LLM must retain contextual information throughout the interaction and produce replies based on the accumulated information.

Simply concatenating all contextual information and cramming it into the model, however, can easily exceed its processing capacity and compound errors, causing it to lose track of the conversation and produce less accurate responses. Neural memory mechanisms have been explored to overcome this limited-token-input issue: the memory component serves as a storage and retrieval system for relevant information from previous interactions. However, augmenting LLMs with conventional neural memory usually leads to difficulties in storing, retrieving, and manipulating historical information, especially for tasks requiring complex multi-hop reasoning.
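To make the contrast concrete, here is a minimal sketch of the kind of similarity-based lookup that conventional neural memory performs. The `embed` function is a hypothetical stand-in for any sentence-embedding model, and the class is illustrative rather than any specific system's implementation:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical stand-in for a sentence-embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

class VectorMemory:
    """Conventional neural memory: store embeddings, retrieve by cosine similarity."""

    def __init__(self):
        self.texts, self.vectors = [], []

    def write(self, text: str) -> None:
        self.texts.append(text)
        self.vectors.append(embed(text))

    def read(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        sims = [v @ q / (np.linalg.norm(v) * np.linalg.norm(q)) for v in self.vectors]
        top = np.argsort(sims)[::-1][:k]
        # Retrieval is fuzzy: nearest neighbors may miss the exact record needed,
        # and there is no way to update or aggregate the stored facts symbolically.
        return [self.texts[i] for i in top]
```

Everything is stored as an unstructured blob and fetched by approximate similarity, which is exactly the failure mode the next paragraph describes.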

There are two primary reasons: such memories do not retain historical data in a structured form, and they cannot manipulate it symbolically, since they all rely on vector similarity computations, which can be inaccurate and cause errors to accumulate. Researchers from Tsinghua University, the Beijing Academy of Artificial Intelligence, and Zhejiang University advocate using databases as a novel symbolic memory for LLMs to solve these problems. The full framework is called ChatDB. Figure 1 below depicts the two parts that make up ChatDB: an LLM controller and its memory. The LLM controller, which can be any widely used LLM, manages the read and write operations to the memory.

The memory, which can be symbolic, non-symbolic, or a hybrid of the two, is responsible for keeping track of history and supplying data as needed to help the LLM respond to user input. ChatDB emphasizes using databases as symbolic memory, enabling structured storage of historical data through the execution of a symbolic language, namely SQL statements, which are generated by the LLM. A database is well suited as symbolic memory in scenarios that require exact recording, updating, querying, deletion, and analysis of historical data. A store manager, for instance, has to keep track of daily sales figures, so matrices or plain text are inappropriate as memory.

A database used as external symbolic memory, by contrast, is highly appropriate: SQL statements perform precise operations such as insertion, deletion, update, and selection. Using databases as external symbolic memory therefore guarantees correctness and efficiency in managing and manipulating historical data, considerably improving the performance of LLMs in scenarios that demand highly accurate, long-horizon data recording and processing. Within the ChatDB framework, the authors also propose the chain-of-memory strategy to exploit the external symbolic memory more effectively, further boosting the LLM's capacity for reasoning.
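As a rough illustration of why a database fits the store-sales scenario, the sketch below uses SQLite as the external symbolic memory; the schema and queries are invented for this example and are not ChatDB's actual implementation:

```python
import sqlite3

# Symbolic memory: an ordinary relational database addressed with SQL.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (day TEXT, product TEXT, quantity INTEGER, price REAL)")

# Precise insertion of historical records (no lossy vector encoding).
db.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?)",
    [("2023-06-01", "apple", 30, 0.5),
     ("2023-06-01", "milk", 12, 1.2),
     ("2023-06-02", "apple", 25, 0.5)],
)

# Exact update (or deletion) is a single SQL statement.
db.execute("UPDATE sales SET quantity = 28 WHERE day = '2023-06-02' AND product = 'apple'")

# Exact analysis: aggregate revenue per day, something fuzzy
# similarity-based retrieval cannot guarantee.
for day, revenue in db.execute(
    "SELECT day, SUM(quantity * price) FROM sales GROUP BY day ORDER BY day"
):
    print(day, round(revenue, 2))
```

In ChatDB, statements like these would be written by the LLM controller rather than by hand.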

Figure 1: Overview of the ChatDB framework. The LLM controller manages read and write operations to the memory; the memory retains historical data and supplies pertinent records for responding to user input. ChatDB emphasizes augmenting LLMs with databases as their symbolic memory.

The chain-of-memory technique converts user input into a sequence of intermediate memory operation steps that together produce the desired result. By dividing a complex problem into multiple memory operation steps, it considerably reduces the difficulty of solving it; each intermediate step in ChatDB consists of one or more SQL statements. ChatDB makes several contributions to the field of LLMs. First, it adds databases to LLMs as external symbolic memory, allowing structured archiving of historical data and enabling symbolic, complex manipulation of that data through SQL statements.

Second, the chain-of-memory technique transforms user input into multi-step intermediate memory operations, enabling effective memory manipulation; this improves ChatDB's efficiency and lets it handle complex, multi-table database interactions with greater precision and stability. Finally, the experiments show that equipping LLMs with symbolic memory improves multi-hop reasoning and reduces error accumulation, allowing ChatDB to outperform ChatGPT on a synthetic dataset.
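A minimal sketch of what a chain-of-memory loop might look like appears below. Here `llm_next_step` is a hypothetical stand-in for a call to the LLM controller, with a hard-coded plan; in a real system each SQL statement would be generated from the user request and the results of the previous steps:

```python
import sqlite3

def llm_next_step(user_request: str, history: list[tuple[str, list]]) -> str | None:
    """Hypothetical stand-in for the LLM controller: given the request and the
    results of previous memory operations, return the next SQL statement, or
    None when the chain is complete. Hard-coded here for illustration."""
    plan = [
        "SELECT id FROM customers WHERE name = 'Alice'",
        "SELECT SUM(quantity * price) FROM orders WHERE customer_id = 1",
    ]
    return plan[len(history)] if len(history) < len(plan) else None

def chain_of_memory(db: sqlite3.Connection, user_request: str) -> list:
    """Decompose a request into intermediate SQL steps and execute them in order."""
    history: list[tuple[str, list]] = []
    while (sql := llm_next_step(user_request, history)) is not None:
        rows = db.execute(sql).fetchall()  # one symbolic memory operation
        history.append((sql, rows))        # results inform the next step
    return history

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.execute("CREATE TABLE orders (customer_id INTEGER, quantity INTEGER, price REAL)")
db.execute("INSERT INTO customers VALUES (1, 'Alice')")
db.execute("INSERT INTO orders VALUES (1, 3, 2.0), (1, 1, 5.0)")

for sql, rows in chain_of_memory(db, "How much has Alice spent in total?"):
    print(sql, "->", rows)
```

Breaking the question into two exact lookups, rather than one fuzzy retrieval, is what gives the approach its multi-hop precision.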

Check out the Paper and Project. Don’t forget to join our 23k+ ML SubReddit, Discord Channel, and Email Newsletter, where we share the latest AI research news, cool AI projects, and more. If you have any questions regarding the above article or if we missed anything, feel free to email us at Asif@marktechpost.com

Check out 100s of AI Tools in AI Tools Club

The post Meet ChatDB: A Framework that Augments LLMs with Symbolic Memory in the Form of Databases appeared first on MarkTechPost.

