Q: It’s nice to have features and add-ons, but I really need to know how well these bots embed, recall, and learn.
I’m working with large amounts of information, possibly as much as 100-500 MB of data my bots will need to be trained on, so what I really need to know is:
1. How do you implement conversational memory for your chatbot? What type of memory do you use and how do you manage the token limits of the underlying model?
2. How do you measure the knowledge retention rate, knowledge recall rate, and knowledge fidelity rate of your chatbot? Can you provide some examples or benchmarks of how your chatbot performs on these metrics?
3. How do you handle file uploads for your chatbot? What formats and sizes do you support and how do you parse and embed the data from the files?
4. How do you ensure the security and privacy of the data that you upload and embed in your chatbot? How do you protect the data from unauthorized access or misuse?
5. How do you update and maintain your chatbot with the latest data and information? How often do you refresh the data and how do you notify the users of any changes?
ChatofAI
May 15, 2024

A: Thank you for your feedback. Here are our responses to the questions:
1. How do you implement conversational memory for your chatbot? What type of memory do you use and how do you manage the token limits of the underlying model?
Regarding conversational memory, we believe it's important for the chatbot to keep the previous few rounds of conversation in context so it can maintain a smooth exchange with customers.
We use Pinecone as our vector store to enable similarity search; we don't store users' raw data there.
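A "previous few rounds" memory like the one described above is typically a sliding window trimmed to a token budget. The sketch below is illustrative, not ChatofAI's actual implementation: the 4,096-token budget and the characters-per-token heuristic are assumptions (a production system would use the model's own tokenizer).

```python
class ConversationMemory:
    """Keep the most recent conversation turns within a token budget."""

    def __init__(self, max_tokens: int = 4096):
        # Assumed budget; the real limit depends on the underlying model.
        self.max_tokens = max_tokens
        self.turns: list[tuple[str, str]] = []  # (role, text)

    @staticmethod
    def estimate_tokens(text: str) -> int:
        # Rough heuristic: ~4 characters per token for English text.
        return max(1, len(text) // 4)

    def add(self, role: str, text: str) -> None:
        self.turns.append((role, text))
        # Drop the oldest turns until the history fits the budget.
        while self.turns and self._total_tokens() > self.max_tokens:
            self.turns.pop(0)

    def _total_tokens(self) -> int:
        return sum(self.estimate_tokens(t) for _, t in self.turns)

    def context(self) -> str:
        # Rendered history to prepend to the next prompt.
        return "\n".join(f"{role}: {text}" for role, text in self.turns)
```

Each new turn is appended and the oldest turns are evicted first, so the window always holds the most recent rounds that fit under the limit.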
2. How do you measure the knowledge retention rate, knowledge recall rate, and knowledge fidelity rate of your chatbot? Can you provide some examples or benchmarks of how your chatbot performs on these metrics?
We haven't benchmarked knowledge retention rate, knowledge recall rate, or knowledge fidelity rate specifically, since these metrics depend largely on the well-established embedding models and similarity-retrieval algorithms we build on.
Our primary focus is on extracting readable, accurate text from various files and web pages so that the AI's responses are precise.
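For readers who want to measure retrieval quality themselves, a minimal recall@k evaluation over a labeled set of (question, expected document) pairs can be sketched as follows. The `retrieve` callable is a hypothetical stand-in for any similarity search backend (e.g. a Pinecone query), not part of ChatofAI's product:

```python
def recall_at_k(eval_set, retrieve, k: int = 5) -> float:
    """Fraction of questions whose expected document appears in the top-k results.

    eval_set: list of (question, expected_doc_id) pairs.
    retrieve: callable(question, k) -> list of doc ids, best match first.
    """
    if not eval_set:
        return 0.0
    hits = sum(1 for question, expected in eval_set
               if expected in retrieve(question, k))
    return hits / len(eval_set)
```

Running this against a hand-labeled evaluation set at several values of k gives a concrete baseline for the recall questions raised above.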
3. How do you handle file uploads for your chatbot? What formats and sizes do you support and how do you parse and embed the data from the files?
Currently, we support file uploads in DOCX, PDF (including scanned PDFs), Markdown, and TXT formats, as well as web page links. The maximum upload size for a single file is 5 MB.
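After text is extracted from an uploaded file, a typical next step is splitting it into overlapping chunks before embedding. This is a generic sketch of that step, not ChatofAI's pipeline; the chunk size and overlap values are illustrative assumptions:

```python
def chunk_text(text: str, chunk_size: int = 800, overlap: int = 100) -> list[str]:
    """Split extracted document text into overlapping chunks for embedding.

    The overlap keeps sentences that straddle a chunk boundary retrievable
    from either neighboring chunk.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

Each chunk would then be embedded and written to the vector store, with the source file recorded as metadata so answers can cite where they came from.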
4. How do you ensure the security and privacy of the data that you upload and embed in your chatbot? How do you protect the data from unauthorized access or misuse?
For detailed information on data security and privacy, please refer to our website: https://chatof.ai/privacy
5. How do you update and maintain your chatbot with the latest data and information? How often do you refresh the data and how do you notify the users of any changes?
Updating the chatbot with the latest data and information currently requires manual updates, by re-uploading files or re-crawling web pages.
We don't have an automated system in place for data refreshes, and we don't yet have a mechanism for notifying users about changes.