RAG-Based Chatbot Development

Project Overview

The RAG-based chatbot answers user queries about documents uploaded by an admin. The client wanted to automate users' document-related queries, so CoreFragment developed a custom chatbot that supports up to 500 documents at a time, with a 40k-token in-memory storage capacity for context retrieval.

Client Region

Europe

Industry

AI and ML

Use Cases:

  • Improved user experience, since the bot clears users' doubts regardless of admin availability
  • Faster access to information
  • Reduced need to read whole documents
  • Frequent user questions about a particular document can be resolved by the bot

Development Insights:

  • Documents are split into chunks, and the embeddings of those chunks are stored in a vector database.
  • When a user enters a query, it is embedded and matched against the stored chunks through semantic search; the most relevant chunks are retrieved and passed to the LLM as context for generating the response.
  • Both the query and the response are kept in memory, up to 40k - 50k tokens.
  • The LLM reuses this stored conversation as additional context when answering follow-up queries (see the sketch after this list).
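
The sketch below illustrates this pipeline, assuming a LangChain-style stack with Ollama for embeddings and generation (both appear in the technology list) and FAISS as the vector store. Import paths, model names ("nomic-embed-text", "llama3"), the sample file name, and the chunking parameters are illustrative assumptions, not the actual implementation, and LangChain import paths vary by version.

```python
# Minimal sketch of the RAG pipeline described above (assumed stack and parameters).
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OllamaEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import Ollama
from langchain.memory import ConversationTokenBufferMemory
from langchain.chains import ConversationalRetrievalChain

# 1. Split admin-uploaded documents into chunks and store their embeddings
#    in a vector database.
docs = PyPDFLoader("policy_manual.pdf").load()            # hypothetical document
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100                    # assumed chunking settings
).split_documents(docs)

embeddings = OllamaEmbeddings(model="nomic-embed-text")   # assumed embedding model
vector_store = FAISS.from_documents(chunks, embeddings)

# 2. Keep the running conversation (queries and responses) in a token-bounded
#    buffer, mirroring the 40k-token memory mentioned above.
llm = Ollama(model="llama3")                              # assumed chat model
memory = ConversationTokenBufferMemory(
    llm=llm,
    max_token_limit=40_000,
    memory_key="chat_history",
    return_messages=True,
)

# 3. On each user query: embed the query, run a semantic search over the stored
#    chunks, and let the LLM answer from the retrieved context plus chat history.
chatbot = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=vector_store.as_retriever(search_kwargs={"k": 4}),
    memory=memory,
)

print(chatbot.invoke({"question": "What is the refund policy described in the manual?"}))
```

Because the memory is token-bounded, older turns are dropped once the 40k limit is reached, so follow-up questions stay grounded in recent context without exhausting the model's context window.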

Technology Platforms

  • LangChain
  • AWS
  • LlamaIndex
  • Pandas
  • Streamlit
  • Ollama