this post was submitted on 01 Dec 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


I am planning to build a retrieval-augmented generation (RAG) chatbot to look up information from documents (Q&A).

I tried it with GPT-3.5 and it works pretty well. Now I want to try Llama (or one of its variants) on a local machine. The idea is to get by with a smaller model (7B or 13B) by providing good enough context from the documents for it to generate the answer.
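The retrieve-then-prompt pattern described above can be sketched in plain Python. Note the "embedding" here is a toy bag-of-words vector purely for illustration, and the final generation step is left as a placeholder; a real setup would use a proper embedding model and a locally running 7B/13B Llama (the chunk contents below are made-up examples):

```python
# Minimal sketch of the retrieve-then-prompt pattern behind a RAG chatbot.
# Toy embedding (word counts) stands in for a real embedding model;
# the resulting prompt would be sent to a local Llama 7B/13B for generation.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy 'embedding': lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k document chunks most similar to the question."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Pack the retrieved chunks into the context window of the local model."""
    context = "\n\n".join(chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

# Example document chunks (invented for illustration).
docs = [
    "Llama 2 7B fits on a single consumer GPU when quantized.",
    "The office kitchen is on the third floor.",
    "RAG retrieves relevant chunks and adds them to the prompt.",
]
question = "How does RAG provide context to the model?"
top = retrieve(question, docs)
prompt = build_prompt(question, top)
# `prompt` is what you would feed to the local Llama model.
```

The design point is that the model itself stays small: retrieval quality (chunking, embeddings, top-k) does most of the work, and the model only has to synthesize an answer from the context it is handed.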

Has anyone done this before? Any comments?

Thanks!
