LLM Embeddings Search Engine

An LLM search engine that uses LangChain to search documents with locally trained models

Description

Large language models (LLMs) such as ChatGPT have changed how people interact with artificial intelligence: users now engage with AI in a direct, often conversational format that was uncommon before. This shift has created a clear use case for businesses: answering questions about a company without involving a human. Doing so frees human labor for more productive tasks, gives users faster responses, and ideally produces more accurate answers.

What slows companies from adopting this kind of interface is the time and cost of training an LLM. However, smaller models can be trained in a focused fashion on a small collection of documents. In this project, two such models are trained: one on the course syllabus and one on the National Football League rulebook. A user can then ask the model questions specific to the domain of the document it was trained on, providing a convenient way to get information quickly (see the sketch below).
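
The following is a minimal sketch of how a document-focused question-answering pipeline like this might be assembled with LangChain. The file name, chunking parameters, OpenAI models, and FAISS vector store are assumptions for illustration, not the components used in this repository.

```python
# Sketch: embed a single document and answer questions over it with LangChain.
# Assumes an OpenAI API key is configured; file names and parameters are hypothetical.
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI

# Load one source document (e.g., a course syllabus or a rulebook) and split it into chunks.
docs = PyPDFLoader("syllabus.pdf").load()  # hypothetical file name
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# Embed the chunks and build a local vector index.
index = FAISS.from_documents(chunks, OpenAIEmbeddings())

# Answer questions by retrieving the most relevant chunks and passing them to the LLM.
qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=index.as_retriever())
print(qa.run("When is the final exam?"))
```

Because the index is built from a small, domain-specific document collection, answers stay grounded in that document rather than relying on a fully retrained general-purpose model.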

For more information, please view our whitepaper.
