A Primer to using LlamaIndex
At its core, LlamaIndex is a toolkit designed to easily connect LLMs with your external data. LlamaIndex provides the following (a minimal usage sketch follows this list):
- A set of data structures that allow you to index your data for various LLM tasks, and remove concerns over prompt size limitations.
- Data connectors to your common data sources (Google Docs, Slack, etc.).
- Cost transparency + tools that reduce cost while increasing performance.
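For instance, here is a minimal sketch of that basic flow: use a data connector to load documents, then build an index over them. It assumes a recent `llama-index` release where `SimpleDirectoryReader` and `VectorStoreIndex` are importable from `llama_index.core` (older releases expose them at the package root), and a local `./data` directory of files to ingest.

```python
# Minimal sketch (assumes llama_index.core imports; older versions
# expose these names directly under llama_index).
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Data connector: reads every file under ./data into Document objects.
documents = SimpleDirectoryReader("./data").load_data()

# Data structure: chunks the documents and builds a vector index over them,
# so queries are not constrained by the LLM's prompt size limit.
index = VectorStoreIndex.from_documents(documents)
```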
Each data structure offers distinct use cases and a variety of customizable parameters. These indices can then be queried in a general-purpose manner to accomplish any task that you would typically accomplish with an LLM (a query sketch follows this list):
- Question-Answering
- Summarization
- Text Generation (Stories, TODOs, emails, etc.)
- and more!
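As a sketch of what querying looks like, continuing from the index built above (exact class and method names depend on your `llama-index` version), the same query interface covers question-answering and summarization alike:

```python
# Querying is task-agnostic: phrase the task in natural language.
query_engine = index.as_query_engine()

# Question-answering over the indexed data.
answer = query_engine.query("What data connectors does the project support?")
print(answer)

# Summarization through the same interface.
summary = query_engine.query("Summarize these documents in three sentences.")
print(summary)
```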
The guides below are intended to help you get the most out of LlamaIndex. They give a high-level overview of the following:
- The general usage pattern of LlamaIndex
- Mapping Use Cases to LlamaIndex Data Structures
- How Each Index Works
- General Guides