Prompt Templates
These are the reference prompt templates.
We first show links to default prompts.
We then show the base prompt class, derived from LangChain.
Default Prompts
The list of default prompts can be found here.
NOTE: we've also curated a set of refine prompts for ChatGPT use cases. The list of ChatGPT refine prompts can be found here.
Prompts
Subclasses from base prompt.
- llama_index.prompts.prompts.KeywordExtractPrompt
Keyword extract prompt.
Prompt to extract keywords from a text text with a maximum of max_keywords keywords.
Required template variables: text, max_keywords
- llama_index.prompts.prompts.KnowledgeGraphPrompt
Define the knowledge graph triplet extraction prompt.
- llama_index.prompts.prompts.PandasPrompt
Pandas prompt. Convert a query to Python code.
Required template variables: query_str, df_str, instruction_str
- llama_index.prompts.prompts.QueryKeywordExtractPrompt
Query keyword extract prompt.
Prompt to extract keywords from a query query_str with a maximum of max_keywords keywords.
Required template variables: query_str, max_keywords
- llama_index.prompts.prompts.QuestionAnswerPrompt
Question Answer prompt.
Prompt to answer a question query_str given a context context_str.
Required template variables: context_str, query_str
- llama_index.prompts.prompts.RefinePrompt
Refine prompt.
Prompt to refine an existing answer existing_answer given a context context_msg and a query query_str.
Required template variables: query_str, existing_answer, context_msg
- llama_index.prompts.prompts.RefineTableContextPrompt
Refine Table context prompt.
Prompt to refine a table context given a table schema schema, as well as unstructured text context context_msg, and a task query_str. This includes both a high-level description of the table as well as a description of each column in the table.
- llama_index.prompts.prompts.SchemaExtractPrompt
Schema extract prompt.
Prompt to extract a schema from unstructured text text.
Required template variables: text, schema
- llama_index.prompts.prompts.SimpleInputPrompt
Simple Input prompt.
Required template variables: query_str
- llama_index.prompts.prompts.SummaryPrompt
Summary prompt.
Prompt to summarize the provided context context_str.
Required template variables: context_str
- llama_index.prompts.prompts.TableContextPrompt
Table context prompt.
Prompt to generate a table context given a table schema schema, as well as unstructured text context context_str, and a task query_str. This includes both a high-level description of the table as well as a description of each column in the table.
- llama_index.prompts.prompts.TextToSQLPrompt
Text to SQL prompt.
Prompt to translate a natural language query into SQL in the dialect dialect, given a schema schema.
Required template variables: query_str, schema, dialect
- llama_index.prompts.prompts.TreeInsertPrompt
Tree Insert prompt.
Prompt to insert a new chunk of text new_chunk_text into the tree index. More specifically, this prompt has the LLM select the relevant candidate child node to continue tree traversal.
Required template variables: num_chunks, context_list, new_chunk_text
- llama_index.prompts.prompts.TreeSelectMultiplePrompt
Tree select multiple prompt.
Prompt to select multiple candidate child nodes out of all child nodes provided in context_list, given a query query_str. branching_factor refers to the number of child nodes to select, and num_chunks is the number of child nodes in context_list.
Required template variables: num_chunks, context_list, query_str, branching_factor
- llama_index.prompts.prompts.TreeSelectPrompt
Tree select prompt.
Prompt to select a candidate child node out of all child nodes provided in context_list, given a query query_str. num_chunks is the number of child nodes in context_list.
Required template variables: num_chunks, context_list, query_str
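To make the "required template variables" concrete, here is a dependency-free sketch of how the variables for QuestionAnswerPrompt (context_str, query_str) and RefinePrompt (query_str, existing_answer, context_msg) slot into a template. The template strings below are made up for illustration, not the library's defaults; with llama_index installed you would pass such strings to the corresponding prompt class instead of calling str.format directly.

```python
# Hypothetical template strings; the placeholder names match the
# required template variables documented above.
QA_TEMPLATE = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the question: {query_str}\n"
)

REFINE_TEMPLATE = (
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine the answer using the new context below.\n"
    "{context_msg}\n"
)

# Filling every required variable produces the final prompt string.
qa_prompt = QA_TEMPLATE.format(
    context_str="Paris is the capital of France.",
    query_str="What is the capital of France?",
)
print(qa_prompt)
```

A template that omits one of its required variables at format time raises a KeyError, which is why each prompt class documents exactly which variables its template must contain.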
Base Prompt Class
Prompt class.
- class llama_index.prompts.Prompt(template: Optional[str] = None, langchain_prompt: Optional[BasePromptTemplate] = None, langchain_prompt_selector: Optional[ConditionalPromptSelector] = None, stop_token: Optional[str] = None, output_parser: Optional[BaseOutputParser] = None, prompt_type: str = PromptType.CUSTOM, metadata: Optional[Dict[str, Any]] = None, **prompt_kwargs: Any)
Prompt class for LlamaIndex.
- Wrapper around LangChain's prompt class. Adds ability to:
enforce certain prompt types
partially fill values
define stop token
- format(llm: Optional[BaseLanguageModel] = None, **kwargs: Any) → str
Format the prompt.
- classmethod from_langchain_prompt(prompt: BasePromptTemplate, **kwargs: Any) → Prompt
Load prompt from LangChain prompt.
- classmethod from_langchain_prompt_selector(prompt_selector: ConditionalPromptSelector, **kwargs: Any) → Prompt
Load prompt from LangChain prompt selector.
- classmethod from_prompt(prompt: Prompt, llm: Optional[BaseLanguageModel] = None, prompt_type: Optional[PromptType] = None) → Prompt
Create a prompt from an existing prompt.
Use case: If the existing prompt is already partially filled, and the remaining fields satisfy the requirements of the prompt class, then we can create a new prompt from the existing partially filled prompt.
- get_langchain_prompt(llm: Optional[BaseLanguageModel] = None) → BasePromptTemplate
Get langchain prompt.
- property original_template: str
Return the originally specified template, if supplied.
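The partial-fill behavior described above can be sketched in plain Python. This is a hypothetical, minimal stand-in (class and method names are invented for illustration), not the actual llama_index.prompts.Prompt implementation: it shows how a prompt can be created with some variables pre-filled and the rest supplied later at format() time.

```python
from string import Formatter
from typing import Any, List

class PromptSketch:
    """Illustrative stand-in for a prompt wrapper with partial filling.
    Not the library's actual implementation."""

    def __init__(self, template: str, **partial_kwargs: Any) -> None:
        self.template = template
        self.partial_kwargs = partial_kwargs  # values filled in ahead of time

    @property
    def input_variables(self) -> List[str]:
        # All {placeholders} in the template that are not yet filled.
        fields = {f for _, f, _, _ in Formatter().parse(self.template) if f}
        return sorted(fields - set(self.partial_kwargs))

    def partial_format(self, **kwargs: Any) -> "PromptSketch":
        # Return a new prompt with additional variables pre-filled.
        return PromptSketch(self.template, **{**self.partial_kwargs, **kwargs})

    def format(self, **kwargs: Any) -> str:
        # Merge pre-filled and late-supplied variables into the template.
        return self.template.format(**{**self.partial_kwargs, **kwargs})

# Usage: pre-fill context_str, supply query_str later.
qa = PromptSketch("Context: {context_str}\nQuestion: {query_str}\n")
qa_partial = qa.partial_format(context_str="Paris is the capital of France.")
print(qa_partial.input_variables)
print(qa_partial.format(query_str="What is the capital of France?"))
```

This mirrors the from_prompt use case above: once a prompt is partially filled, the remaining unfilled variables are what must satisfy the requirements of the target prompt class.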