Chat Engines

A chat engine is a high-level interface for having a conversation with your data (multiple back-and-forth exchanges instead of a single question and answer).

Chat Engine Implementations

Below we show specific chat engine implementations.

Chat Engine Types

class llama_index.chat_engine.types.BaseChatEngine

Base Chat Engine.

abstract async achat(message: str) → Union[Response, StreamingResponse]

Async version of main chat interface.

abstract chat(message: str) → Union[Response, StreamingResponse]

Main chat interface.

chat_repl() → None

Enter interactive chat REPL.

abstract reset() → None

Reset conversation state.
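To illustrate what a concrete implementation of this interface involves, here is a minimal sketch. The `EchoChatEngine` class and its echoing behavior are hypothetical stand-ins (not part of llama_index); the abstract base below mirrors the methods documented above rather than importing the real class.

```python
from abc import ABC, abstractmethod
from typing import List, Tuple


class BaseChatEngine(ABC):
    """Sketch mirroring the abstract interface above (not the real class)."""

    @abstractmethod
    def chat(self, message: str) -> str: ...

    @abstractmethod
    async def achat(self, message: str) -> str: ...

    @abstractmethod
    def reset(self) -> None: ...


class EchoChatEngine(BaseChatEngine):
    """Hypothetical engine that just echoes, but keeps conversation state."""

    def __init__(self) -> None:
        self.history: List[Tuple[str, str]] = []

    def chat(self, message: str) -> str:
        # Main chat interface: produce a response and record the turn.
        response = f"You said: {message!r} (turn {len(self.history) + 1})"
        self.history.append((message, response))
        return response

    async def achat(self, message: str) -> str:
        # Async version of the main chat interface; same logic here.
        return self.chat(message)

    def reset(self) -> None:
        # Reset conversation state by clearing the history.
        self.history.clear()


engine = EchoChatEngine()
first = engine.chat("hello")
second = engine.chat("again")
engine.reset()
```

The key point is that a chat engine, unlike a single-shot query engine, carries conversation state across calls, which is why `reset()` is part of the interface.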

class llama_index.chat_engine.types.ChatMode(value)

Chat Engine Modes.

CONDENSE_QUESTION = 'condense_question'

Corresponds to CondenseQuestionChatEngine.

First generate a standalone question from the conversation context and the last message, then query the query engine for a response.
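The two-step flow can be sketched as follows. Here `fake_llm` and `fake_query_engine` are illustrative stand-ins for a real LLM and query engine (they are not llama_index APIs), scripted to handle one canned exchange:

```python
def fake_llm(prompt: str) -> str:
    # A real LLM would rewrite the follow-up into a standalone question;
    # this stub simulates that rewrite for the canned example below.
    if "population" in prompt and "Paris" in prompt:
        return "What is the population of Paris?"
    return prompt.splitlines()[-1]


def fake_query_engine(question: str) -> str:
    # Stand-in for a single-shot query engine over your data.
    answers = {"What is the population of Paris?": "About 2.1 million."}
    return answers.get(question, "I don't know.")


def condense_question_chat(history: list, message: str) -> str:
    # Step 1: condense conversation context + last message into a
    # standalone question.
    context = "\n".join(f"{role}: {text}" for role, text in history)
    prompt = (
        "Rewrite the follow-up as a standalone question.\n"
        f"{context}\n{message}"
    )
    standalone = fake_llm(prompt)
    # Step 2: send the standalone question to the query engine.
    answer = fake_query_engine(standalone)
    history.append(("user", message))
    history.append(("assistant", answer))
    return answer


history = [("user", "Tell me about Paris."),
           ("assistant", "Paris is the capital of France.")]
answer = condense_question_chat(history, "What is its population?")
```

The benefit of this mode is that the query engine itself stays stateless: all conversational context is folded into the standalone question.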

REACT = 'react'

Corresponds to ReActChatEngine.

Use a ReAct agent loop with query engine tools. Implemented via a LangChain agent.
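A toy version of such a loop is sketched below. The scripted `fake_agent_llm` and the `query_engine_tool` dictionary are illustrative assumptions, not the LangChain agent the real engine uses; a real agent LLM would generate the Thought/Action/Answer steps itself:

```python
def query_engine_tool(query: str) -> str:
    # Stand-in for a query engine wrapped as an agent tool.
    facts = {"capital of France": "Paris"}
    return facts.get(query, "no result")


def fake_agent_llm(scratchpad: str) -> str:
    # Scripted stand-in for the agent LLM: first call the tool,
    # then answer once an observation is available.
    if "Observation:" not in scratchpad:
        return "Action: query_engine[capital of France]"
    return "Answer: The capital of France is Paris."


def react_chat(message: str, max_steps: int = 3) -> str:
    scratchpad = f"Question: {message}"
    for _ in range(max_steps):
        step = fake_agent_llm(scratchpad)
        if step.startswith("Answer:"):
            return step[len("Answer:"):].strip()
        # Parse "Action: tool[input]", run the tool, append the observation.
        tool_input = step[step.index("[") + 1:step.index("]")]
        observation = query_engine_tool(tool_input)
        scratchpad += f"\n{step}\nObservation: {observation}"
    return "Gave up."


result = react_chat("What is the capital of France?")
```

Unlike the condense-question mode, the agent decides at each step whether to call a tool or to answer, so it can take several retrieval steps per message.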

SIMPLE = 'simple'

Corresponds to SimpleChatEngine.

Chat with the LLM directly, without making use of a knowledge base.
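The simple mode reduces to calling the LLM with the running chat history. A minimal sketch, where `fake_llm` and the `SimpleChat` class are hypothetical stand-ins rather than llama_index APIs:

```python
def fake_llm(messages: list) -> str:
    # Stand-in for a real LLM call: reports which user turn it is on.
    user_turns = sum(1 for role, _ in messages if role == "user")
    return f"(reply to user turn {user_turns})"


class SimpleChat:
    """Hypothetical minimal engine: no knowledge base, only chat history."""

    def __init__(self) -> None:
        self.messages: list = []

    def chat(self, message: str) -> str:
        # Append the user message, call the LLM on the full history.
        self.messages.append(("user", message))
        reply = fake_llm(self.messages)
        self.messages.append(("assistant", reply))
        return reply

    def reset(self) -> None:
        # Reset conversation state.
        self.messages.clear()


engine = SimpleChat()
r1 = engine.chat("hi")
r2 = engine.chat("how are you?")
```

Because no retrieval happens, this mode is useful for plain conversation or for prompts whose context fits entirely in the message history.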