LLM Predictors
- pydantic model llama_index.llm_predictor.LLMPredictor
LLM predictor class.
A lightweight wrapper on top of LLMs that handles:
- conversion of prompts to the string input format expected by LLMs
- logging of prompts and responses to a callback manager
NOTE: Mostly keeping around for legacy reasons. A potential future path is to deprecate this class and move all functionality into the LLM class.
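To make the wrapper pattern concrete, here is a minimal, self-contained sketch of what the description above says the class does: format a prompt template into the string an LLM expects, call the LLM, and log the prompt/response pair. All names here (`SimplePromptTemplate`, `SketchPredictor`) are hypothetical stand-ins, not the actual llama_index implementation.

```python
# Hedged sketch of the LLMPredictor wrapper pattern (stand-in names, not
# llama_index code): template -> string, call LLM, log prompt and response.
from typing import Any, Callable, List, Tuple


class SimplePromptTemplate:
    """Stand-in for BasePromptTemplate: a str.format-style template."""

    def __init__(self, template: str) -> None:
        self.template = template

    def format(self, **kwargs: Any) -> str:
        return self.template.format(**kwargs)


class SketchPredictor:
    """Stand-in for LLMPredictor: wraps a string-in/string-out LLM callable."""

    def __init__(self, llm: Callable[[str], str]) -> None:
        self.llm = llm
        self.log: List[Tuple[str, str]] = []  # (prompt, response) pairs

    def predict(self, prompt: SimplePromptTemplate, **prompt_args: Any) -> str:
        formatted = prompt.format(**prompt_args)  # template -> string input
        response = self.llm(formatted)            # call the underlying LLM
        self.log.append((formatted, response))    # "callback manager" logging
        return response


# Usage with a fake echo LLM (no network calls):
predictor = SketchPredictor(llm=lambda text: f"answer to: {text}")
out = predictor.predict(SimplePromptTemplate("Summarize {topic}."), topic="LLMs")
print(out)  # answer to: Summarize LLMs.
```

The real class delegates logging to a `CallbackManager` rather than a plain list, but the control flow is the same shape.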
Show JSON schema
{ "title": "LLMPredictor", "description": "LLM predictor class.\n\nA lightweight wrapper on top of LLMs that handles:\n- conversion of prompts to the string input format expected by LLMs\n- logging of prompts and responses to a callback manager\n\nNOTE: Mostly keeping around for legacy reasons. A potential future path is to\ndeprecate this class and move all functionality into the LLM class.", "type": "object", "properties": { "system_prompt": { "title": "System Prompt", "type": "string" }, "query_wrapper_prompt": { "title": "Query Wrapper Prompt" } } }
- Config
arbitrary_types_allowed: bool = True
- Fields
query_wrapper_prompt (Optional[llama_index.prompts.base.BasePromptTemplate])
system_prompt (Optional[str])
- field query_wrapper_prompt: Optional[BasePromptTemplate] = None
- field system_prompt: Optional[str] = None
- async apredict(prompt: BasePromptTemplate, **prompt_args: Any) → str
Async predict.
- async astream(prompt: BasePromptTemplate, **prompt_args: Any) → AsyncGenerator[str, None]
Async stream.
- classmethod class_name() → str
Get class name.
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model
Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model
Duplicate a model, optionally choosing which fields to include, exclude, and change.
- Parameters
include – fields to include in the new model
exclude – fields to exclude from the new model; as with values, this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model; you should trust this data
deep – set to True to make a deep copy of the model
- Returns
new model instance
- dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- classmethod from_dict(data: Dict[str, Any], **kwargs: Any) → Self
- classmethod from_json(data_str: str, **kwargs: Any) → Self
- classmethod from_orm(obj: Any) → Model
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
- classmethod parse_obj(obj: Any) → Model
- classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
- predict(prompt: BasePromptTemplate, **prompt_args: Any) → str
Predict.
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode
- stream(prompt: BasePromptTemplate, **prompt_args: Any) → Generator[str, None, None]
Stream.
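The `Generator[str, None, None]` return type above means the caller receives the response incrementally rather than as one string. A hedged, stdlib-only sketch of that contract (the streaming "LLM" below is a stand-in, not llama_index code):

```python
# Sketch of the stream() contract: format the prompt, then yield the response
# chunk by chunk as the underlying (fake) streaming LLM produces it.
from typing import Any, Callable, Generator, Iterable


def stream_sketch(llm_stream: Callable[[str], Iterable[str]],
                  template: str,
                  **prompt_args: Any) -> Generator[str, None, None]:
    formatted = template.format(**prompt_args)
    for token in llm_stream(formatted):  # forward each chunk as it arrives
        yield token


# Fake streaming LLM that emits one word at a time:
fake_stream = lambda text: iter(text.upper().split())
tokens = list(stream_sketch(fake_stream, "hello {name}", name="world"))
print(tokens)  # ['HELLO', 'WORLD']
```

Consumers typically iterate the generator and render tokens as they arrive instead of collecting them into a list as done here for illustration.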
- to_dict(**kwargs: Any) → Dict[str, Any]
- to_json(**kwargs: Any) → str
- classmethod update_forward_refs(**localns: Any) → None
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- classmethod validate(value: Any) → Model
- property callback_manager: CallbackManager
Get callback manager.
- property metadata: LLMMetadata
Get LLM metadata.
- pydantic model llama_index.llm_predictor.StructuredLLMPredictor
Structured LLM predictor class.
- Parameters
llm_predictor (BaseLLMPredictor) – LLM Predictor to use.
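A "structured" predictor implies the raw LLM text is post-processed into structured data, typically via an output parser attached to the prompt. The following is a stdlib-only sketch of that idea under that assumption; `ParsingTemplate`, `structured_predict`, and the JSON-speaking fake LLM are hypothetical, not the llama_index implementation.

```python
# Hedged sketch: predict, then run the raw LLM text through an output parser
# attached to the prompt (here json.loads) to get structured data back.
import json
from typing import Any, Callable, Optional


class ParsingTemplate:
    """Stand-in prompt template carrying an optional output parser."""

    def __init__(self, template: str,
                 output_parser: Optional[Callable[[str], Any]] = None) -> None:
        self.template = template
        self.output_parser = output_parser

    def format(self, **kwargs: Any) -> str:
        return self.template.format(**kwargs)


def structured_predict(llm: Callable[[str], str],
                       prompt: ParsingTemplate, **prompt_args: Any) -> Any:
    raw = llm(prompt.format(**prompt_args))
    if prompt.output_parser is not None:
        return prompt.output_parser(raw)  # e.g. parse JSON into a dict
    return raw  # no parser: fall back to the plain string


# Fake LLM that happens to answer in JSON (no network calls):
fake_llm = lambda text: '{"title": "Demo", "score": 3}'
prompt = ParsingTemplate("Describe {thing} as JSON.", output_parser=json.loads)
result = structured_predict(fake_llm, prompt, thing="a demo")
print(result["title"])  # Demo
```

Note that the real `predict`/`apredict` methods below are typed as returning `str`; the parsed-object return here is only to illustrate the parsing step.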
Show JSON schema
{ "title": "StructuredLLMPredictor", "description": "Structured LLM predictor class.\n\nArgs:\n llm_predictor (BaseLLMPredictor): LLM Predictor to use.", "type": "object", "properties": { "system_prompt": { "title": "System Prompt", "type": "string" }, "query_wrapper_prompt": { "title": "Query Wrapper Prompt" } } }
- Config
arbitrary_types_allowed: bool = True
- Fields
query_wrapper_prompt (Optional[llama_index.prompts.base.BasePromptTemplate])
system_prompt (Optional[str])
- field query_wrapper_prompt: Optional[BasePromptTemplate] = None
- field system_prompt: Optional[str] = None
- async apredict(prompt: BasePromptTemplate, **prompt_args: Any) → str
Async predict the answer to a query.
- Parameters
prompt (BasePromptTemplate) – BasePromptTemplate to use for prediction.
- Returns
The predicted answer.
- Return type
str
- async astream(prompt: BasePromptTemplate, **prompt_args: Any) → AsyncGenerator[str, None]
Async stream.
- classmethod class_name() → str
Get class name.
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model
Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters
include – fields to include in the new model
exclude – fields to exclude from the new model; as with values, this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model; you should trust this data
deep – set to True to make a deep copy of the model
- Returns
new model instance
- dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- classmethod from_dict(data: Dict[str, Any], **kwargs: Any) → Self
- classmethod from_json(data_str: str, **kwargs: Any) → Self
- classmethod from_orm(obj: Any) → Model
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
- classmethod parse_obj(obj: Any) → Model
- classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
- predict(prompt: BasePromptTemplate, **prompt_args: Any) → str
Predict the answer to a query.
- Parameters
prompt (BasePromptTemplate) – BasePromptTemplate to use for prediction.
- Returns
The predicted answer.
- Return type
str
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode
- stream(prompt: BasePromptTemplate, **prompt_args: Any) → Generator[str, None, None]
Stream the answer to a query.
NOTE: this is a beta feature. Will try to build or use better abstractions about response handling.
- Parameters
prompt (BasePromptTemplate) – BasePromptTemplate to use for prediction.
- Returns
The predicted answer.
- Return type
str
- to_dict(**kwargs: Any) → Dict[str, Any]
- to_json(**kwargs: Any) → str
- classmethod update_forward_refs(**localns: Any) → None
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- classmethod validate(value: Any) → Model
- property callback_manager: CallbackManager
Get callback manager.
- property metadata: LLMMetadata
Get LLM metadata.