Open In Colab

Fireworks#

If you’re opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

%pip install llama-index-llms-fireworks
%pip install llama-index
Note: you may need to restart the kernel to use updated packages.
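By default, the Fireworks integration reads its API key from the FIREWORKS_API_KEY environment variable (you can also pass api_key directly, as shown later in this notebook). A minimal setup sketch, assuming you have already generated a key at fireworks.ai:

import os

# Assumption: you have created an API key in the Fireworks console
os.environ["FIREWORKS_API_KEY"] = "your-api-key"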

Basic Usage#

Call complete with a prompt#

from llama_index.llms.fireworks import Fireworks

resp = Fireworks().complete("Paul Graham is ")
print(resp)
Graham is a well-known essayist, programmer, and startup entrepreneur. He is best known for co-founding the venture capital firm, Y Combinator, which has funded and supported numerous successful startups, including Dropbox, Airbnb, and Reddit. Prior to Y Combinator, Graham co-founded Viaweb, one of the first software-as-a-service (SaaS) companies, which was later acquired by Yahoo.

Graham is also known for his influential essays on startups, programming, and personal productivity, which he publishes on his website, www.paulgraham.com. His writing is widely read and respected in the tech and startup communities, and he has been credited with helping to shape the way many people think about startups and entrepreneurship.

In addition to his work in technology and venture capital, Graham has a strong interest in education and has written extensively about the need for reform in the American education system. He is a graduate of Cornell University and holds a Ph.D. in computer science from Harvard University.
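complete returns a CompletionResponse; printing it shows the generated text, which you can also read programmatically. A small sketch:

resp = Fireworks().complete("Paul Graham is ")
# The generated text is available on the response's .text attribute
print(resp.text)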

Call chat with a list of messages#

from llama_index.core.llms import ChatMessage
from llama_index.llms.fireworks import Fireworks

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = Fireworks().chat(messages)
print(resp)
assistant: Arr matey, ye be askin' for me name? Well, I be known as Captain Redbeard the Terrible! Me crew says I'm a right colorful character, with me flamboyant red beard and me love for all things bright and bold. So hoist the Jolly Roger and let's set sail for adventure, yarr!
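chat returns a ChatResponse; if you want just the assistant's reply rather than the role-prefixed string, you can read it from the underlying message. A small sketch:

# The assistant's reply text is available on the response's message
print(resp.message.content)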

Streaming#

Using stream_complete endpoint

from llama_index.llms.fireworks import Fireworks

llm = Fireworks()
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
a well-known essayist, programmer, and venture capitalist. He is best known for co-founding the startup incubator and venture capital firm, Y Combinator, which has funded and helped grow many successful tech companies including Dropbox, Airbnb, and Reddit. Graham is also known for his influential essays on startups, technology, and programming, which he publishes on his website. Prior to his work in venture capital, Graham was a successful programmer and entrepreneur, co-founding the company Viaweb (later acquired by Yahoo) which developed the first web-based application for building online stores.
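Each streamed chunk exposes the incremental text on .delta; to reassemble the full completion as a single string, you can accumulate the deltas yourself. A minimal sketch:

full_text = ""
for r in llm.stream_complete("Paul Graham is "):
    full_text += r.delta  # append each incremental chunk
print(full_text)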

Using stream_chat endpoint

from llama_index.llms.fireworks import Fireworks
from llama_index.core.llms import ChatMessage

llm = Fireworks()
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
Arr matey, ye be askin' for me name? Well, I be known as Captain Redbeard the Terrible! Me crew says I'm a bit of a character, always crackin' me jokes and singin' shanties. But don't let me fool ya, I be a fierce pirate, feared by all who sail the seven seas!

Configure Model#

from llama_index.llms.fireworks import Fireworks

llm = Fireworks(model="accounts/fireworks/models/firefunction-v1")
resp = llm.complete("Paul Graham is ")
print(resp)
Paul Graham is an English-American computer scientist, entrepreneur, venture capitalist, author, and blogger. He is known for co-founding the web-based application platform Viaweb, which was acquired by Yahoo! in 1998. He is also the founder of the startup accelerator Y Combinator.

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.chat(messages)
print(resp)
assistant: My name is Captain Redbeard, but you can call me Red for short.
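Beyond the model name, the constructor also accepts the usual sampling parameters. A hedged sketch, assuming the OpenAI-style keyword arguments the Fireworks integration builds on (temperature, max_tokens):

llm = Fireworks(
    model="accounts/fireworks/models/firefunction-v1",
    temperature=0.1,  # assumed: lower temperature for more deterministic output
    max_tokens=256,  # assumed: cap on the number of generated tokens
)
resp = llm.complete("Paul Graham is ")
print(resp)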

Set API Key at a per-instance level#

If desired, you can have separate LLM instances use separate API keys.

from llama_index.llms.fireworks import Fireworks

llm = Fireworks(
    model="accounts/fireworks/models/firefunction-v1", api_key="BAD_KEY"
)
# This instance carries an invalid key, so calling llm.complete(...) would fail
# with an authentication error. The default-constructed client below falls back
# to the FIREWORKS_API_KEY environment variable, which is why it still succeeds.
resp = Fireworks().complete("Paul Graham is ")
print(resp)
a well-known essayist, programmer, and venture capitalist. He is best known for co-founding the startup incubator and venture capital firm, Y Combinator, which has funded and helped grow many successful tech companies including Dropbox, Airbnb, and Reddit. Graham is also known for his influential essays on startups, technology, and programming, which he publishes on his website, www.paulgraham.com. Prior to his work in venture capital, Graham was a successful entrepreneur and programmer, co-founding the company Viaweb (later acquired by Yahoo) and writing the programming language Arc.
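To see the per-instance key actually being exercised, call the instance that was built with the bad key. A hedged sketch (the exact exception type depends on the underlying client, so a broad except is used here):

try:
    llm.complete("Paul Graham is ")
except Exception as e:
    # Expected: an authentication error, since this instance was given BAD_KEY
    print(f"Request failed as expected: {e}")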