## llamafile

llamafile lets you distribute and run LLMs with a single file. Read more in the llamafile documentation.

```shell
# Set the LLM service to llamafile
poetry run 01 --llm-service llamafile
```
## LlamaEdge

LlamaEdge makes it easy to run LLM inference apps and create OpenAI-compatible API services for the Llama 2 series of LLMs locally. Read more in the LlamaEdge documentation.

```shell
# Set the LLM service to LlamaEdge
poetry run 01 --llm-service llamaedge
```
## Hosted Models

01OS leverages LiteLLM, which supports many hosted models. To select your provider:

```shell
# Set the LLM service (e.g. OpenAI)
poetry run 01 --llm-service openai
```
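Hosted providers require credentials, which LiteLLM reads from standard environment variables. A minimal sketch, assuming the OpenAI service (the variable name differs per provider; `OPENAI_API_KEY` is the OpenAI convention):

```shell
# Export the provider's API key before launching; LiteLLM picks up
# standard variables such as OPENAI_API_KEY for OpenAI.
export OPENAI_API_KEY="sk-..."   # replace with your actual key

# Then start 01 with the hosted service selected
poetry run 01 --llm-service openai
```

Other providers follow the same pattern with their own variables (for example, `ANTHROPIC_API_KEY` for Anthropic).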
## Other Models

More instructions coming soon!