Run Server
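To start the server, run the CLI with the `--server` flag. This is a minimal sketch; it assumes the package's entry point is exposed as `01` and invoked through Poetry, which may differ between releases:

```bash
# Start the 01 server on the default host and port (0.0.0.0:10001).
poetry run 01 --server
```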
Configure
A core part of the 01 server is the interpreter, which is an instance of Open Interpreter. Open Interpreter is highly configurable, and configuring it only requires updating a single file. `model`, `context_window`, and many other settings can be updated there.
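Many of the same interpreter settings are also exposed as CLI flags (see the full list under CLI Flags below). As a sketch, assuming the `poetry run 01` entry point, the following overrides the model and generation settings at launch, using the documented default values:

```bash
# Override interpreter settings from the command line instead of the config file.
poetry run 01 --model gpt-4 --context-window 2048 --max-tokens 4096 --temperature 0.8
```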
LLM service provider
If you wish to use a local model, you can use the `--llm-service` flag:
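For example (the service name below is illustrative and assumes a local LLM service such as `llamafile` is available in your install; `litellm` is the default):

```bash
# Swap the default litellm service for a local LLM service.
poetry run 01 --llm-service llamafile
```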
Voice Interface
Both speech-to-text and text-to-speech can be configured in 01OS. Pass the `--tts-service` and/or `--stt-service` CLI flags with the desired service provider to swap out services.
These service providers can be found in `/services/stt` and `/services/tts`.
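As a sketch, assuming the same `poetry run 01` entry point, the command below passes the documented defaults explicitly; replace either value with another provider from `/services/stt` or `/services/tts` to swap the implementation:

```bash
# Explicitly select the speech-to-text and text-to-speech providers.
poetry run 01 --stt-service openai --tts-service openai
```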
For more information, please read about speech-to-text and text-to-speech.
CLI Flags
- `--server`: Run server.
- `--server-host TEXT`: Specify the server host where the server will deploy. Default: `0.0.0.0`.
- `--server-port INTEGER`: Specify the server port where the server will deploy. Default: `10001`.
- `--tunnel-service TEXT`: Specify the tunnel service. Default: `ngrok`.
- `--expose`: Expose server to internet.
- `--server-url TEXT`: Specify the server URL that the client should expect. Defaults to server-host and server-port. Default: `None`.
- `--llm-service TEXT`: Specify the LLM service. Default: `litellm`.
- `--model TEXT`: Specify the model. Default: `gpt-4`.
- `--llm-supports-vision`: Specify if the LLM service supports vision.
- `--llm-supports-functions`: Specify if the LLM service supports functions.
- `--context-window INTEGER`: Specify the context window size. Default: `2048`.
- `--max-tokens INTEGER`: Specify the maximum number of tokens. Default: `4096`.
- `--temperature FLOAT`: Specify the temperature for generation. Default: `0.8`.
- `--tts-service TEXT`: Specify the TTS service. Default: `openai`.
- `--stt-service TEXT`: Specify the STT service. Default: `openai`.
- `--local`: Use recommended local services for LLM, STT, and TTS.
- `--install-completion [bash|zsh|fish|powershell|pwsh]`: Install completion for the specified shell. Default: `None`.
- `--show-completion [bash|zsh|fish|powershell|pwsh]`: Show completion for the specified shell, to copy it or customize the installation. Default: `None`.
- `--help`: Show this message and exit.
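Putting a few of these together, a hypothetical invocation (again assuming the `poetry run 01` entry point) that exposes the server to the internet over the default `ngrok` tunnel while using the recommended local services might look like:

```bash
# Expose the server publicly and run local LLM, STT, and TTS services.
poetry run 01 --server --server-port 10001 --expose --tunnel-service ngrok --local
```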