

How to Integrate Open Interpreter with Jan

Open Interpreter lets LLMs run code (Python, JavaScript, Shell, and more) locally. After installing, you can chat with Open Interpreter through a ChatGPT-like interface in your terminal by running interpreter. To integrate Open Interpreter with Jan, follow the steps below:

Step 1: Install Open Interpreter

  1. Install Open Interpreter by running:
pip install open-interpreter
  2. A Rust compiler is required to install Open Interpreter. If it is not already installed, run the following command (on Debian/Ubuntu-based systems) or go to this page if you are running on Windows:
sudo apt install rustc
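
As an optional sanity check, you can confirm that the package installed correctly and that the interpreter command is on your PATH; the exact output will vary by version and machine:
pip show open-interpreter
interpreter --help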

Step 2: Configure Jan's Local API Server

Before using Open Interpreter, configure the model you want to use in Jan under Settings > My Model and activate Jan's local API server.

Enabling Jan API Server

  1. Click the <> button to access the Local API Server section in Jan.

  2. Configure the server settings, including the IP address and port, Cross-Origin Resource Sharing (CORS), and Verbose Server Logs.

  3. Click Start Server.
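
Once the server is running, you can verify it is reachable before pointing Open Interpreter at it. The check below is a minimal sketch that assumes the default address and port (localhost:1337) and that Jan exposes the OpenAI-compatible models endpoint; adjust the host and port if you changed them in the server settings:
curl http://localhost:1337/v1/models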

Step 3: Set Up the Open Interpreter Environment

  1. To integrate with Jan, provide the API Base (http://localhost:1337/v1) and the model ID (e.g., mistral-ins-7b-q4) when running Open Interpreter. For example, see the command below:
interpreter --api_base http://localhost:1337/v1 --model mistral-ins-7b-q4
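
If Open Interpreter cannot reach the model, it can help to test the same API base and model ID directly against Jan first. The request below is a minimal sketch that assumes Jan's server exposes the standard OpenAI-compatible chat completions endpoint and that mistral-ins-7b-q4 is a model you have downloaded; substitute your own model ID as needed:
curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistral-ins-7b-q4", "messages": [{"role": "user", "content": "Hello"}]}'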

Open Interpreter is now ready for use!