
How to Integrate Mistral AI with Jan

Mistral AI provides two ways to use their Large Language Models (LLMs):

  1. The API
  2. Open-source models on Hugging Face

To integrate Jan with Mistral AI, follow the steps below:

note

This tutorial demonstrates integrating Mistral AI with Jan using the API.

Step 1: Configure Mistral API Key

  1. Obtain a Mistral API key from your Mistral dashboard.
  2. Insert the Mistral API key into ~/jan/engines/openai.json:
~/jan/engines/openai.json
{
  "full_url": "https://api.mistral.ai/v1/chat/completions",
  "api_key": "<your-mistral-ai-api-key>"
}
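If you prefer to script this step, the sketch below writes the same engine config from Python. The `jan_home` default assumes Jan's standard data folder (`~/jan`); adjust it if your installation differs.

```python
import json
from pathlib import Path


def write_engine_config(api_key: str,
                        jan_home: Path = Path.home() / "jan") -> Path:
    """Write jan_home/engines/openai.json pointing Jan at Mistral's API."""
    engines_dir = jan_home / "engines"
    engines_dir.mkdir(parents=True, exist_ok=True)
    config = {
        # Mistral's OpenAI-compatible chat completions endpoint
        "full_url": "https://api.mistral.ai/v1/chat/completions",
        "api_key": api_key,
    }
    path = engines_dir / "openai.json"
    path.write_text(json.dumps(config, indent=2))
    return path
```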

Step 2: Model Configuration

  1. Navigate to ~/jan/models.
  2. Create a folder named mistral-(modelname) (e.g., mistral-tiny).
  3. Inside, create a model.json file with these settings:
    • Set id to the Mistral AI model ID.
    • Set format to api.
    • Set engine to openai.
    • Set state to ready.
~/jan/models/mistral-tiny/model.json
{
  "sources": [
    {
      "filename": "mistral-tiny",
      "url": "https://mistral.ai/"
    }
  ],
  "id": "mistral-tiny",
  "object": "model",
  "name": "Mistral-7B-v0.2 (Tiny Endpoint)",
  "version": "1.0",
  "description": "Currently powered by Mistral-7B-v0.2, a better fine-tuning of the initial Mistral-7B released, inspired by the fantastic work of the community.",
  "format": "api",
  "settings": {},
  "parameters": {},
  "metadata": {
    "author": "Mistral AI",
    "tags": ["General", "Big Context Length"]
  },
  "engine": "openai"
}
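As a scripted alternative, the sketch below scaffolds the model folder and model.json shown above. The `jan_home` default and the manifest field values mirror the example; `scaffold_model` is a hypothetical helper, and you can pass any Mistral model ID (e.g. mistral-small) as `model_id`.

```python
import json
from pathlib import Path


def scaffold_model(model_id: str, display_name: str,
                   jan_home: Path = Path.home() / "jan") -> Path:
    """Create jan_home/models/<model_id>/model.json for an API-backed model."""
    model_dir = jan_home / "models" / model_id
    model_dir.mkdir(parents=True, exist_ok=True)
    manifest = {
        "sources": [{"filename": model_id, "url": "https://mistral.ai/"}],
        "id": model_id,          # must match the Mistral AI model ID
        "object": "model",
        "name": display_name,
        "version": "1.0",
        "description": f"Mistral AI {model_id} model, served over the API.",
        "format": "api",         # API-backed, nothing downloaded locally
        "settings": {},
        "parameters": {},
        "metadata": {"author": "Mistral AI", "tags": ["General"]},
        "engine": "openai",      # reuses Jan's OpenAI-compatible engine
    }
    path = model_dir / "model.json"
    path.write_text(json.dumps(manifest, indent=2))
    return path
```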

note
  • For more details regarding the model.json settings and parameters fields, see the model reference in the Jan documentation.
  • Mistral AI offers several endpoints. Refer to their endpoint documentation to select the one that fits your requirements; this guide uses the mistral-tiny model as an example.
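Before restarting Jan, you can optionally sanity-check the key by sending one request straight to the Mistral chat completions endpoint. The sketch below builds such a request with only the standard library; the live call runs only if a MISTRAL_API_KEY environment variable is set, and no expected output is shown since the reply depends on the model.

```python
import json
import os
import urllib.request


def build_request(api_key: str,
                  model: str = "mistral-tiny") -> urllib.request.Request:
    """Build a one-message chat completions request for Mistral's API."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "Say hello."}],
    }
    return urllib.request.Request(
        "https://api.mistral.ai/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )


# Fires a real request only when a key is present in the environment.
if os.environ.get("MISTRAL_API_KEY"):
    req = build_request(os.environ["MISTRAL_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
        print(body["choices"][0]["message"]["content"])
```

A 200 response confirms the key and endpoint are valid; a 401 means the key in openai.json needs correcting.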

Step 3: Start the Model

  1. Restart Jan and navigate to the Hub.
  2. Locate your model and click the Use button.