# Mistral AI

[Mistral AI](https://mistral.ai/) is a research lab building the best open source models in the world.

Mistral AI offers both premier models and free models, driving innovation and convenience for the developer community. Mistral AI models are state-of-the-art in multilingual support, code generation, math, and advanced reasoning.
## Installation

Mistral support is provided as an optional dependency in Strands Agents. To install it, run:

```bash
pip install 'strands-agents[mistral]'
```
## Usage

After installing the `mistral` dependency, you can import and initialize Strands Agents' Mistral provider as follows:

```python
from strands import Agent
from strands.models.mistral import MistralModel
from strands_tools import calculator

model = MistralModel(
    api_key="<YOUR_MISTRAL_API_KEY>",
    # **model_config
    model_id="mistral-large-latest",
)

agent = Agent(model=model, tools=[calculator])
response = agent("What is 2+2?")
print(response)
```
## Configuration

### Client Configuration

The `client_args` configure the underlying Mistral client. You can pass additional arguments to customize the client behavior:

```python
model = MistralModel(
    api_key="<YOUR_MISTRAL_API_KEY>",
    client_args={
        "timeout": 30,
        # Additional client configuration options
    },
    model_id="mistral-large-latest",
)
```

For a complete list of available client arguments, refer to the Mistral AI [documentation](https://docs.mistral.ai/).
### Model Configuration

The `model_config` configures the underlying model selected for inference. The supported configurations are:

| Parameter | Description | Example | Options |
|-----------|-------------|---------|---------|
| `model_id` | ID of a Mistral model to use | `mistral-large-latest` | [reference](https://docs.mistral.ai/getting-started/models/) |
| `max_tokens` | Maximum number of tokens to generate in the response | `1000` | Positive integer |
| `temperature` | Controls randomness in generation | `0.7` | Float between 0.0 and 1.0 |
| `top_p` | Controls diversity via nucleus sampling | `0.9` | Float between 0.0 and 1.0 |
| `stream` | Whether to enable streaming responses | `true` | `true` or `false` |

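Putting the table together, the supported `model_config` parameters can be passed directly to the constructor. The values below are the examples from the table, shown for illustration rather than as recommended settings:

```python
from strands.models.mistral import MistralModel

# All supported model_config parameters in one place; the values mirror
# the examples in the table above and are not tuning recommendations.
model = MistralModel(
    api_key="<YOUR_MISTRAL_API_KEY>",
    model_id="mistral-large-latest",
    max_tokens=1000,
    temperature=0.7,
    top_p=0.9,
    stream=True,
)
```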
## Environment Variables

You can set your Mistral API key as an environment variable instead of passing it directly:

```bash
export MISTRAL_API_KEY="your_api_key_here"
```

Then initialize the model without the API key parameter:

```python
model = MistralModel(model_id="mistral-large-latest")
```

## Troubleshooting

### Module Not Found

If you encounter the error `ModuleNotFoundError: No module named 'mistralai'`, this means you haven't installed the `mistral` dependency in your environment. To fix this, run `pip install 'strands-agents[mistral]'`.
## References

- [API Reference](../../../api-reference/models.md)
- [Mistral AI Documentation](https://docs.mistral.ai/)