Commit 40b3f49 (2 parents: 46df3a0 + ec650ea)

Merge pull request #836 from tom-doerr/patch-3
Fix variable names in 1-language_models.md

File tree: 1 file changed, +4 −4 lines changed


docs/docs/building-blocks/1-language_models.md

````diff
@@ -145,25 +145,25 @@ You need to host these models on your own GPU(s). Below, we include pointers for
 1. `dspy.HFClientTGI`: for HuggingFace models through the Text Generation Inference (TGI) system. [Tutorial: How do I install and launch the TGI server?](https://dspy-docs.vercel.app/docs/deep-dive/language_model_clients/local_models/HFClientTGI)
 
 ```python
-tgi_llama2 = dspy.HFClientTGI(model="mistralai/Mistral-7B-Instruct-v0.2", port=8080, url="http://localhost")
+tgi_mistral = dspy.HFClientTGI(model="mistralai/Mistral-7B-Instruct-v0.2", port=8080, url="http://localhost")
 ```
 
 2. `dspy.HFClientVLLM`: for HuggingFace models through vLLM. [Tutorial: How do I install and launch the vLLM server?](https://dspy-docs.vercel.app/docs/deep-dive/language_model_clients/local_models/HFClientVLLM)
 
 ```python
-vllm_llama2 = dspy.HFClientVLLM(model="mistralai/Mistral-7B-Instruct-v0.2", port=8080, url="http://localhost")
+vllm_mistral = dspy.HFClientVLLM(model="mistralai/Mistral-7B-Instruct-v0.2", port=8080, url="http://localhost")
 ```
 
 3. `dspy.HFModel` (experimental) [Tutorial: How do I initialize models using HFModel](https://dspy-docs.vercel.app/api/local_language_model_clients/HFModel)
 
 ```python
-llama = dspy.HFModel(model = 'mistralai/Mistral-7B-Instruct-v0.2')
+mistral = dspy.HFModel(model = 'mistralai/Mistral-7B-Instruct-v0.2')
 ```
 
 4. `dspy.Ollama` (experimental) for open source models through [Ollama](https://ollama.com). [Tutorial: How do I install and use Ollama on a local computer?](https://dspy-docs.vercel.app/api/local_language_model_clients/Ollama)
 
 ```python
-mistral_ollama = dspy.OllamaLocal(model='mistral')
+ollama_mistral = dspy.OllamaLocal(model='mistral')
 ```
 
 5. `dspy.ChatModuleClient` (experimental): [How do I install and use MLC?](https://dspy-docs.vercel.app/api/local_language_model_clients/MLC)
````
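For context on why these variable names matter downstream: each of the renamed clients is typically registered as DSPy's default language model right after construction. A minimal configuration sketch, assuming the DSPy API of this era (`dspy.settings.configure`) and a local Ollama server already running; this snippet is illustrative and not part of the commit itself:

```python
# Sketch: plugging one of the local clients above into DSPy as the default LM.
# Assumes `dspy` is installed and an Ollama server is serving the `mistral`
# model locally; neither is verified here.
import dspy

# The corrected name from the diff: the variable now matches the model it wraps.
ollama_mistral = dspy.OllamaLocal(model='mistral')

# Register it so all DSPy modules (Predict, ChainOfThought, ...) use it by default.
dspy.settings.configure(lm=ollama_mistral)
```

The rename is cosmetic but meaningful: a reader copying `tgi_llama2` for a Mistral model would carry a misleading name into their own configuration code.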
