docs/building-blocks/2-modules.md (+1 −1)
@@ -13,7 +13,7 @@ A **DSPy module** is a building block for programs that use LMs.
 
 - Multiple modules can be composed into bigger modules (programs). DSPy modules are inspired directly by NN modules in PyTorch, but applied to LM programs.
 
-###How do I use a built-in module, like `dspy.Predict` or `dspy.ChainOfThought`?
+## How do I use a built-in module, like `dspy.Predict` or `dspy.ChainOfThought`?
 
 Let's start with the most fundamental module, `dspy.Predict`. Internally, all other DSPy modules are just built using `dspy.Predict`.
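The context line above compares DSPy modules to PyTorch NN modules: small callable units that compose into larger programs. A minimal, library-free sketch of that composition idea, for readers reviewing this diff (the class names below are illustrative stand-ins, not DSPy's actual API):

```python
# Illustrative sketch of PyTorch-style module composition.
# These classes are NOT DSPy's API; they only mirror the pattern.

class Module:
    """Base class: a module is just a callable unit."""
    def __call__(self, text: str) -> str:
        raise NotImplementedError

class Uppercase(Module):
    """Stands in for a primitive module (like `dspy.Predict` in DSPy)."""
    def __call__(self, text: str) -> str:
        return text.upper()

class Exclaim(Module):
    """Another primitive step."""
    def __call__(self, text: str) -> str:
        return text + "!"

class Pipeline(Module):
    """A bigger module (a 'program') composed from smaller modules."""
    def __init__(self, *steps: Module):
        self.steps = steps

    def __call__(self, text: str) -> str:
        for step in self.steps:
            text = step(text)
        return text

program = Pipeline(Uppercase(), Exclaim())
print(program("hello"))  # HELLO!
```

The point of the analogy is only structural: built-in modules are the primitives, and user programs compose them the way `Pipeline` composes its steps.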
docs/building-blocks/3-language_models.md (+9 −31)
@@ -4,38 +4,20 @@ sidebar_position: 4
 # Language Models
 
-This guide assumes you followed the [intro tutorial](https://colab.research.google.com/github/stanfordnlp/dspy/blob/main/intro.ipynb) to build your first few DSPy programs.
-
-Remember that a **DSPy program** is just Python code that calls one or more DSPy modules, like `dspy.Predict` or `dspy.ChainOfThought`, to use LMs.
-
-## 1) Short Intro to LMs in DSPy {#1-short-intro-to-lms-in-dspy}
+## Remote LMs.
 
-These models are managed services. You just need to sign up and obtain
-an API key.
+These models are managed services. You just need to sign up and obtain an API key.
 
 1. `dspy.OpenAI` for GPT-3.5 and GPT-4.
 2. `dspy.Cohere`
 3. `dspy.Anyscale` for hosted Llama2 models.
 
-### Local LMs. {#local-lms}
+### Local LMs.
 
-You need to host these models on your own GPU(s). Below, we include
-pointers for how to do that.
+You need to host these models on your own GPU(s). Below, we include pointers for how to do that.
 
 1. `dspy.HFClientTGI`: for HuggingFace models through the Text Generation Inference (TGI) system. [Tutorial: How do I install and launch the TGI server?](/api/hosting_language_models_locally/TGI)
@@ -49,16 +31,14 @@ pointers for how to do that.
 If there are other clients you want added, let us know!
 
-## 3) Setting up the LM client. {#3-setting-up-the-lm-client}
+## Setting up the LM client.
 
 You can just call the constructor that connects to the LM. Then, use `dspy.configure` to declare this as the default LM.
 
 For example, for OpenAI, you can do it as follows.
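The diff is truncated just before the OpenAI example itself. For orientation, a hedged sketch of what such a setup typically looks like with the DSPy API referenced in this diff; treat it as a configuration fragment, not the PR's actual code: the model name is an assumption, and running it requires the `dspy` package plus an OpenAI API key.

```python
import dspy

# Construct a client for a remote LM.
# The model name is an assumption for illustration.
turbo = dspy.OpenAI(model="gpt-3.5-turbo")

# Declare it as the default LM for all DSPy modules,
# as the section above describes via `dspy.configure`.
dspy.configure(lm=turbo)
```

After this, modules like `dspy.Predict` and `dspy.ChainOfThought` use the configured LM without it being passed explicitly.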