
Commit bb7dfa6

Merge pull request #3 from stanfordnlp/main
Merge from main
2 parents 1c1e9ac + 41d0f0e commit bb7dfa6

File tree

26 files changed: +1495 −106 lines changed
Lines changed: 70 additions & 0 deletions

@@ -0,0 +1,70 @@

---
sidebar_position: 5
---

# dsp.PremAI

[PremAI](https://app.premai.io) is an all-in-one platform that simplifies the process of creating robust, production-ready applications powered by Generative AI. By streamlining the development process, PremAI allows you to concentrate on enhancing user experience and driving overall growth for your application.

### Prerequisites

Refer to the [quick start](https://docs.premai.io/introduction) guide to get started with the PremAI platform: create your first project and grab your API key.

### Usage

Make sure you have the `premai` Python SDK installed. If not, install it with:

```bash
pip install -U premai
```

Here is a quick example of how to use the PremAI Python SDK with DSPy:

```python
from dspy import PremAI

llm = PremAI(model='mistral-tiny', project_id=123, api_key="your-premai-api-key")
print(llm("what is a large language model"))
```

> Please note: project ID 123 is just an example. You can find your project ID inside the platform, under the project you created.

### Constructor

The constructor initializes the base class `LM` and verifies the `api_key`, which can be provided directly or through the `PREMAI_API_KEY` environment variable.

```python
class PremAI(LM):
    def __init__(
        self,
        model: str,
        project_id: int,
        api_key: str,
        base_url: Optional[str] = None,
        session_id: Optional[int] = None,
        **kwargs,
    ) -> None:
```

**Parameters:**

- `model` (_str_): Models supported by PremAI. Example: `mistral-tiny`. We recommend using the model selected in the [project launchpad](https://docs.premai.io/get-started/launchpad).
- `project_id` (_int_): The [project ID](https://docs.premai.io/get-started/projects) that contains the model of choice.
- `api_key` (_Optional[str]_, _optional_): API key from PremAI. Defaults to None.
- `session_id` (_Optional[int]_, _optional_): The ID of the session to use. It helps track the chat history.
- `**kwargs`: Additional language model arguments passed to the API provider.

### Methods

#### `__call__(self, prompt: str, **kwargs) -> List[Dict[str, Any]]`

Retrieves completions from PremAI by calling `request`.

Internally, the method handles the specifics of preparing the request prompt and corresponding payload to obtain the response.

After generation, the completions are post-processed based on the `model_type` parameter.

**Parameters:**

- `prompt` (_str_): Prompt to send to PremAI.
- `**kwargs`: Additional keyword arguments for the completion request, e.g. `temperature`, `max_tokens`, etc. You can find all additional kwargs [here](https://docs.premai.io/get-started/sdk#optional-parameters).
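The key handling described above (explicit `api_key` with a `PREMAI_API_KEY` environment-variable fallback) can be sketched as a small helper — a minimal sketch, not the constructor's actual internals; the helper name `resolve_premai_key` is hypothetical:

```python
import os
from typing import Optional

def resolve_premai_key(api_key: Optional[str] = None) -> str:
    """Prefer an explicitly passed key; otherwise fall back to PREMAI_API_KEY."""
    key = api_key or os.environ.get("PREMAI_API_KEY")
    if not key:
        raise ValueError("No API key given and PREMAI_API_KEY is not set.")
    return key
```

With this pattern, the client can be constructed without hard-coding secrets: export `PREMAI_API_KEY` in your environment and omit `api_key` at the call site.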

docs/api/language_model_clients/Snowflake.md

Lines changed: 0 additions & 4 deletions

```diff
@@ -1,7 +1,3 @@
----
-sidebar_position:
----
-
 # dspy.Snowflake
 
 ### Usage
```

docs/api/retrieval_model_clients/SnowflakeRM.md

Lines changed: 0 additions & 4 deletions

```diff
@@ -1,7 +1,3 @@
----
-sidebar_position:
----
-
 # retrieve.SnowflakeRM
 
 ### Constructor
```

docs/docs/building-blocks/1-language_models.md

Lines changed: 2 additions & 1 deletion

````diff
@@ -137,6 +137,7 @@ lm = dspy.{provider_listed_below}(model="your model", model_request_kwargs="...
 
 4. `dspy.Together` for various hosted open source models.
 
+5. `dspy.PremAI` for hosted open source and closed source models.
 
 ### Local LMs.
 
@@ -173,4 +174,4 @@ model = 'dist/prebuilt/mlc-chat-Llama-2-7b-chat-hf-q4f16_1'
 model_path = 'dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-cuda.so'
 
 llama = dspy.ChatModuleClient(model=model, model_path=model_path)
-```
+```
````

docs/docs/cheatsheet.md

Lines changed: 11 additions & 11 deletions

````diff
@@ -254,15 +254,15 @@ evaluate_program(your_dspy_program)
 
 ## DSPy Optimizers
 
-### dspy.LabeledFewShot
+### LabeledFewShot
 ```python
 from dspy.teleprompt import LabeledFewShot
 
-labeled_fewshot_optimizer = dspy.LabeledFewShot(k=8)
+labeled_fewshot_optimizer = LabeledFewShot(k=8)
 your_dspy_program_compiled = labeled_fewshot_optimizer.compile(student = your_dspy_program, trainset=trainset)
 ```
 
-### dspy.BootstrapFewShot
+### BootstrapFewShot
 ```python
 from dspy.teleprompt import BootstrapFewShot
 
@@ -302,7 +302,7 @@ loaded_program = YourProgramClass()
 loaded_program.load(path=save_path)
 ```
 
-### dspy.BootstrapFewShotWithRandomSearch
+### BootstrapFewShotWithRandomSearch
 
 ```python
 from dspy.teleprompt import BootstrapFewShotWithRandomSearch
@@ -312,10 +312,10 @@ fewshot_optimizer = BootstrapFewShotWithRandomSearch(metric=your_defined_metric,
 your_dspy_program_compiled = fewshot_optimizer.compile(student = your_dspy_program, trainset=trainset, valset=devset)
 
 ```
-Other custom configurations are similar to customizing the `dspy.BootstrapFewShot` optimizer.
+Other custom configurations are similar to customizing the `BootstrapFewShot` optimizer.
 
 
-### dspy.Ensemble
+### Ensemble
 
 ```python
 from dspy.teleprompt import BootstrapFewShotWithRandomSearch
@@ -329,7 +329,7 @@ programs = [x[-1] for x in your_dspy_program_compiled.candidate_programs]
 your_dspy_program_compiled_ensemble = ensemble_optimizer.compile(programs[:3])
 ```
 
-### dspy.BootstrapFinetune
+### BootstrapFinetune
 
 ```python
 from dspy.teleprompt import BootstrapFewShotWithRandomSearch, BootstrapFinetune
@@ -356,7 +356,7 @@ for p in finetune_program.predictors():
     p.activated = False
 ```
 
-### dspy.COPRO
+### COPRO
 
 ```python
 from dspy.teleprompt import COPRO
@@ -368,7 +368,7 @@ copro_teleprompter = COPRO(prompt_model=model_to_generate_prompts, task_model=mo
 compiled_program_optimized_signature = copro_teleprompter.compile(your_dspy_program, trainset=trainset, eval_kwargs=eval_kwargs)
 ```
 
-### dspy.MIPRO
+### MIPRO
 
 
 ```python
@@ -395,7 +395,7 @@ compiled_program = optimize_signature(
 ).program
 ```
 
-### dspy.KNNFewShot
+### KNNFewShot
 
 ```python
 from dspy.predict import KNN
@@ -406,7 +406,7 @@ knn_optimizer = KNNFewShot(KNN, k=3, trainset=trainset)
 your_dspy_program_compiled = knn_optimizer.compile(student=your_dspy_program, trainset=trainset, valset=devset)
 ```
 
-### dspy.BootstrapFewShotWithOptuna
+### BootstrapFewShotWithOptuna
 
 ```python
 from dspy.teleprompt import BootstrapFewShotWithOptuna
````
Lines changed: 70 additions & 0 deletions

@@ -0,0 +1,70 @@

## PremAI

[PremAI](https://app.premai.io) is an all-in-one platform that simplifies the process of creating robust, production-ready applications powered by Generative AI. By streamlining the development process, PremAI allows you to concentrate on enhancing user experience and driving overall growth for your application.

### Prerequisites

Refer to the [quick start](https://docs.premai.io/introduction) guide to get started with the PremAI platform: create your first project and grab your API key.

### Setting up the PremAI Client

The constructor initializes the base class `LM` to support prompting requests to supported PremAI-hosted models. It requires the following parameters:

- `model` (_str_): Models supported by PremAI. Example: `mistral-tiny`. We recommend using the model selected in the [project launchpad](https://docs.premai.io/get-started/launchpad).
- `project_id` (_int_): The [project ID](https://docs.premai.io/get-started/projects) that contains the model of choice.
- `api_key` (_Optional[str]_, _optional_): API key from PremAI. Defaults to None.
- `session_id` (_Optional[int]_, _optional_): The ID of the session to use. It helps track the chat history.
- `**kwargs`: Additional language model arguments passed to the API provider.

Example of the PremAI constructor:

```python
class PremAI(LM):
    def __init__(
        self,
        model: str,
        project_id: int,
        api_key: str,
        base_url: Optional[str] = None,
        session_id: Optional[int] = None,
        **kwargs,
    ) -> None:
```

### Under the Hood

#### `__call__(self, prompt: str, **kwargs) -> str`

**Parameters:**

- `prompt` (_str_): Prompt to send to PremAI.
- `**kwargs`: Additional keyword arguments for the completion request.

**Returns:**

- `str`: Completion string from the chosen LLM provider.

Internally, the method handles the specifics of preparing the request prompt and corresponding payload to obtain the response.

### Using the PremAI client

```python
premai_client = dspy.PremAI(project_id=1111)
```

Please note that this is a dummy `project_id`. You need to change it to the ID of the project you want to use with DSPy.

1) Generate responses through DSPy modules:

```python
dspy.configure(lm=premai_client)

# Example DSPy CoT QA program
qa = dspy.ChainOfThought('question -> answer')

response = qa(question="What is the capital of France?")
print(response.answer)
```

2) Generate responses using the client directly:

```python
response = premai_client(prompt='What is the capital of France?')
print(response)
```

docs/docs/quick-start/installation.mdx

Lines changed: 7 additions & 7 deletions

````diff
@@ -30,37 +30,37 @@ import TabItem from '@theme/TabItem';
 </TabItem>
 <TabItem value="pinecone" label="Pinecone">
 ```text
-pip install dspy-ai[pinecone]
+pip install "dspy-ai[pinecone]"
 ```
 </TabItem>
 <TabItem value="qdrant" label="Qdrant">
 ```text
-pip install dspy-ai[qdrant]
+pip install "dspy-ai[qdrant]"
 ```
 </TabItem>
 <TabItem value="chromadb" label="ChromaDB">
 ```text
-pip install dspy-ai[chromadb]
+pip install "dspy-ai[chromadb]"
 ```
 </TabItem>
 <TabItem value="marqo" label="Marqo">
 ```text
-pip install dspy-ai[marqo]
+pip install "dspy-ai[marqo]"
 ```
 </TabItem>
 <TabItem value="mongodb" label="MongoDB">
 ```text
-pip install dspy-ai[mongodb]
+pip install "dspy-ai[mongodb]"
 ```
 </TabItem>
 <TabItem value="weaviate" label="Weaviate">
 ```text
-pip install dspy-ai[weaviate]
+pip install "dspy-ai[weaviate]"
 ```
 </TabItem>
 <TabItem value="milvus" label="Milvus">
 ```text
-pip install dspy-ai[milvus]
+pip install "dspy-ai[milvus]"
 ```
 </TabItem>
````
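The quoting change above matters because in some shells, notably zsh (the macOS default), unquoted square brackets form a filename glob, so `pip install dspy-ai[pinecone]` can abort with `zsh: no matches found` before pip ever runs. A minimal sketch of the difference, using `echo` as a stand-in for the command receiving the argument:

```shell
# Quoting the extras spec passes the brackets through to the command literally,
# instead of letting the shell try (and fail) to expand them as a glob.
pkg="dspy-ai[pinecone]"
echo "$pkg"
# prints: dspy-ai[pinecone]
```

The same reasoning applies to every extras tab in the diff; the quoted form is safe in bash, zsh, and fish alike.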

docs/docusaurus.config.ts

Lines changed: 17 additions & 0 deletions

```diff
@@ -155,6 +155,23 @@ const config: Config = {
       }),
     ],
   ],
+  scripts: [
+    {
+      id: "runllm-widget-script",
+      type: "module",
+      src: "https://cdn.jsdelivr.net/npm/@runllm/search-widget@stable/dist/run-llm-search-widget.es.js",
+      "runllm-server-address": "https://api.runllm.com",
+      "runllm-assistant-id": "132",
+      "runllm-position": "BOTTOM_RIGHT",
+      "runllm-keyboard-shortcut": "Mod+j",
+      version: "stable",
+      "runllm-preset": "docusaurus",
+      "runllm-slack-community-url": "",
+      "runllm-name": "DSPy",
+      "runllm-theme-color": "#005EEC",
+      async: true,
+    },
+  ],
 };
 
 export default config;
```

dsp/modules/__init__.py

Lines changed: 1 addition & 0 deletions

```diff
@@ -20,6 +20,7 @@
 from .hf_client import Anyscale, HFClientTGI, Together
 from .mistral import *
 from .ollama import *
+from .premai import PremAI
 from .pyserini import *
 from .sbert import *
 from .sentence_vectorizer import *
```
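The one-line change above is the standard package re-export pattern: importing a name in `__init__.py` lifts it into the package namespace, which is what lets users write `dspy.PremAI` instead of the full submodule path. A stdlib-only sketch of the same mechanism (the module and class names here are made up for illustration):

```python
import types

# A stand-in submodule that defines a client class, like dsp/modules/premai.py.
submodule = types.ModuleType("mypkg.client")

class Client:  # stands in for PremAI
    pass

submodule.Client = Client

# The package __init__ re-exports the name, mirroring `from .premai import PremAI`.
package = types.ModuleType("mypkg")
package.Client = submodule.Client
```

After the re-export, `package.Client` and `submodule.Client` are the same object, so callers never need to know which file defines the class.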

dsp/modules/cohere.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -58,7 +58,7 @@ def __init__(
         Additional arguments to pass to the API provider.
         """
         super().__init__(model)
-        self.co = cohere.Client(api_key)
+        self.co = cohere.Client(api_key, client_name='dspy')
         self.provider = "cohere"
         self.kwargs = {
             "model": model,
```

0 commit comments
