Commit 805ed48

Merge branch 'main' into claude_3
2 parents: f8c5683 + edadecd

75 files changed: +9158 −1242 lines changed

.github/workflows/run_tests.yml

Lines changed: 17 additions & 14 deletions

```diff
@@ -1,29 +1,38 @@
-name: Fix, Test, and Build
+name: Lint, Test, and Build
 
 on:
   push:
     branches:
       - main
   pull_request:
+    types: [opened, synchronize, reopened]
 
 env:
   POETRY_VERSION: "1.6.1"
 
 jobs:
   fix:
-    name: Apply Ruff Fix
+    name: Check Ruff Fix
     runs-on: ubuntu-latest
     permissions:
       contents: write
+      pull-requests: write
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       - uses: actions/setup-python@v5
-      - uses: chartboost/ruff-action@v1
-        with:
-          args: --fix-only
-      - uses: stefanzweifel/git-auto-commit-action@v5
+      - name: Ruff Fix Attempt
+        id: ruff_fix
+        uses: chartboost/ruff-action@v1
         with:
-          commit_message: "Automatic Style fixes"
+          args: --fix-only --exit-non-zero-on-fix
+        continue-on-error: true
+
+      - name: Fail Workflow if Ruff Fix Failed
+        if: steps.ruff_fix.outcome == 'failure'
+        run: |
+          echo "Ruff fix failed, failing the workflow."
+          echo "Please run 'ruff check . --fix-only' locally and push the changes."
+          exit 1
 
   test:
     name: Run Tests
@@ -33,8 +42,6 @@ jobs:
         python-version: ["3.9"]
     steps:
       - uses: actions/checkout@v4
-        with:
-          ref: ${{ github.head_ref }}
       - name: Load cached Poetry installation
         id: cached-poetry
         uses: actions/cache@v3
@@ -66,8 +73,6 @@ jobs:
         python-version: ["3.9"]
     steps:
       - uses: actions/checkout@v4
-        with:
-          ref: ${{ github.head_ref }}
       - name: Load cached Poetry installation
         id: cached-poetry
         uses: actions/cache@v3
@@ -99,8 +104,6 @@ jobs:
         python-version: ["3.9"]
     steps:
       - uses: actions/checkout@v4
-        with:
-          ref: ${{ github.head_ref }}
       - name: Load cached Poetry installation
         id: cached-poetry
         uses: actions/cache@v3
```

.pre-commit-config.yaml

Lines changed: 17 additions & 17 deletions

```diff
@@ -5,15 +5,15 @@ default_stages: [commit]
 default_install_hook_types: [pre-commit, commit-msg]
 
 repos:
-  - repo: https://github.com/astral-sh/ruff-pre-commit
-    # Ruff version.
-    rev: v0.1.11
-    hooks:
-      # Run the linter.
-      - id: ruff
-        args: [--fix]
-      # Run the formatter.
-      - id: ruff-format
+  # - repo: https://github.com/astral-sh/ruff-pre-commit
+  #   # Ruff version.
+  #   rev: v0.1.11
+  #   hooks:
+  #     # Run the linter.
+  #     - id: ruff
+  #       args: [--fix]
+  #     # Run the formatter.
+  #     - id: ruff-format
 
   - repo: https://github.com/timothycrosley/isort
     rev: 5.12.0
@@ -50,14 +50,14 @@ repos:
         args:
           - "--autofix"
           - "--indent=2"
-  - repo: local
-    hooks:
-      - id: validate-commit-msg
-        name: Commit Message is Valid
-        language: pygrep
-        entry: ^(break|build|ci|docs|feat|fix|perf|refactor|style|test|ops|hotfix|release|maint|init|enh|revert)\([\w,\.,\-,\(,\),\/]+\)(!?)(:)\s{1}([\w,\W,:]+)
-        stages: [commit-msg]
-        args: [--negate]
+  # - repo: local
+  #   hooks:
+  #     - id: validate-commit-msg
+  #       name: Commit Message is Valid
+  #       language: pygrep
+  #       entry: ^(break|build|ci|docs|feat|fix|perf|refactor|style|test|ops|hotfix|release|maint|init|enh|revert)\([\w,\.,\-,\(,\),\/]+\)(!?)(:)\s{1}([\w,\W,:]+)
+  #       stages: [commit-msg]
+  #       args: [--negate]
 
   - repo: https://github.com/pre-commit/mirrors-prettier
     rev: v3.0.3
```
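As an aside, the `entry` pattern in the disabled `validate-commit-msg` hook is a plain regular expression, so its behavior can be checked directly with Python's standard `re` module. A minimal sketch (the sample commit messages are illustrative, not from this repository):

```python
import re

# The pygrep pattern from the (now commented-out) validate-commit-msg hook.
pattern = (
    r"^(break|build|ci|docs|feat|fix|perf|refactor|style|test|ops|hotfix"
    r"|release|maint|init|enh|revert)"
    r"\([\w,\.,\-,\(,\),\/]+\)(!?)(:)\s{1}([\w,\W,:]+)"
)

# A conventional-commit-style message matches...
print(bool(re.match(pattern, "feat(api): add typed predictors")))  # True
# ...while a free-form message does not.
print(bool(re.match(pattern, "update stuff")))                     # False
```

Note the hook ran `pygrep` with `--negate`, so a *non-matching* message is what failed the commit.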
Lines changed: 8 additions & 0 deletions

```diff
@@ -0,0 +1,8 @@
+{
+  "label": "Functional",
+  "position": 2,
+  "link": {
+    "type": "generated-index",
+    "description": "This documentation provides an overview of the Typed Predictors."
+  }
+}
```
Lines changed: 41 additions & 0 deletions

````diff
@@ -0,0 +1,41 @@
+---
+sidebar_position: 2
+---
+
+# dspy.TypedChainOfThought
+
+### Overview
+
+#### `def TypedChainOfThought(signature, max_retries=3) -> dspy.Module`
+
+Adds a chain-of-thought `dspy.OutputField` to the `dspy.TypedPredictor` module by prepending it to the Signature. Similar to `dspy.TypedPredictor`, but automatically adds a "reasoning" output field.
+
+* **Inputs**:
+  * `signature`: The `dspy.Signature` specifying the input/output fields
+  * `max_retries`: Maximum number of retries if outputs fail validation
+* **Output**: A dspy.Module instance capable of making predictions.
+
+### Example
+
+```python
+from dspy import InputField, OutputField, Signature
+from dspy.functional import TypedChainOfThought
+from pydantic import BaseModel
+
+# We define a pydantic model describing the structured output we expect.
+class CodeOutput(BaseModel):
+    code: str
+    api_reference: str
+
+class CodeSignature(Signature):
+    function_description: str = InputField()
+    solution: CodeOutput = OutputField()
+
+cot_predictor = TypedChainOfThought(CodeSignature)
+prediction = cot_predictor(
+    function_description="Write a function that adds two numbers."
+)
+
+print(prediction["code"])
+print(prediction["api_reference"])
+```
````
Lines changed: 78 additions & 0 deletions

````diff
@@ -0,0 +1,78 @@
+---
+sidebar_position: 1
+---
+
+# dspy.TypedPredictor
+
+The `TypedPredictor` class is a module designed for making predictions with strict type validation. It leverages a signature to enforce type constraints on inputs and outputs, ensuring that the data adheres to the expected schema.
+
+### Constructor
+
+```python
+TypedPredictor(
+    CodeSignature,
+    max_retries=3
+)
+```
+
+Parameters:
+* `signature` (dspy.Signature): The signature that defines the input and output fields along with their types.
+* `max_retries` (int, optional): The maximum number of retries for generating a valid prediction output. Defaults to 3.
+
+### Methods
+
+#### `copy() -> "TypedPredictor"`
+
+Creates and returns a deep copy of the current TypedPredictor instance.
+
+**Returns:** A new instance of TypedPredictor that is a deep copy of the original instance.
+
+#### `_make_example(type_: Type) -> str`
+
+A static method that generates a JSON object example based on the schema of the specified Pydantic model type. This JSON object serves as an example of the expected input or output format.
+
+**Parameters:**
+* `type_`: A Pydantic model class for which an example JSON object is to be generated.
+
+**Returns:** A string that represents a JSON object example, which validates against the provided Pydantic model's JSON schema. If the method is unable to generate a valid example, it returns an empty string.
+
+#### `_prepare_signature() -> dspy.Signature`
+
+Prepares and returns a modified version of the signature associated with the TypedPredictor instance. This method iterates over the signature's fields to add format and parser functions based on their type annotations.
+
+**Returns:** A dspy.Signature object that has been enhanced with formatting and parsing specifications for its fields.
+
+#### `forward(**kwargs) -> dspy.Prediction`
+
+Executes the prediction logic, making use of the `dspy.Predict` component to generate predictions based on the input arguments. This method handles type validation, parsing of output data, and implements retry logic in case the output does not initially conform to the specified output schema.
+
+**Parameters:**
+
+* `**kwargs`: Keyword arguments corresponding to the input fields defined in the signature.
+
+**Returns:** A dspy.Prediction object containing the prediction results. Each key in this object corresponds to an output field defined in the signature, and its value is the parsed result of the prediction.
+
+### Example
+
+```python
+from dspy import InputField, OutputField, Signature
+from dspy.functional import TypedPredictor
+from pydantic import BaseModel
+
+# We define a pydantic model describing the structured output we expect.
+class CodeOutput(BaseModel):
+    code: str
+    api_reference: str
+
+class CodeSignature(Signature):
+    function_description: str = InputField()
+    solution: CodeOutput = OutputField()
+
+predictor = TypedPredictor(CodeSignature)
+prediction = predictor(
+    function_description="Write a function that adds two numbers."
+)
+
+print(prediction["code"])
+print(prediction["api_reference"])
+```
````
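The `max_retries` parameter implies a generate-validate-retry loop inside `forward`. As a rough, library-agnostic sketch of that control flow (none of these names are dspy's; `json.loads` stands in for pydantic validation):

```python
import json

# Illustrative only: call a generator, validate its output, and retry up to
# max_retries times — the shape of behavior that max_retries suggests.
def predict_with_retries(generate, validate, max_retries=3):
    last_error = None
    for _ in range(max_retries):
        output = generate()
        try:
            return validate(output)
        except ValueError as exc:  # json.JSONDecodeError subclasses ValueError
            last_error = exc
    raise last_error

# Toy "model" that produces invalid output once, then valid output.
attempts = iter(["not json", '{"code": "def add(a, b): return a + b"}'])
result = predict_with_retries(lambda: next(attempts), json.loads)
print(result["code"])  # def add(a, b): return a + b
```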

docs/api/functional/dspy_cot.md

Lines changed: 30 additions & 0 deletions

````diff
@@ -0,0 +1,30 @@
+---
+sidebar_position: 4
+---
+
+# dspy.cot
+
+### Overview
+
+#### `def cot(func) -> dspy.Module`
+
+The `@cot` decorator is used to create a chain-of-thought module based on the provided function. It automatically generates a `dspy.TypedPredictor` from the function's type annotations and docstring. Similar to `@predictor`, but adds a "Reasoning" output field to capture the model's step-by-step thinking.
+
+* **Input**: Function with input parameters and return type annotation.
+* **Output**: A dspy.Module instance capable of making predictions.
+
+### Example
+
+```python
+import dspy
+
+context = ["Roses are red.", "Violets are blue."]
+question = "What color are roses?"
+
+@dspy.cot
+def generate_answer(self, context: list[str], question) -> str:
+    """Answer questions with short factoid answers."""
+    pass
+
+generate_answer(context=context, question=question)
+```
````
Lines changed: 30 additions & 0 deletions

````diff
@@ -0,0 +1,30 @@
+---
+sidebar_position: 3
+---
+
+# dspy.predictor
+
+### Overview
+
+#### `def predictor(func) -> dspy.Module`
+
+The `@predictor` decorator is used to create a predictor module based on the provided function. It automatically generates a `dspy.TypedPredictor` from the function's type annotations and docstring.
+
+* **Input**: Function with input parameters and return type annotation.
+* **Output**: A dspy.Module instance capable of making predictions.
+
+### Example
+
+```python
+import dspy
+
+context = ["Roses are red.", "Violets are blue."]
+question = "What color are roses?"
+
+@dspy.predictor
+def generate_answer(self, context: list[str], question) -> str:
+    """Answer questions with short factoid answers."""
+    pass
+
+generate_answer(context=context, question=question)
+```
````
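Both `@dspy.predictor` and `@dspy.cot` build a typed module from a function's annotations and docstring. A minimal stand-alone sketch of that idea (this is not dspy's implementation; `predictor_sketch` and the `spec` attribute are invented for illustration):

```python
import inspect
from typing import get_type_hints

def predictor_sketch(func):
    """Illustrative decorator: collect the pieces a typed predictor needs
    from the function's signature and docstring."""
    hints = get_type_hints(func)
    output_type = hints.pop("return", str)
    func.spec = {
        "instructions": inspect.getdoc(func),  # the docstring becomes the task description
        "inputs": sorted(hints),               # annotated parameters become input fields
        "output": output_type.__name__,        # the return annotation becomes the output type
    }
    return func

@predictor_sketch
def generate_answer(context: list[str], question: str) -> str:
    """Answer questions with short factoid answers."""

print(generate_answer.spec["inputs"])  # ['context', 'question']
print(generate_answer.spec["output"])  # str
```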

docs/api/language_model_clients/_category_.json

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -1,6 +1,6 @@
11
{
22
"label": "Language Model API Clients",
3-
"position": 4,
3+
"position": 5,
44
"link": {
55
"type": "generated-index",
66
"description": "This documentation provides an overview of the DSPy Language Model Clients."
Lines changed: 7 additions & 0 deletions

````diff
@@ -0,0 +1,7 @@
+# dspy.HFModel
+
+Initialize `HFModel` within your program with the desired model to load in. Here's an example call:
+
+```python
+llama = dspy.HFModel(model='meta-llama/Llama-2-7b-hf')
+```
````
Lines changed: 41 additions & 0 deletions

````diff
@@ -0,0 +1,41 @@
+# dspy.ChatModuleClient
+
+## Prerequisites
+
+1. Install the required packages using the following commands:
+
+   ```shell
+   pip install --no-deps --pre --force-reinstall mlc-ai-nightly-cu118 mlc-chat-nightly-cu118 -f https://mlc.ai/wheels
+   pip install transformers
+   git lfs install
+   ```
+
+   Adjust the pip wheels according to your OS/platform by referring to the provided commands in [MLC packages](https://mlc.ai/package/).
+
+## Running MLC Llama-2 models
+
+1. Create a directory for prebuilt models:
+
+   ```shell
+   mkdir -p dist/prebuilt
+   ```
+
+2. Clone the necessary libraries from the repository:
+
+   ```shell
+   git clone https://github.com/mlc-ai/binary-mlc-llm-libs.git dist/prebuilt/lib
+   cd dist/prebuilt
+   ```
+
+3. Choose a Llama-2 model from [MLC LLMs](https://huggingface.co/mlc-ai) and clone the model repository:
+
+   ```shell
+   git clone https://huggingface.co/mlc-ai/mlc-chat-Llama-2-7b-chat-hf-q4f16_1
+   ```
+
+4. Initialize the `ChatModuleClient` within your program with the desired parameters. Here's an example call:
+
+   ```python
+   llama = dspy.ChatModuleClient(model='dist/prebuilt/mlc-chat-Llama-2-7b-chat-hf-q4f16_1', model_path='dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-cuda.so')
+   ```
+
+Please refer to the [official MLC repository](https://github.com/mlc-ai/mlc-llm) for more detailed information and [documentation](https://mlc.ai/mlc-llm/docs/get_started/try_out.html).
````
