Commit 5e4df24

Update description of KNN and Instruction Optimizers.
- Add key ideas from Arnav's KNN explanation (in the issue).
- Clarify that only MIPRO optimizes the demonstration set.
1 parent dc0d78b commit 5e4df24

1 file changed: +2 −3 lines changed

docs/docs/building-blocks/6-optimizers.md

@@ -53,13 +53,12 @@ These optimizers extend the signature by automatically generating and including
 4. **`BootstrapFewShotWithOptuna`**: Applies `BootstrapFewShot` through Optuna hyperparameter optimization across demonstration sets, running trials to maximize evaluation metrics and selecting the best demonstrations.
 
-5. **`KNNFewShot`**. Selects demonstrations through k-Nearest Neighbors algorithm. Vectorizes the examples, and then clusters them, using cluster centers with `BootstrapFewShot` for bootstrapping/selection process.
-
+5. **`KNNFewShot`**. Selects demonstrations through the k-Nearest Neighbors algorithm to pick a diverse set of examples from different clusters. Vectorizes the examples, then clusters them, and uses the cluster centers with `BootstrapFewShot` for the bootstrapping/selection process. This is useful when the data is spread over varied input spaces: KNN helps optimize the `trainset` for `BootstrapFewShot`. See [this notebook](https://github.com/stanfordnlp/dspy/blob/main/examples/knn.ipynb) for an example.
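The cluster-then-pick-centers idea in the new `KNNFewShot` description can be sketched in plain Python. This is an illustrative sketch only, not DSPy's implementation; the `kmeans`, `dist`, and `select_diverse_demos` helpers below are hypothetical names, and the example vectors stand in for whatever embedding the vectorizer produces.

```python
# Sketch of the idea behind KNNFewShot's selection step: vectorize the
# training examples, cluster them, and keep the example nearest each
# cluster center as a small, diverse demonstration set.
# (Not DSPy's actual implementation.)
import math
import random


def dist(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def kmeans(vectors, k, iters=20, seed=0):
    """Plain k-means over a list of float vectors; returns k centers."""
    rng = random.Random(seed)
    centers = [list(v) for v in rng.sample(vectors, k)]
    for _ in range(iters):
        # Assign each vector to its nearest center.
        groups = [[] for _ in range(k)]
        for v in vectors:
            j = min(range(k), key=lambda c: dist(v, centers[c]))
            groups[j].append(v)
        # Move each center to the mean of its assigned vectors.
        for j, g in enumerate(groups):
            if g:
                centers[j] = [sum(x) / len(g) for x in zip(*g)]
    return centers


def select_diverse_demos(examples, vectors, k):
    """Return the k examples whose vectors lie closest to the k centers."""
    centers = kmeans(vectors, k)
    return [
        min(zip(examples, vectors), key=lambda ev: dist(ev[1], c))[0]
        for c in centers
    ]
```

In DSPy itself, the selected examples would then be handed to `BootstrapFewShot` as its `trainset` for the bootstrapping/selection process.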
 #### Automatic Instruction Optimization
 
-These optimizers serve to produce optimal instructions for the prompt, in addition to optimized few-shot demonstrations.
+These optimizers produce optimal instructions for the prompt and, in the case of MIPRO, also optimize the set of few-shot demonstrations.
 
 6. **`COPRO`**: Generates and refines new instructions for each step, and optimizes them with coordinate ascent (hill-climbing using the metric function and the `trainset`). Parameters include `depth`, which is the number of iterations of prompt improvement the optimizer runs.
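The coordinate-ascent loop in the `COPRO` description can be sketched generically. This is a hedged illustration, not DSPy's code: `propose` stands in for an LM call that rewrites an instruction, and `metric` stands in for evaluation on the `trainset`; both are hypothetical callables supplied by the caller.

```python
# Sketch of hill-climbing over instructions, as described for COPRO:
# at each of `depth` rounds, generate candidate rewrites of the current
# best instruction and keep whichever scores highest on the metric.
# (Not DSPy's actual implementation.)
def coordinate_ascent(instruction, propose, metric, depth=3, breadth=4):
    best, best_score = instruction, metric(instruction)
    for _ in range(depth):
        # Generate all candidates from the round's starting point.
        candidates = [propose(best) for _ in range(breadth)]
        for cand in candidates:
            score = metric(cand)
            if score > best_score:
                best, best_score = cand, score
    return best
```

Because only improvements are kept, the loop never regresses on the metric; `depth` bounds how many rounds of rewriting are attempted, mirroring the parameter named in the description above.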
