
Commit e16eeb0

docs: Fix broken links
1 parent 63e6f19 commit e16eeb0

File tree

1 file changed (+6 −6 lines changed)


README.md

Lines changed: 6 additions & 6 deletions
@@ -27,7 +27,7 @@
 <img height="400px" src="https://github.com/user-attachments/assets/c78f88f9-bbb7-4bad-91a8-13633ce35d4a" />
 </p>

-**LLM Connector** is a plugin that provides out-of-the-box integrations with large language models (LLMs). The plugin ships with built-in support for 4 default LLM providers which are [**OpenAI**](docs/providers/OpenAI),[**Gemini**](/docs/providers//Gemini), [**WebLlm (in-browser)**](/docs/providers/WebLlm) and [**Wllama (in-browser)**](docs/providers/Wllama). Developers may also create their own providers beyond those that are provided to support niche or custom use cases. The plugin also provides generalized configurations for managing streaming behavior, chat history inclusion and audio output, greatly simplifying the amount of custom logic required from developers.
+**LLM Connector** is a plugin that provides out-of-the-box integrations with large language models (LLMs). The plugin ships with built-in support for 4 default LLM providers which are [**OpenAI**](docs/providers/OpenAI.md), [**Gemini**](/docs/providers/Gemini.md), [**WebLlm (in-browser)**](/docs/providers/WebLlm.md) and [**Wllama (in-browser)**](docs/providers/Wllama.md). Developers may also create their own providers beyond those that are provided to support niche or custom use cases. The plugin also provides generalized configurations for managing streaming behavior, chat history inclusion and audio output, greatly simplifying the amount of custom logic required from developers.

 For support, join the plugin community on [**Discord**](https://discord.gg/J6pA4v3AMW) to connect with other developers and get help.

@@ -57,7 +57,7 @@ The plugin is incredibly straightforward to use and is [**available on npm**](ht
 };
 ```

-4. Define an `llmConnector` attribute within the [**Block**](https://react-chatbotify.com/docs/concepts/conversations#block) that requires LLM integration. Import your desired LLM provider (or create your own!) and pass it as a value to the `provider` within the `llmConnector` attribute. You may refer to the setup below which uses the [**WebLlmProvider**](/docs/providers/WebLlm) for a better idea (details covered later):
+4. Define an `llmConnector` attribute within the [**Block**](https://react-chatbotify.com/docs/concepts/conversations#block) that requires LLM integration. Import your desired LLM provider (or create your own!) and pass it as a value to the `provider` within the `llmConnector` attribute. You may refer to the setup below which uses the [**WebLlmProvider**](/docs/providers/WebLlm.md) for a better idea (details covered later):
 ```javascript
 import ChatBot from "react-chatbotify";
 import LlmConnector, { LlmConnectorBlock, WebLlmProvider } from "@rcb-plugins/llm-connector";
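The diff's context cuts off at the imports above. For orientation, a rough sketch of what the completed step 4 setup might look like follows. It is not a verbatim excerpt of the README: the block names (`start`, `llm`), the `message`/`path` fields, the plugin registration via `plugins={[LlmConnector()]}` and the `model` option passed to `WebLlmProvider` are assumptions inferred from the imports shown, so check the provider configuration guides further down for the actual option names.

```javascript
// Rough sketch (assumed API, not taken from the README): a flow whose
// "llm" block delegates the conversation to an in-browser WebLLM model
// through the llm-connector plugin.
import ChatBot from "react-chatbotify";
import LlmConnector, { WebLlmProvider } from "@rcb-plugins/llm-connector";

const MyChatBot = () => {
  const flow = {
    start: {
      message: "Ask me anything!",
      path: "llm", // route the conversation into the LLM-connected block
    },
    llm: {
      llmConnector: {
        // provider is the only property shown in the diff; the
        // WebLlmProvider option below (model) is an assumption.
        provider: new WebLlmProvider({
          model: "Qwen2-0.5B-Instruct-q4f16_1-MLC",
        }),
      },
    },
  };

  return <ChatBot flow={flow} plugins={[LlmConnector()]} />;
};

export default MyChatBot;
```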
@@ -152,10 +152,10 @@ The `llmConnector` attribute is added to the Block that you are keen to integrat

 As you may have seen from earlier examples, providers are passed into the `provider` property within the `llmConnector` attribute. Providers are essentially an abstraction over the various LLM providers such as OpenAI and Gemini. With that said, configurations for providers can vary greatly depending on the choice of provider. For the default providers, their configuration guides can be found here:

-- [**OpenAIProvider Configurations**](/docs/providers//OpenAI)
-- [**GeminiProvider Configurations**](/docs/providers/Gemini)
-- [**WebLlmProvider Configurations**](/docs/providers/WebLlm)
-- [**WllamaProvider Configurations**](/docs/providers/Wllama)
+- [**OpenAIProvider Configurations**](/docs/providers/OpenAI.md)
+- [**GeminiProvider Configurations**](/docs/providers/Gemini.md)
+- [**WebLlmProvider Configurations**](/docs/providers/WebLlm.md)
+- [**WllamaProvider Configurations**](/docs/providers/Wllama.md)

 > [!TIP]
 > Note that if your choice of provider falls outside the default ones provided but has API specifications aligned to default providers (e.g. OpenAI), you may still use the default providers.
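To illustrate the tip above, a hedged sketch of pointing the default `OpenAIProvider` at an OpenAI-compatible endpoint is shown below. The configuration keys used (`baseUrl`, `model`, `apiKey`) are assumptions for illustration only; the actual option names are documented in the OpenAIProvider configuration guide linked above.

```javascript
// Sketch only: the keys below (baseUrl, model, apiKey) are assumed for
// illustration and may differ from OpenAIProvider's real options; consult
// the OpenAIProvider configuration guide before use.
import { OpenAIProvider } from "@rcb-plugins/llm-connector";

// Per the tip above, any service exposing OpenAI-compatible chat completions
// can in principle be driven through the default OpenAIProvider.
const provider = new OpenAIProvider({
  baseUrl: "http://localhost:11434/v1", // hypothetical OpenAI-compatible server
  model: "llama3",                      // hypothetical model name on that server
  apiKey: "sk-placeholder",             // placeholder; hosted services need a real key
});
```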
