
Commit 6e165c3

Update README.md
1 parent 0224957 commit 6e165c3

File tree: 1 file changed (+3, −1 lines)


README.md

Lines changed: 3 additions & 1 deletion
@@ -2,7 +2,7 @@
 PHP BPE Text Encoder for GPT-2 / GPT-3
 
 ## About
-GPT-2 and GPT-3 use byte pair encoding to turn text into a series of integers to feed into the model. This is a javascript implementation of OpenAI's original python encoder/decoder which can be found [here](https://github.com/openai/gpt-2)
+GPT-2 and GPT-3 use byte pair encoding to turn text into a series of integers to feed into the model. This is a PHP implementation of OpenAI's original Python encoder, which can be found [here](https://github.com/openai/gpt-2). The main source of inspiration for writing this encoder was the NodeJS version, found [here](https://github.com/latitudegames/GPT-3-Encoder).
 
 This specific encoder is used in one of my [WordPress plugins](https://coderevolution.ro) to count the number of tokens a string will use when sent to the OpenAI API.
 
@@ -21,4 +21,6 @@ $token_array = gpt_encode($prompt);
 ```
 
 
+## TODO
 
+Also create a decoder for the package; currently only an encoder is implemented.
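The byte pair encoding mentioned in the README's About section can be illustrated with a single toy merge step: find the most frequent adjacent pair of symbols and fuse it into one symbol. This is a minimal sketch of the idea only, not the package's actual algorithm; the helper names `most_frequent_pair` and `merge_pair` are hypothetical and not part of the encoder.

```php
<?php
// Toy illustration of one BPE merge step (hypothetical helpers,
// not the package's real implementation).

// Count every adjacent pair and return the most frequent one.
function most_frequent_pair(array $symbols): array {
    $counts = [];
    for ($i = 0; $i < count($symbols) - 1; $i++) {
        $pair = $symbols[$i] . ' ' . $symbols[$i + 1];
        $counts[$pair] = ($counts[$pair] ?? 0) + 1;
    }
    arsort($counts); // sort by frequency, descending
    return explode(' ', array_key_first($counts));
}

// Replace every occurrence of the chosen pair with a fused symbol.
function merge_pair(array $symbols, array $pair): array {
    $out = [];
    $i = 0;
    while ($i < count($symbols)) {
        if ($i < count($symbols) - 1
            && $symbols[$i] === $pair[0]
            && $symbols[$i + 1] === $pair[1]) {
            $out[] = $pair[0] . $pair[1];
            $i += 2;
        } else {
            $out[] = $symbols[$i];
            $i += 1;
        }
    }
    return $out;
}

$symbols = str_split('abababcab');      // start from single characters
$pair = most_frequent_pair($symbols);   // 'a' followed by 'b' occurs most often
$symbols = merge_pair($symbols, $pair); // ['ab', 'ab', 'ab', 'c', 'ab']
echo implode(' ', $symbols) . PHP_EOL;
```

A real BPE tokenizer repeats this merge step using a fixed, pre-trained merge table, then maps each resulting symbol to an integer ID, which is why counting the IDs gives the token count used for API billing.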
