CodeRevolutionPlugins/GPT-3-Encoder-PHP
PHP BPE Text Encoder/Decoder for GPT-2 / GPT-3
GPT-2 and GPT-3 use byte pair encoding (BPE) to turn text into a series of integers to feed into the model. This is a PHP implementation of OpenAI's original Python encoder and decoder, which can be found here. The main source of inspiration for writing this encoder was the Node.js version, found here.
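To illustrate the idea behind byte pair encoding, here is a minimal sketch in Python (the library itself is PHP). The merge table below is a toy example for demonstration only; the real GPT-2/GPT-3 tokenizer ships a vocabulary of roughly 50,000 learned merges.

```python
# Illustrative BPE merging: repeatedly fuse the adjacent symbol pair
# with the best (lowest) merge rank until no known merge applies.
# The merge ranks here are made up, NOT the real GPT-2 vocabulary.

def bpe_merge(symbols, merges):
    symbols = list(symbols)
    while len(symbols) > 1:
        # Rank every adjacent pair; unknown pairs get infinite rank.
        pairs = [(merges.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(symbols, symbols[1:]))]
        rank, i = min(pairs)
        if rank == float("inf"):
            break  # no learned merge applies anymore
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

# Toy merge ranks: lower rank = learned earlier = applied first.
toy_merges = {("l", "o"): 0, ("lo", "w"): 1, ("e", "r"): 2}
print(bpe_merge("lower", toy_merges))  # ['low', 'er']
```

Each resulting subword is then mapped to an integer ID via the vocabulary table, which is the token sequence sent to the model.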
You can test the results by comparing the output generated by this script with the official tokenizer page from OpenAI.
This specific encoder and decoder is used in the Aiomatic WordPress plugin to count the number of tokens a string will use when sent to the OpenAI API. Check out more of my work on my website.
The mbstring PHP extension is needed for this tool to work correctly when non-ASCII characters are present in the tokenized text: details here on how to install mbstring.
```php
$prompt = "Many words map to one token, but some don't: indivisible. Unicode characters like emojis may be split into many tokens containing the underlying bytes: 🤚🏾 Sequences of characters commonly found next to each other may be grouped together: 1234567890";
$token_array = gpt_encode($prompt);
$original_text = gpt_decode($token_array);
```