CodeRevolutionPlugins/GPT-3-Encoder-PHP

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

17 Commits
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

PHP BPE Text Encoder/Decoder for GPT-2 / GPT-3

About

GPT-2 and GPT-3 use byte pair encoding (BPE) to turn text into a sequence of integers that is fed into the model. This is a PHP implementation of OpenAI's original Python encoder and decoder, which can be found here. The main source of inspiration for writing this encoder was the Node.js version of the encoder, also found here.
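The merge loop at the heart of byte pair encoding can be illustrated with a short, self-contained sketch. This toy is not this repository's implementation (the real encoder also performs byte-level mapping, regex pre-tokenization, and caching); it only shows the greedy step in which the lowest-ranked adjacent pair of symbols is merged first, repeatedly, until no ranked pair remains:

```php
<?php
// Toy BPE merge loop: repeatedly merge the best-ranked adjacent pair.
// $merges maps a pair string "a b" to its rank (lower rank = merged earlier).
function bpe_merge(array $symbols, array $merges): array {
    while (true) {
        $bestPairIdx = -1;
        $bestRank = PHP_INT_MAX;
        // Find the adjacent pair with the lowest (best) merge rank.
        for ($i = 0; $i < count($symbols) - 1; $i++) {
            $pair = $symbols[$i] . ' ' . $symbols[$i + 1];
            if (isset($merges[$pair]) && $merges[$pair] < $bestRank) {
                $bestRank = $merges[$pair];
                $bestPairIdx = $i;
            }
        }
        if ($bestPairIdx === -1) {
            break; // no applicable merge left
        }
        // Replace the winning pair with its concatenation.
        $merged = $symbols[$bestPairIdx] . $symbols[$bestPairIdx + 1];
        array_splice($symbols, $bestPairIdx, 2, [$merged]);
    }
    return $symbols;
}

// With merges (l,o) then (lo,w), "l o w e r" collapses to "low e r":
$out = bpe_merge(['l', 'o', 'w', 'e', 'r'], ['l o' => 0, 'lo w' => 1]);
echo implode(' ', $out), "\n"; // prints "low e r"
```

In the real tokenizer, each final symbol (such as "low") is then looked up in a fixed vocabulary to produce the integer token IDs the model consumes.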

You can test the results by comparing the output generated by this script with the official tokenizer page from OpenAI.

This specific encoder and decoder is used in the Aiomatic WordPress plugin to count the number of tokens a string will use when sent to the OpenAI API. Check out more of my work on my website.

Usage

The mbstring PHP extension is required for this tool to work correctly when non-ASCII characters are present in the tokenized text: details here on how to install mbstring.
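Before encoding, you can verify at runtime that the extension is actually loaded; a minimal guard in plain PHP:

```php
<?php
// Abort early if mbstring is missing: the encoder relies on multi-byte
// string functions to handle non-ASCII characters in the prompt.
if (!extension_loaded('mbstring')) {
    fwrite(STDERR, "The mbstring extension is required.\n");
    exit(1);
}
```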

```php
$prompt = "Many words map to one token, but some don't: indivisible. Unicode characters like emojis may be split into many tokens containing the underlying bytes: 🤚🏾 Sequences of characters commonly found next to each other may be grouped together: 1234567890";
$token_array = gpt_encode($prompt);
$original_text = gpt_decode($token_array);
```
