Hey, considering its superiority over SPE tokenizers, could you provide some sample/example code for training a tiktoken tokenizer from scratch on a custom dataset?
Also, like BPE/SPE training, does it support `min_frequency` and `min_length` constraints for tokens during training? A rough sketch of what I have in mind is below.
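To make the ask concrete: everything in this sketch except the `tiktoken.Encoding(...)` call is hand-rolled and assumed on my side. The `train_bpe` helper, the `min_frequency` cutoff, the corpus path `my_corpus.txt`, and the split pattern are my own guesses at what such a training step could look like, not tiktoken API.

```python
# Minimal sketch: learn byte-level BPE merges on a custom corpus, then load the
# resulting mergeable_ranks into a tiktoken.Encoding. Only tiktoken.Encoding(...)
# is the real library interface; the trainer below is a naive hand-rolled loop.
import collections

import regex  # tiktoken-style patterns use \p{...}, which the stdlib `re` lacks
import tiktoken


def train_bpe(text: str, vocab_size: int, pat_str: str,
              min_frequency: int = 2) -> dict[bytes, int]:
    """Learn BPE merges over UTF-8 bytes; returns token bytes -> rank."""
    # Start from the 256 single-byte tokens (ranks 0..255).
    ranks: dict[bytes, int] = {bytes([i]): i for i in range(256)}
    # Pre-tokenize with the regex, then split each chunk into single bytes.
    words = [[bytes([b]) for b in chunk.encode("utf-8")]
             for chunk in regex.findall(pat_str, text)]

    while len(ranks) < vocab_size:
        # Count adjacent token pairs across the corpus.
        counts = collections.Counter()
        for word in words:
            for pair in zip(word[:-1], word[1:]):
                counts[pair] += 1
        if not counts:
            break
        pair, freq = counts.most_common(1)[0]
        # Rough analogue of min_frequency: stop merging once pairs get too rare.
        if freq < min_frequency:
            break
        merged = pair[0] + pair[1]
        ranks[merged] = len(ranks)
        # Apply the new merge to every word.
        new_words = []
        for word in words:
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                    out.append(merged)
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_words.append(out)
        words = new_words
    return ranks


# GPT-2-style pre-tokenization pattern (the exact regex tiktoken ships may differ).
PAT_STR = r"""'(?:[sdmt]|ll|ve|re)| ?\p{L}+| ?\p{N}+| ?[^\s\p{L}\p{N}]+|\s+(?!\S)|\s+"""

with open("my_corpus.txt", encoding="utf-8") as f:  # hypothetical dataset path
    corpus = f.read()

ranks = train_bpe(corpus, vocab_size=1024, pat_str=PAT_STR)

enc = tiktoken.Encoding(
    name="my_custom_bpe",
    pat_str=PAT_STR,
    mergeable_ranks=ranks,
    special_tokens={"<|endoftext|>": len(ranks)},
)
print(enc.encode("hello world"))
```

The `min_frequency` cutoff above is just one guess at what that option could mean; as far as I can tell, tiktoken's documented API only covers loading pre-trained `mergeable_ranks`, not training them, so an official example would be great.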