Founder Team
Felipe_TriploAI
May 15, 2024

A: Hi there!
Tokens are the basic units of text or code that an LLM AI uses to process and generate language. Tokens can be characters, words, subwords, or other segments of text or code, depending on the chosen tokenization method or scheme.
For text in English, 1 token is approximately 4 characters or 0.75 words.
That said, 300,000 tokens are approximately equivalent to 225,000 words used or generated in prompts.
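That rule of thumb can be sketched in a few lines of Python. These are rough heuristics only; the exact count always depends on the tokenizer the model uses, and the helper names here are just illustrative:

```python
# Rough token estimates using the common rule of thumb for English text:
# 1 token ≈ 4 characters ≈ 0.75 words. Real counts vary by tokenizer.

def estimate_tokens_from_chars(char_count: int) -> int:
    """Approximate token count from a character count (1 token ≈ 4 chars)."""
    return round(char_count / 4)

def estimate_words_from_tokens(token_count: int) -> int:
    """Approximate word count from a token count (1 token ≈ 0.75 words)."""
    return round(token_count * 0.75)

print(estimate_tokens_from_chars(1200))      # ≈ 300 tokens
print(estimate_words_from_tokens(300_000))   # ≈ 225,000 words
```

For exact numbers you would run your text through the model's actual tokenizer rather than these estimates.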
Take care