Karina_AIssistify (Founder Team)
May 15, 2024

A: Hello, we understand that the token system may initially seem less straightforward than a per-word system.
We use tokens because our system is built on OpenAI's GPT models, which process text in chunks called tokens. When you ask the AI to generate text, it consumes a certain number of tokens.
On average, 1000 tokens are approximately equivalent to 750 words.
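If you'd like to estimate token counts for your own text, OpenAI publishes an open-source tokenizer, tiktoken, that uses the same encodings as its GPT models. Below is a minimal sketch; the model name "gpt-4" and the sample text are illustrative assumptions, since the exact model behind your AIssistify plan may differ.

```python
import tiktoken

# "gpt-4" is an illustrative choice; pick the encoding
# matching whichever GPT model you actually use.
enc = tiktoken.encoding_for_model("gpt-4")

text = "AIssistify counts usage in tokens rather than words."
tokens = enc.encode(text)  # returns a list of integer token IDs

print(f"words:  {len(text.split())}")
print(f"tokens: {len(tokens)}")  # English text averages ~1.33 tokens per word
```

Running this on a few paragraphs of your own text will show the roughly 0.75-words-per-token ratio mentioned above.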
To help you keep track of token usage, we display the number of remaining tokens in the left pane of the AIssistify account interface.