Q: Hi, when follow-up questions are sent to a GPT, it seems the new tokens used are added to the tokens used previously.

So a short follow-up question in a chat costs the tokens from the previous messages plus the tokens for the short follow-up question itself. Is that not a bug?

JDD · Jun 28, 2024

Arturo_Straico (Founder Team) · Jun 28, 2024

A: Hi JDD,

Thank you for your question. By default, in a conversation, every new prompt takes into account the previous context, which does result in increased coin consumption if you keep the same conversation going.
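A minimal sketch of why this happens (using a hypothetical word-count stand-in for a real tokenizer): because the full prior context is resent with every turn, even a short follow-up pays for all earlier messages.

```python
def estimate_tokens(text: str) -> int:
    # Rough proxy for a tokenizer: real token counts differ,
    # but the growth pattern is the same.
    return len(text.split())

history: list[str] = []

def send(prompt: str) -> int:
    """Return the tokens billed for this turn: prior context + new prompt."""
    history.append(prompt)
    return sum(estimate_tokens(m) for m in history)

first = send("Explain how context windows work in large language models")
followup = send("Thanks, and why?")
print(first, followup)  # → 9 12
```

The second call bills 12 "tokens" even though the follow-up itself is only 3, because the 9 from the first prompt are carried along as context.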

You can go to the settings section and adjust the word cap limit to set a limit for the previous context, which can help manage your token usage more effectively.
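A word cap of this kind can be sketched as follows (this is an illustrative assumption about how such a cap might work, not Straico's actual implementation): only the most recent messages whose combined word count fits under the cap are resent, which bounds the cost of each follow-up.

```python
def capped_context(history: list[str], word_cap: int) -> list[str]:
    """Keep only the newest messages that fit within word_cap words."""
    kept, used = [], 0
    for msg in reversed(history):        # walk from newest to oldest
        words = len(msg.split())
        if used + words > word_cap:
            break                        # cap reached; drop older messages
        kept.append(msg)
        used += words
    return list(reversed(kept))          # restore chronological order

history = ["one two three four", "five six", "seven eight nine"]
print(capped_context(history, 5))  # → ['five six', 'seven eight nine']
```

With a cap of 5 words, the oldest message is dropped and only 5 words of context are resent instead of 9.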

If you need further assistance, please feel free to write to us at hello@straico.com.

Best regards,
Arturo
