gypsyjazz52
Jun 17, 2023

Q: Hello, while using Voilà I came across some points that I would very much like to see fixed in the future:

1. In the original ChatGPT, you can keep a conversation going in a single window indefinitely: there can be an unlimited number of messages. I am not talking about the amount of context ChatGPT can hold, but about the fact that you can send it at least 1,000 messages and the dialogue will not be interrupted, even though it will only remember the last N messages that fit in its memory. In Voilà, I constantly run into the message "Your request is too long. Please try to use less words." (even when using version 4).

I would like this problem to be solved.

2. In the original ChatGPT, you can open any of your dialogues and continue talking to the bot from any message in the selected dialogue.
In Voilà, by contrast:
1) It is impossible to continue a dialogue: once you exit it, it automatically moves into the history and is no longer available to continue.
2) It is impossible to get multiple responses to the same request without deleting the previous one (so that, just as in the original ChatGPT, you could fork the conversation and continue from any point).

P.S. As far as I know, OpenAI has now quadrupled the context size and reduced the cost of generation.
Will this increase the word allowance in the plan (it was 300,000; will it become 350,000-400,000?), and will the larger context be available in Voilà? On the one hand, you work through the API, so any changes they make should apply to you automatically... but on the other hand, you may have settings of your own that limit the volume. As I wrote above, in Voilà I constantly encounter the message "Your request is too long. Please try to use less words." Could that limit be the reason this error appears?

Thanks in advance for your answers, and thank you for a great product.

Founder Team
Michal_Voilá
May 15, 2024

A: Hello there,

Thank you for leaving your feedback. I will keep it in mind when prioritizing the next features for Voilà.

Some of these capabilities are already listed among our feature requests at https://voila.canny.io.

Regarding your question about the recently announced changes from OpenAI: we have already integrated the 16k GPT-3.5 model, and we use it to process longer conversations.

The announced price change does not apply to GPT-4, so we cannot change the existing fair-use policy (FUP) for GPT-4 at the moment.

Thank you for your understanding.

Best,
Michal
