Q: Further questions from experiments with the trial
3) I have configured my bot to collect user data at the beginning of the conversation, but it is not doing so, either at the beginning or at any point during the conversation.
4) I have set up the fallback scenario as: "The agent should tell the user that it doesn't know the answer and that someone from the support team will get back to them on the email they have already given, or ask them for it if they have not given it." -- but when I asked it a question that I knew it couldn't answer, it just replied with "I am sorry, but I am unable to answer this question." - no mention of getting back by email or asking for an email.
5) Even with other BYOK models (Gemini 2.5 Flash in this case), it answers some questions and then says the same thing: that there is an influx of requests and to try again after some time 🙄
Akshay_ChatbotBuilder
Apr 28, 2026

A: Hi there,
Thank you for taking the time to share such detailed feedback.
I want to reassure you that the issues you experienced were already identified on our end and have now been resolved. The root cause was related to instability with the LLM provider we were using at the time, which led to the “high influx of requests” message and inconsistent behavior across both default credits and some BYOK models. This has since been fixed, and the system is now stable.
Regarding the model versions you mentioned, you're absolutely right to point that out. We've already updated and aligned our supported models to reflect the latest stable versions and are working on removing the deprecated ones. The issues around the bot not collecting user data at the start and the fallback flow not triggering as configured were also part of the same underlying problem affecting response handling, and these have been addressed as part of the fix.
I’d really appreciate it if you could give the product another try, specifically by creating a fresh bot and testing the flows again. It should now behave as expected.
If you run into anything or have questions, feel free to reach out to me directly at [email protected], and I’ll personally make sure it’s looked into.
Thanks again for your patience and for pointing these out so clearly.
Regards,
Akshay
Okay, I will experiment with it further and get back to you.
One more thing I wanted to understand: when I am using your AI keys, specifically Gemini 2.5 Flash, I checked the credit balance after every interaction. What I am seeing is that every time the bot generates a reply, the number of credits consumed is uneven: sometimes it is 3-4 credits per reply, and at one point it even went up to 7-8 credits for a single reply.
I need to understand how credits are consumed per reply. With the amount of credits you are giving us, and even with limits on BYOK, if one reply can consume 3 to 10 credits, then the number of messages we can have drops drastically, doesn't it?
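To make the concern concrete, here is a rough back-of-the-envelope sketch. The 1,000-credit allowance is a hypothetical figure for illustration only (the thread does not state the actual allowance); the per-reply costs are the ones observed above.

```python
# Rough estimate of how many bot replies a credit allowance supports,
# given the uneven per-reply costs observed above.
# NOTE: the 1,000-credit budget is a hypothetical assumption, not a
# figure from the product.

def replies_for_budget(total_credits: int, credits_per_reply: int) -> int:
    """Number of complete replies a credit budget can cover."""
    return total_credits // credits_per_reply

budget = 1000  # hypothetical allowance
for cost in (3, 8, 10):  # observed low, observed high, worst case
    print(f"{cost} credits/reply -> {replies_for_budget(budget, cost)} replies")
```

The spread is the point: the same budget yields anywhere from roughly 100 to over 300 replies depending on per-reply cost, which is why an unpredictable 3-10 credit range matters so much for capacity planning.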