Founder Team
Yanghuang_remio
Mar 24, 2026

A: Yes, it already works. You can use Ollama or any local provider by selecting "Custom Provider" in the settings.
We default to cloud models for higher reasoning quality, but the option for a fully local, private setup is available and ready for you to configure.
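For anyone wiring this up: Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, so a custom provider entry generally just needs that base URL plus a model name. The field names below are hypothetical (the app's actual settings schema isn't documented here); treat this as a sketch of the shape, not the exact config:

```
{
  "provider": "custom",                     // hypothetical field names; check the settings UI
  "baseUrl": "http://localhost:11434/v1",   // Ollama's OpenAI-compatible endpoint
  "apiKey": "ollama",                       // Ollama ignores the key, but some clients require one
  "model": "llama3.1"                       // any model you've pulled locally with `ollama pull`
}
```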
ahmedfarrag17
Curious to know!
Founder
We already support this feature; you can try it now!