MarianZ (PLUS)
Mar 23, 2026

Q: Will it work with a local AI, or is this feature planned? Curious to know!

Yanghuang_remio (Founder Team)
Mar 24, 2026

A: Yes, it already works. You can use Ollama or any local provider by selecting "Custom Provider" in the settings.
We default to cloud models for higher reasoning quality, but the option for a fully local, private setup is available and ready for you to configure.

We already support this feature; you can try it now!
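
If you want to sanity-check a local Ollama server before pointing the Custom Provider at it, a quick call like the one below is enough. This is only a sketch: it assumes Ollama's default port (11434) and a model you have already pulled; the exact fields the app asks for (base URL, model name, API key) may be labeled differently in its settings.

    # Minimal sketch: confirm a local Ollama server responds before selecting
    # it as the "Custom Provider". Assumes Ollama's default port (11434) and a
    # model already pulled locally, e.g. `ollama pull llama3`.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": "Reply with one short sentence.", "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["response"])  # the local model's reply text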