M.Kh
Mar 24, 2026
Q: Local AI model usage (Qwen3-coder-next)
Can I use my Qwen3-coder-next AI model, installed locally on my PC via Ollama, in Robomotion's AI Assistant for creating workflows? I tried OpenRouter, but it costs me more tokens.
Founder Team
Faik_Robomotion
Mar 24, 2026
A: Hi,
This is not currently supported, but it’s a great idea.
We do have Ollama integration within flows, but not yet in the AI Assistant.
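For context, the existing in-flow integration talks to Ollama's local REST API, which listens on port 11434 by default. Below is a minimal, hedged sketch of what a call to a locally installed model looks like; the model name `qwen3-coder-next` is taken from the question above and should match whatever `ollama list` shows on your machine. This illustrates plain Ollama API usage, not Robomotion's internal implementation.

```python
import json
import urllib.request

# Default address of a local Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server with the model pulled):
# ask_ollama("qwen3-coder-next", "Write a hello-world flow step.")
```

Because everything runs on localhost, there are no per-token API charges, which is the cost advantage the question is after.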
Since this is a relatively new feature, we can certainly consider adding support for it as well. We are also getting requests for Straico support, so we can add both.
Both are noted and added to our to-do list.
Best,