joseph231
Feb 4, 2026

Q: Support for Local LLMs

Will there be support now or in the future for local LLMs?

Founder Team
Victoria_Snoooz

Feb 9, 2026

A: Hi Joseph,

At the moment, Snoooz does not support local LLMs, and we don’t have a committed roadmap or timeline for adding it.

Because Snoooz is a cloud-based application, supporting models that run locally (for example on a personal machine or on-prem hardware) is non-trivial. It would require additional infrastructure such as secure connectors, tunneling, or self-hosted deployments to work reliably and safely. That’s a much more complex setup than integrating with cloud AI providers.
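To illustrate the gap: local model servers (Ollama, for example) typically listen on a loopback or private address, which a cloud service cannot reach directly without a secure connector or tunnel. A minimal Python sketch of that check (the endpoint URL here is hypothetical, using Ollama's default port):

```python
import ipaddress
import socket
from urllib.parse import urlparse

# Hypothetical endpoint of a locally served model (Ollama's default port).
LOCAL_LLM_URL = "http://localhost:11434/v1/chat/completions"

def reachable_from_cloud(url: str) -> bool:
    """Return False when the endpoint resolves to a loopback or private
    address, i.e. a cloud service cannot reach it without a tunnel."""
    host = urlparse(url).hostname
    ip = ipaddress.ip_address(socket.gethostbyname(host))
    return not (ip.is_loopback or ip.is_private)

print(reachable_from_cloud(LOCAL_LLM_URL))  # False: a connector/tunnel would be required
```

This is only a sketch of the connectivity problem, not anything Snoooz ships; a real integration would also need authentication, encryption, and reliability guarantees on top of the tunnel.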

For now, Snoooz focuses on supporting managed cloud models (like GPT, Claude, Gemini, etc.), where we can ensure stability, security, and a good user experience.

Hope this helps.

Victoria
