Q: More LLM choices
Hi there and thanks for bringing this Gem to AS!
Offering more variety in the available LLMs would give users greater flexibility and control over how summaries are generated. You currently support GPT-4o and Claude, which are solid choices, but expanding the range to other powerful yet cost-effective models, such as DeepSeek V3, Qwen Turbo, Mistral, Gemini, or Meta's Llama, would provide a richer experience and allow for more tailored results depending on user preferences.
I have upgraded to T3 hoping to hear a positive outcome on this, or to see some sort of OpenRouter integration, in the near future. That would democratize our summaries by letting each of us use the LLM we are most comfortable with (you could still recommend GPT-4o or Claude as the default selection); a rough sketch of what such an integration could look like is below.
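In case it helps, here is a minimal sketch of how model switching via OpenRouter might look, assuming its OpenAI-compatible endpoint and the standard OpenAI Python client; the model IDs, prompt, and helper name are just my placeholders, not your actual implementation:

```python
import os
from openai import OpenAI  # OpenRouter exposes an OpenAI-compatible API

# Point the standard OpenAI client at OpenRouter's endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

def summarize(note_text: str, model: str = "deepseek/deepseek-chat") -> str:
    """Summarize a note with whichever model the user selected.

    The model ID and the prompt are illustrative placeholders only;
    any model listed on OpenRouter could be passed in here.
    """
    completion = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "Summarize the user's note in three bullet points."},
            {"role": "user", "content": note_text},
        ],
    )
    return completion.choices[0].message.content

print(summarize("Meeting notes: ship the T3 plan update, follow up on LLM feedback."))
```

Switching models would then just be a matter of changing the `model` string the user picked, with GPT-4o or Claude kept as the default.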
Thanks again!
Best,

Tony_MymemAI
Jul 4, 2025

A: Hi RudyCa,
Thank you so much for the thoughtful suggestion and for upgrading to T3! 💎
We really appreciate your feedback — expanding model choices is definitely something we’re actively exploring. Supporting more LLMs like DeepSeek, Qwen, Mistral, Gemini, or LLaMA would allow us to better match user preferences and use cases, and we totally agree this could enrich the experience for everyone.
Thanks again for being part of our journey — and stay tuned! 🧠✨
Warmly,
The MyMemo Team