Dec 22, 2025

Q: Local models (new to AI)

Can you please explain local models as they start to be available in Tier 3? Also, I saw you explain in another question that LLMs and such are limited no matter what plan you have. In order to get the full capabilities of a model or LLM, do we need to have a direct account with them?

Founder Team
Felipe_TriploAI
Dec 22, 2025

A: Hey there! Welcome! Let’s break this down in a chill way.

How does the Allowance work on Triplo AI?
Allowance models in Triplo AI give you access to a monthly quota of 2 million tokens per device. You can use these tokens across more than 40 different models. This means you can tap into a variety of AI capabilities without worrying about complicated credit systems. Each token costs the same, no matter which model you’re using. It’s all about keeping things simple and straightforward. If you want to see the full list of models available, check out this link: https://go.triplo.ai/allowance
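
To make the "no complicated credit systems" point concrete: because every token costs the same on every model, your usage just sums into one pool. Here's a toy sketch of that accounting (this is an illustration of the idea, not Triplo AI's actual implementation; the model names are made up):

```python
MONTHLY_ALLOWANCE = 2_000_000  # tokens per device, per month

def tokens_left(used_by_model):
    """Flat allowance: every token costs the same regardless of which
    model consumed it, so remaining quota is just a simple subtraction."""
    return MONTHLY_ALLOWANCE - sum(used_by_model.values())

# Usage spread across different models still draws from one shared pool.
usage = {"model-a": 120_000, "model-b": 80_000}
print(tokens_left(usage))  # 1800000
```

Contrast this with per-model credit systems, where each model bills at a different rate and you'd have to track exchange rates between credits and tokens.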

Connecting Triplo AI to LLMs
You have some cool options for connecting Triplo AI to various LLMs:

Bring Your Own Key (all tiers): This allows you to connect your own API keys from providers like OpenAI, Anthropic, or OpenRouter. You can use premium models like Claude or GPT-5. Just follow the instructions here: https://go.triplo.ai/allowance

OpenAI-Compatible Endpoints (all tiers): You can integrate external AI service providers that support the OpenAI API standard. This means you can expand your AI capabilities beyond the built-in models. For more details, check out this page: https://documentation.triplo.ai/using-triplo-ai/connections-openai-compatible-endpoints
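
For context on what "OpenAI-compatible" buys you: any provider that follows the OpenAI API standard accepts the same request shape, so switching providers is mostly a matter of changing the base URL and key. A minimal sketch (the URLs and model names below are illustrative assumptions, not Triplo AI internals):

```python
import json

def chat_request(base_url, api_key, model, prompt):
    """Build the pieces of an OpenAI-style chat completion call.

    Any OpenAI-compatible provider accepts this same shape; only
    base_url, api_key, and model change between providers.
    """
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Same function, two hypothetical providers: only the endpoint differs.
url_a, _, _ = chat_request("https://api.openai.com", "sk-...", "gpt-4o-mini", "Hi")
url_b, _, _ = chat_request("http://localhost:11434", "none", "llama3", "Hi")
print(url_a)  # https://api.openai.com/v1/chat/completions
print(url_b)  # http://localhost:11434/v1/chat/completions
```

That shared shape is why one "OpenAI-compatible" connection type can cover many different services.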

Local LLMs (PRO Feature): If you want to run models directly on your machine, you can do that too and even use them on your Triplo AI Mobile! This feature is available for users on Tier 3 and above. To set it up, you can refer to the Ollama Setup Guide here: https://documentation.triplo.ai/using-triplo-ai/ollama-setup-guide-for-triplo-ai
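
On the "local" part specifically: Ollama runs the model on your own machine and serves it over localhost, so prompts and responses never leave your computer. A hedged sketch of how a client talks to it (port 11434 is Ollama's default; the model name is just an example of something you'd have pulled with Ollama beforehand):

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local port

def local_prompt(model, prompt):
    """Build a request for a locally running Ollama server.

    Everything targets localhost: the prompt and the generated text
    stay on your machine and are never sent to an external LLM vendor.
    """
    return OLLAMA_URL, json.dumps({
        "model": model,    # e.g. a model previously pulled with `ollama pull`
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response instead of a stream
    })

url, body = local_prompt("llama3", "Explain what a local LLM is.")
print(url)  # http://localhost:11434/api/generate
```

The trade-off is that inference speed and model size are limited by your own hardware, which is why local setups usually run smaller models than the cloud providers offer.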

So, whether you want to use the allowance models, connect your own API keys, or run local LLMs, you’ve got plenty of options to enhance your experience with Triplo AI. If you have any more questions or need further help, just let me know!

Take care
Felipe

PS: I answered you by prompting on Triplo AI on my Mobile: "//context// answer the user"

Context is your question and I also have an Instruction (Casual) and a Mind (triplo_support) active. Minor adjustments and ✅.


Thank you Felipe, it does feel machine-induced. Detailed, but not as personal as expected. I think this answer is adequate for more seasoned AI users, but I still don't fully understand "local".
"Local LLMs (PRO Feature): If you want to run models directly on your machine, you can do that too and even use them on your Triplo AI Mobile! This feature is available for users on Tier 3 and above."
This part.

That's on me. AI as it is today will always require supervision, and on mobile, wanting to answer fast, maybe I missed something.

To learn more about the features, it's worth checking the links I shared. The documentation is complete and will probably help you understand it better.

Take care
Felipe

Does this mean no internet connection? Or how does this PRO feature work? Sorry for the elementary questions. We just want to make sure.
Thank you again!

All good. It means that if you're using local LLM models, your prompts and results won't be shared with any LLM vendor (or with Triplo AI); they stay local. Still, you'll need internet to validate your access to Triplo AI (this is how we know whether you have a valid license to use it).

Thank you sir!
