Q: LiteLLM Proxy/Router and OpenAI API Compatible BYOK
It looks like Triplo only supports direct connections to Ollama, and the OpenAI API support is hard-coded to use the paid OpenAI servers.
I use LiteLLM as a high-performance LLM proxy and router. It is fully OpenAI API compatible and sits in front of all my local and cloud-based LLMs, brokering secure, authenticated connections and dynamically routing between them.
I was under the impression that I could bring my own OpenAI-compatible API Keys and use a custom URL for OpenAI. Is this not the case?
It would also be ideal if the Local LLM connection could:
1. Be made directly from mobile, instead of being shared
2. Use a bearer token or another authentication mechanism
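For context, what the question is asking for is standard for OpenAI-compatible clients: a configurable base URL plus a bearer token in the `Authorization` header. A minimal sketch of such a request, assuming a LiteLLM proxy on its default `http://localhost:4000/v1` endpoint and a hypothetical virtual key (both are illustrative values, not anything Triplo currently exposes):

```python
import json
import urllib.request

# Assumed values for illustration: a local LiteLLM proxy and a virtual key.
BASE_URL = "http://localhost:4000/v1"
API_KEY = "sk-my-litellm-virtual-key"

payload = {
    # LiteLLM routes this model alias to whichever backend is configured for it.
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Any OpenAI-compatible client only needs two knobs: a custom base URL
# and a bearer token in the Authorization header.
request = urllib.request.Request(
    url=f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

# The request is only constructed here, not sent.
print(request.full_url)
print(request.get_header("Authorization"))
```

If Triplo exposed those two settings, the same client code would talk to OpenAI, LiteLLM, or any other compatible gateway unchanged.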
Felipe_TriploAI
Nov 27, 2025
A: Morning Ryan!
We're about to release a feature that will allow users to connect OpenAI-compatible services to Triplo AI. It's not live yet, so it's expected not to work.
Take care
Felipe
See here 👉 https://appsumo.com/products/triplo-ai/questions/please-add-straico-integration-next-to-o-1448396/
Generic OpenAI-compatible API key integration is in progress and should ship in one of the next updates.