Q: Can I run this on a server as an app?
I have a server where I host websites, and I know it supports Node.js applications. I've never used Node.js and don't know how complex it is, but my question is: if I buy Tier 3, is there some way of hosting a local LLM provider on my server? My computer isn't powerful enough for this, and I don't really know much about any of this, but I have a RackNerd server with lots of space on it, and the panel I use offers Node.js, n8n, WordPress, and probably more. I watched Dave Swift's video about how he put Ollama on his computer, and I wonder if my server can do the same. Thanks, and sorry for the noob question!
Felipe_TriploAI
Nov 27, 2025
A: Hey!
Sure, you can access local LLMs running on a server. That said, an LLM server is a different thing from a web-hosting server...
You'd be better off checking with RackNerd about their services.
Take care
Felipe
It looks like they can. I just saw their docs, and they have one-click Ollama and OpenUI. How large a server would be the minimum size?
I have no clue, Lonely. This varies according to the models you want to run, etc.
Either you deploy Ollama on your own server (e.g. using the Docker image), or you pay a hosted service like this one (https://zeabur.com/templates/14MGC6), which allows one-click deployment starting from $5/month. Once it's running, in practice you just enter the base URL in Triplo's local LLM settings. That being said, better to wait for Felipe's confirmation.
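For anyone going the self-hosted route, here's a minimal sketch of the Docker approach. This assumes Docker is already installed on the server; the model name (llama3.2) is just an example, and Ollama's default port is 11434:

```shell
# Start Ollama in the background with a persistent volume for models
docker run -d \
  --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull a small model inside the running container
docker exec ollama ollama pull llama3.2

# The base URL to enter in the local LLM settings would then be:
#   http://<your-server-ip>:11434
```

You'd also want to put a firewall rule or reverse proxy in front of port 11434, since by default the API has no authentication.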
Yes, thanks. It seems they use Docker/Nginx with Ollama and OpenUI, I believe. I just have to figure out the minimum specs for this.
Confirming. You'll be able to use an LLM server (Ollama/LM Studio) running remotely. How big should the server be? I'm sorry, but I don't know.
Just google or perplex "minimum server specs ollama deployment RAM CPU"
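As a rough starting point for sizing, a common rule of thumb is that a 4-bit quantized model needs about half a byte of RAM per parameter, plus some overhead for context and the runtime. The numbers below are ballpark assumptions only, not official Ollama requirements:

```python
# Rough rule-of-thumb RAM estimate for running a quantized model.
# Assumptions (not official specs): 4-bit quantization ~= 0.5 bytes
# per parameter, plus ~20% overhead for context and runtime.

def estimate_ram_gb(params_billions: float,
                    bytes_per_param: float = 0.5,
                    overhead: float = 1.2) -> float:
    """Return an approximate RAM requirement in GB."""
    return params_billions * bytes_per_param * overhead

# e.g. an 8B-parameter model at 4-bit quantization:
print(round(estimate_ram_gb(8), 1))  # -> 4.8
```

So an 8B model would want roughly 5 GB of free RAM on the server (CPU-only inference works but is slow); smaller 3B-class models fit in around 2 GB.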