Q: Any guidelines for setting up Local Chat?
It's stuck on verifying the installation of Ollama.
(My installation process is finished)
Verified purchaser
Jon_Socrates
Sep 30, 2025
A: Hi there,
I apologize for the delay in getting back to you. I’m currently dealing with something personal. Just to reassure you, Socrates is up and running and will remain so for the foreseeable future. I’ll circle back with you as soon as I can.
Thanks for your patience and understanding.
Best,
Jon
Verified purchaser
There is no response after I install Ollama...
Verified purchaser
You need to install all the required models (the embed, vision, and chat models) to chat with your documents; otherwise Ollama won't work.
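If you want to check what Ollama actually has on disk before retrying in Socrates, here is a minimal sketch (it assumes a default Ollama install listening on localhost:11434; the only model name confirmed in this thread is nomic-embed-text, and which chat/vision models Socrates expects is a guess) that lists installed models through Ollama's /api/tags endpoint:

import json
import urllib.request

# Default local Ollama API address (assumption: a standard install).
OLLAMA_URL = "http://localhost:11434"

def list_installed_models():
    """Return the names of models Ollama has already downloaded."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

if __name__ == "__main__":
    models = list_installed_models()
    print("Installed models:", ", ".join(models) if models else "none")
    # nomic-embed-text is the embedding model mentioned later in this thread;
    # any chat/vision model names depend on your Socrates setup.
    if not any(m.startswith("nomic-embed-text") for m in models):
        print("Embedding model missing - try: ollama pull nomic-embed-text")

If the list comes back empty, Socrates' own model step never completed and you can pull the missing models yourself, as discussed further down.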
Verified purchaser
There are no guidelines, since this is his application and his promised YouTube tutorials have never arrived. What I think can be done is basically to go to Ollama and install it yourself. From there, try to install Ollama from Socrates, and I think it should work. If you can, check whether there are any folders named Ollama in your Program Files; if there are, Ollama should be installed.
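To go one step further than looking for an Ollama folder, you can check whether the Ollama server is actually running and answering. A minimal sketch, assuming the default address (localhost:11434), whose root path normally replies "Ollama is running":

import urllib.error
import urllib.request

# Assumption: Ollama listens on its default port after installation.
OLLAMA_URL = "http://localhost:11434"

def ollama_is_running() -> bool:
    """True if the local Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(OLLAMA_URL, timeout=3) as resp:
            # The root path normally responds with "Ollama is running".
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama running:", ollama_is_running())

A folder in Program Files only proves the files were copied; a reply from the server proves the install is actually usable.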
Verified purchaser
Thanks
Last week, after the latest update of Socrates, it was stuck at the last step, step 5: "Pull Model". Now it passes verification at that last step. I'm not sure whether I can add an embedding model for the local chat. It's processing...
Verified purchaser
Final response: "Failed to download model"
Verified purchaser
It looks like it is not possible to add another model at this moment; only "nomic-embed-text" works.
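As a workaround when Socrates' "Pull Model" step reports "Failed to download model", you can pull the model outside Socrates with the regular ollama pull command and let the app pick it up afterwards. A small sketch (it assumes the ollama command-line tool is on your PATH; nomic-embed-text is the only model confirmed working in this thread, so any other names you add are guesses):

import subprocess

# Only nomic-embed-text is confirmed to work in this thread; add other
# model names once you know which chat/vision models Socrates expects.
MODELS = ["nomic-embed-text"]

for model in MODELS:
    print(f"Pulling {model} ...")
    # 'ollama pull' downloads the model into Ollama's local store,
    # independently of Socrates.
    result = subprocess.run(["ollama", "pull", model])
    if result.returncode != 0:
        print(f"Pull failed for {model}; check your network/proxy and retry.")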