Sep 7, 2025

Q: Any guidelines for setting up Local Chat?

It's stuck at verifying the installation of Ollama.
(My installation process is finished.)

InnovativeMax

Verified purchaser

Founder Team
Jon_Socrates

Sep 30, 2025

A: Hi there,

I apologize for the delay in getting back to you. I'm currently dealing with something personal. Just to reassure you, Socrates is up and running and will remain so for the foreseeable future. I'll circle back with you as soon as I can.

Thanks for your patience and understanding.

Best,
Jon


Verified purchaser

Posted: Sep 17, 2025

There is no response after I install Ollama...

Verified purchaser

Posted: Sep 23, 2025

You need to install all the required models (the embed, vision, and chat models) to chat with your documents; otherwise Ollama won't work.
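
If the in-app "Pull Model" step keeps hanging, one workaround is to pull the models straight through Ollama's local API. This is only a sketch, assuming Ollama is running on its default port; nomic-embed-text is the only model name confirmed in this thread, and the chat/vision names below are placeholders you'd swap for whatever Socrates actually expects:

```python
# Minimal sketch: pull models through the local Ollama API instead of the app.
# Assumes Ollama is running on the default port 11434.
# Only "nomic-embed-text" is confirmed in this thread; the other two names
# are placeholders -- replace them with the models Socrates requires.
import requests

OLLAMA = "http://localhost:11434"

def pull(model: str) -> None:
    """Ask the local Ollama server to download a model (non-streaming)."""
    r = requests.post(
        f"{OLLAMA}/api/pull",
        json={"name": model, "stream": False},
        timeout=None,  # pulls can take a long time
    )
    r.raise_for_status()
    print(model, "->", r.json().get("status"))

for m in ["nomic-embed-text", "llama3.2", "llava"]:  # placeholder chat/vision models
    pull(m)
```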

Verified purchaser

Posted: Sep 23, 2025

There are no guidelines, since this is his application and his promised YouTube tutorials have never arrived. What I think you can do is go to the Ollama website and install it yourself. From there, try to install Ollama from Socrates, and I think it should work. If you can, check whether there are any folders named Ollama in your Program Files; if there are, Ollama should be installed.
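
To make that check concrete, here's a rough sketch in Python. The install paths are the usual Windows defaults and may differ on your machine; it also checks whether the Ollama server answers on its default port:

```python
# Minimal sketch: check whether Ollama is installed and actually running.
# Folder paths are typical Windows install locations (assumptions, may differ).
import os
import requests

def ollama_running() -> bool:
    """True if the local Ollama server answers on its default port."""
    try:
        r = requests.get("http://localhost:11434/api/tags", timeout=3)
        return r.ok
    except requests.RequestException:
        return False

def ollama_folder_present() -> bool:
    """Look for an Ollama folder in the usual Windows locations."""
    candidates = [
        os.path.expandvars(r"%LOCALAPPDATA%\Programs\Ollama"),
        os.path.expandvars(r"%ProgramFiles%\Ollama"),
    ]
    return any(os.path.isdir(p) for p in candidates)

print("Ollama server reachable:", ollama_running())
print("Ollama folder found:", ollama_folder_present())
```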

Verified purchaser

Posted: Sep 24, 2025

Thanks.
Last week, after the latest update of Socrates, it was stuck at the last step, step 5: "Pull Model". Now it can be verified at this last step. I'm not sure whether I can add an embedding model for the local chat. It's processing...

Verified purchaser

Posted: Sep 24, 2025

Final response: "Failed to download model"

Verified purchaser

Posted: Sep 24, 2025

It looks like it is not possible to add another model at this moment; only "nomic-embed-text" works.
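
For anyone else hitting "Failed to download model": a quick way to confirm which models actually made it onto your machine is to ask the local Ollama API directly (small sketch, assuming the default port):

```python
# Minimal sketch: list the models the local Ollama server currently has,
# to confirm whether anything beyond nomic-embed-text was actually pulled.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])
```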
