Hmmm?! OK... but...
My experience.
It is immediately better than Afforia, but slightly worse than LMNotebook.
Local use:
I use GPT4ALL and generate answers quickly and easily with various local models (LLAMA etc.).
With Socrates, the speed collapses: I receive output in slow motion, which is unusable for me and also inexplicable given the same hardware and prompt. But it's still a beta ;-)
Online use:
Very fast, and if you don't want to use Google, Afforia or other providers, a very good solution for querying documents.
Pros:
Everything works and I think the desktop app is great, keep it up!
Con:
On my laptop, I trained several documents into a chat for "online" use via the app. I also used the Deep function on these documents.
Now I wanted to continue working on my desktop computer, but unfortunately, despite using an identical account (Tier 2), I have no access to the chats that were already created, and therefore none to the trained data either.
If I want to continue my work on the desktop, I would not only have to set everything up again but also pay for deep training again. Since I had already spent 750 documents on my laptop, I would have to spend the same amount again just to work with the same data.
Hmmm? Not good.
Jon_Socrates
Nov 8, 2024
Hello, thanks for the feedback.
1. Syncing Desktop & Web chats is the first thing on our roadmap: https://roadmap.asksocrates.app/ -- should be shipped within 1-2 weeks.
We originally didn't sync the two to maximize privacy (docs in the desktop app almost never touch our servers), but clearly users want this feature, so it will be done!
2. Re local use: Yes, we've identified the issue — the libraries do not currently utilize the GPU (if you have one). We use a different backend than most apps because of our Deep Dive tech, but we're working on a fix.