axel02
Jul 1, 2024

Q: Tier 3 What is the "Custom AI" Feature?

Founder Team
Greg_Rog

Jul 2, 2024

A: Thanks sooo much for asking! 😊

Custom AI is a brand-new feature that we deployed to Alice just a few days ago. In short, it lets you connect your own back-end and use local models (for example, via Ollama).

This is a big one for us. We're developing Alice to be the best front-end for any LLM use. Alice should be able to work with your company's data and connect to your custom AI and back-end. This means your entire knowledge base will be at your fingertips. Plus, you can limit access to data based on different company roles. You'll also be able to use offline, safe, and private models to operate on these knowledge bases.

By default, Alice communicates directly with OpenAI, Anthropic, Groq, Perplexity, or Ollama servers. From now on, however, you can also connect your own back-end application with Custom AI. This makes it possible to:

- Connect your own knowledge base
- Integrate your own logic for tool handling
- Implement advanced "AI Agents" logic
- Connect your own language models

This is probably great news for the more technical among you, or for those working in companies with back-end developers on board. In practice, it's not that scary, especially with the example Node.js server we created for this feature:

But please remember: this feature is currently experimental and may not work correctly 🤪 We will surely improve it over time. We're already running a few experimental Alice implementations with bigger companies, including RAG pipelines over their knowledge bases.

Have fun!
