Q: Roadmap?

I bought Eurekaa last year, but I haven't seen many improvements or updates. Is there a roadmap?

When will it integrate more advanced models, like Perplexity, to search for real, useful learning data in the course creation module? Maybe integrate OpenRouter so we can bring our own API key?

Thank you

Sebastien.R
Posted: May 7, 2025
Founder Team
Larry_Eurekaa
Posted: May 7, 2025

A: Hey. Thanks for being a customer. Since this time last year, we’ve rolled out around 10 updates — you can find the details in our release notes on the website. The more significant the feature (like Rapid Publisher), the longer it naturally takes to build, QA test, and roll out, often through beta releases.

On the topic of multiple models: Perplexity isn’t an AI model itself; it’s an interface that routes queries to different underlying LLMs. In Eurekaa, much of the content flows live into our database, so swapping between entirely different LLMs isn’t straightforward or necessarily beneficial. Each model brings unique challenges around integration, output formatting, and performance tuning. Eurekaa is a multi-tool platform where AI is just one part of a larger toolkit. It’s less about plugging in raw models and more about configuring and training them to fit seamlessly within the product ecosystem. Many of our core features are tightly integrated into specific UI frames, meaning the outputs need to work smoothly within that structured environment — and that’s no small feat. That said, if you have a specific idea or big use case, do send the details to feedback@eurekaa.io.
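
For context on the OpenRouter / bring-your-own-key idea raised in the question, here is a minimal sketch of what such a call typically looks like against OpenRouter's OpenAI-compatible endpoint. This is not Eurekaa code, and the model slug and environment variable name are illustrative assumptions; it just shows the kind of integration being requested.

```python
# Minimal sketch of a bring-your-own-key call through OpenRouter's
# OpenAI-compatible API. NOT Eurekaa code; model slug and env var are
# illustrative assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",          # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],         # user-supplied key (assumed env var name)
)

response = client.chat.completions.create(
    model="perplexity/sonar",                         # example model id; any OpenRouter model id could go here
    messages=[
        {"role": "system", "content": "You are a course-outline research assistant."},
        {"role": "user", "content": "Find current, citable sources on spaced repetition."},
    ],
)
print(response.choices[0].message.content)
```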

Posted: May 7, 2025

Thank you for the reply. Still, the course builder is a simple AI model that hallucinates content without doing any deep search (it only reads titles from the database, not the actual course content). As a result, the quality of generated courses is currently no better than prompting ChatGPT directly. With Perplexity we can at least produce more detailed content.

Founder
Posted: May 7, 2025

We are using a higher-tier LLM, but hallucinations can happen regardless (even with RAG), depending on how empirical the topic is. In the Editor, if a transfer from Lesson Architect (or RAG) isn't used, the model has less material to work from. That's why Lesson Architect (LAi) asks for a lot of context before creating content, including background, focus, and exclusions.
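
To make the grounding point concrete, here is a rough sketch of the general RAG pattern the reply alludes to: retrieve relevant lesson text (not just titles), then build the generation prompt from that text plus the user's stated background, focus, and exclusions. Everything here, including the toy keyword-overlap retriever, is a simplified illustration under assumed names, not Eurekaa's implementation.

```python
# Simplified illustration of a RAG-style flow: ground the generation prompt
# in retrieved source text plus user-provided background, focus, and
# exclusions. Not Eurekaa's implementation.

def retrieve(query: str, lessons: list[str], k: int = 2) -> list[str]:
    """Toy retriever: rank lesson snippets by keyword overlap with the query."""
    q_words = set(query.lower().split())
    ranked = sorted(lessons, key=lambda t: len(q_words & set(t.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(topic: str, background: str, focus: str, exclusions: str, lessons: list[str]) -> str:
    """Assemble a grounded prompt from retrieved text and explicit user context."""
    context = "\n".join(retrieve(topic, lessons))
    return (
        f"Write a lesson on: {topic}\n"
        f"Learner background: {background}\n"
        f"Focus on: {focus}\n"
        f"Exclude: {exclusions}\n"
        f"Use ONLY the source material below; say so if it is insufficient.\n"
        f"--- SOURCE MATERIAL ---\n{context}"
    )

lessons = [
    "Spaced repetition schedules reviews at increasing intervals to fight forgetting.",
    "Active recall means testing yourself instead of rereading notes.",
]
print(build_prompt("memory techniques", "beginner", "practical study habits", "neuroscience jargon", lessons))
```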