AgenticFlow

Q: Do you have plans to add more pro-level LLMs that will use agentic credits?

jsamplesjr · PLUS · Jun 9, 2025

SeanP_AgenticFlowAI (Founder Team) · Jun 11, 2025

A: Hey Jsamplesjr,

That's a great question and something we're constantly evaluating!

Right now, our AgenticFlow credits are designed to cover the platform's operational costs and the usage of our built-in, economical LLMs (like GPT-4o mini, Gemini 2.0 Flash, and DeepSeek V3). These models offer a fantastic balance of capability and cost-efficiency for a wide range of everyday automation tasks.

For pro-level, state-of-the-art (SOTA) LLMs (the full GPT-4/GPT-4.1, Claude 3.7 Sonnet/Opus, the latest Gemini Pro/Ultra, etc.), the direct underlying costs from the providers are significantly higher and can fluctuate.

Our Current Approach for Pro LLMs:
To give you the most flexibility and cost-effectiveness for these, we offer two options (there's a rough cost sketch after this list):

- BYOK (Bring Your Own Key): You connect your own API key from providers like OpenAI, Anthropic, Google, etc. You pay them directly at their standard rates for token usage, and your AgenticFlow credits only cover the small, fixed cost of executing the step on our platform. This means you benefit directly if/when their prices drop.
- Pixel ML OpenRouter: This is our optional gateway. It provides access to many of these pro-level models using separate, pay-as-you-go Pixel ML credits (which roll over). We pass through the model provider's cost with a minimal fee for taxes/transactions.
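
To make the two cost models concrete, here's a rough Python sketch of the arithmetic. Every number in it (the per-token provider rates, the fixed per-step credit cost, and the gateway fee) is a hypothetical placeholder for illustration, not actual AgenticFlow, Pixel ML, or provider pricing:

```python
# Back-of-the-envelope comparison of the two options above for a single
# LLM step. Every rate here is a hypothetical placeholder, NOT a real
# AgenticFlow, Pixel ML, or provider price.

PROVIDER_USD_PER_1M_INPUT = 2.50    # hypothetical provider input-token rate
PROVIDER_USD_PER_1M_OUTPUT = 10.00  # hypothetical provider output-token rate
CREDITS_PER_STEP = 1                # hypothetical fixed AgenticFlow step cost
GATEWAY_FEE = 0.05                  # hypothetical minimal pass-through fee

def provider_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Raw token cost billed at the provider's standard rates."""
    return (input_tokens / 1_000_000) * PROVIDER_USD_PER_1M_INPUT + \
           (output_tokens / 1_000_000) * PROVIDER_USD_PER_1M_OUTPUT

def byok_step(input_tokens: int, output_tokens: int) -> dict:
    """BYOK: the provider bills you directly; AgenticFlow credits only
    cover the small, fixed step execution on the platform."""
    return {
        "billed_by_provider_usd": provider_cost_usd(input_tokens, output_tokens),
        "agenticflow_credits": CREDITS_PER_STEP,
    }

def pixel_ml_step(input_tokens: int, output_tokens: int) -> dict:
    """Gateway: provider cost passed through plus a minimal fee, paid in
    pay-as-you-go Pixel ML credits (which roll over)."""
    passthrough = provider_cost_usd(input_tokens, output_tokens)
    return {"pixel_ml_usd": passthrough * (1 + GATEWAY_FEE)}

if __name__ == "__main__":
    # A step that reads 50k tokens and writes 10k tokens:
    print(byok_step(50_000, 10_000))      # provider bills ~$0.225, plus 1 credit
    print(pixel_ml_step(50_000, 10_000))  # ~$0.236 in Pixel ML credits
```

The takeaway is the same either way: under BYOK your token spend tracks the provider's own price list, so any price cut they make flows straight through to you.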
Future Plans for More Pro LLMs on AgenticFlow Credits?
While we don't have immediate plans to bundle the current top-tier SOTA models directly into the base AgenticFlow credits (doing so would significantly increase the LTD price or the per-credit cost for everyone), we are always:

- Monitoring LLM pricing: As the cost of these powerful models comes down at the providers, it becomes more feasible for us to integrate them directly into our credit system or to offer them at very competitive rates via the OpenRouter.
- Optimizing our built-in offerings: We continually assess which "economical but powerful" models offer the best value as built-in options covered by your standard credits.
- Listening to community feedback: We pay close attention to which models our users need most. If there's strong demand for a specific pro-level model to be more tightly integrated, we'll explore the economics. You can always suggest this on our roadmap: https://agenticflow.featurebase.app/

In short: for now, the most cost-effective way to use pro-level LLMs is via BYOK or our transparent Pixel ML OpenRouter. We aim to keep our built-in selection powerful and economical, and we'll adapt as the AI landscape and pricing evolve.

Hope this gives you a clear picture!
— Sean
