ScrapeGraphAI

Product details

Q: Nice product :) Questions: Credits - Templates - Prompts - UPDATES - Automations - Storage - Spidy Agents & Proxies

A few questions:

1. When website data "updates," will it scrape automatically or on a schedule?
1b. How many credits are used for each of the "UPDATES" - versus the 10-30 credits the original scrape costs?
1c. Are credits used to keep "scanning for live updates/changes"?
1d. Will the system alert us FIRST about changes to the original data - BEFORE - initiating an update, whether automatic or manual?

2. Are there templates, or custom templates?
2b. Can we save workflows/prompts to reuse later?
2c. Can we automate webhooks to push to Projects, G-Drive, etc. workflows?
2d. VSCode & Zapier integration?

3. Are Spidy Agents equivalent to workflows for deployment?
Ex: 5 Spidy Agents/day = 5 workflows active for the day, either scraping OR on update watch, alerting on changes?

4. Which Proxy plan are WE on?

Thank you :)

myarchivables.info (PLUS) - Edited Apr 20, 2025
Founder Team
Marco_ScrapeGraphAI
Apr 21, 2025

A: With Spidy Agents you have:
tier 1: 20 chats a day
tier 2: 50
tier 3: 100

Posted: Apr 22, 2025

Gratis - I know it's probably not easy with the language barrier. :) But I asked about HOW the credits are used in "updates," not how many we get with each tier - that's obvious from the deal table. I would love an answer please, and I'm sure many would use this app more if the communication gap could be filled in somehow. :)