Q: Nice product :) Questions: Credits - Templates - Prompts - UPDATES - Automations - Storage - Spidy Agents & Proxies
A few questions:
1. When website data "updates," will it scrape automatically or on a schedule?
1b. How many credits are used for each of the "UPDATES," versus the 10-30 credits the original scrape costs?
1c. Are credits used to keep "scanning for live updates/changes"?
1d. Will the system alert us FIRST about changes to the original data, BEFORE initiating an update automatically or manually?
2. Are there templates, or custom templates?
2b. Can we save workflows/prompts to reuse later?
2c. Can we automate webhooks to push to Projects, Google Drive, etc. workflows?
2d. Is there VS Code & Zapier integration?
3. Are Spidy Agents equivalent to workflows for deployment?
Ex: 5 Spidy Agents/day = 5 workflows active for the day, either scraping OR on update watch, alerting on changes?
4. Which Proxy plan are WE on?
Thank you :)
Marco_ScrapeGraphAI
Apr 21, 2025 A: With Spidy Agents you have:
Tier 1: 20 chats a day
Tier 2: 50
Tier 3: 100
Gratis - I know it's probably not easy with the language barrier. :) But I asked about HOW the credits are used in "updates," not how many we get with each tier; that's obvious from the deal table. I would love an answer, please, and I'm sure many would use this app more if the communication gap could be filled in somehow. :)
Q: Does it work on websites that require a subscription and login?
Does it work on websites that require a subscription and need you to be logged in?
Marco_ScrapeGraphAI
Apr 21, 2025 A: You have to handle the login yourself.
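Since the tool doesn't log in for you, the usual pattern is to authenticate yourself first (in a browser or a scripted session), then attach the resulting session cookie to your scraping requests. A minimal generic sketch in plain Python — this is an illustration of the general technique, not ScrapeGraphAI's actual API, and the cookie names/values are hypothetical:

```python
# Generic illustration: log in once yourself, copy the session cookie
# (e.g. from the browser's dev tools), then send it with every request
# so the site treats the scraper as a logged-in user.

def auth_headers(cookies: dict) -> dict:
    """Build request headers that carry an existing login session."""
    cookie_str = "; ".join(f"{k}={v}" for k, v in cookies.items())
    return {
        "Cookie": cookie_str,
        "User-Agent": "Mozilla/5.0",  # some sites reject requests without one
    }

# Hypothetical session cookie copied after logging in manually.
headers = auth_headers({"sessionid": "abc123", "csrftoken": "xyz789"})
print(headers["Cookie"])  # sessionid=abc123; csrftoken=xyz789
```

You would then pass these headers to whatever HTTP client or scraper entry point accepts custom headers; check the docs of the specific tool for how (and whether) it exposes that option.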
Q: Websites with login
Does it work with websites that require you to be logged in?
Can I navigate between multiple pages inside one website to collect different data for each one?
Thanks
Q: Facebook Scraping?
Is it possible to scrape posts from public Facebook groups? Thanks
Marco_ScrapeGraphAI
Apr 12, 2025 A: Yes, you just need the group's link;
take a look here: https://scrapegraphai.com/blog/facebook-smart-scraper/
Q: Don't understand this Lifetime Deal
Do the credits renew or not?
I would prefer a lifetime deal, to be honest. I tested other solutions like ScrapingAnt, and even they provide 10,000 lifetime free API credits.
Marco_ScrapeGraphAI
Apr 2, 2025 A: The credits don't renew; they last until you finish them.
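Since the balance is fixed rather than renewing, you can budget it directly against the 10-30 credits per scrape mentioned in the earlier question. A quick illustrative sketch (the credit figures are just examples from this thread, not official pricing):

```python
def scrapes_remaining(credits: int, cost_per_scrape: int) -> int:
    """How many scrapes a fixed, non-renewing credit balance covers."""
    return credits // cost_per_scrape

# With a hypothetical 1,000-credit balance:
print(scrapes_remaining(1000, 10))  # 100 scrapes at the cheap end
print(scrapes_remaining(1000, 30))  # 33 scrapes at the expensive end
```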