Easy to Use and Intuitive Web Scraper that can be Automated
Browse.ai has some really cool features that are going to help me monitor my own and my competitors' reviews in directories.
I can also scrape data from any site in the world, and turn that site into an API!
Pros - What's cool?
1) UI & Ease of Use - It is clean, well guided, and, to be honest, it is pretty easy to point, click, and use the robots. Of the competing options such as Hexomatic or the free webscraper.io Chrome extension, browse.ai is the one I actually like using.
2) Focused on one thing - Unlike other tools, browse.ai is focused on one thing and does it well: scraping and monitoring.
3) No random surprises - Once you have purchased the tasks, they're yours to use; you're not paying extra for premium tasks. Yes, premium features consume comparatively more task credits, but they're not forcing you to spend more cash like some competitors. It's a fair approach: they're not promising unlimited everything, but they're not squeezing you for extra dollars either.
Cons -
1) Locked Email - I hate tools that do this. Account owners should not be held hostage to a locked email address.
2) Pricier - The cost per task is higher than competitors that have launched on AppSumo, but I'm okay with that because of the UI and ease of use. Most importantly, unlike other tools, I did not spend over half a day figuring stuff out; it was super intuitive. Also, the founder's reasoning for the limits is well explained and understandable.
3) Not as feature-rich - Features such as nested scrapes are missing, where I could use the results from one scrape to initiate another scrape. Example: scrape an archive list of AppSumo products and then scrape details from the individual product pages.
Another use case: I should be able to scrape companies from Clutch, save their info, get their social information and LinkedIn profile URLs, feed those into your LinkedIn profile scraper, and then run it to get the data.
Basically, find a way to connect your robots so they can be chained into nested robots; a rough sketch of the pattern I mean is below.
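To illustrate what I mean by nesting, here's a rough sketch of the pattern in plain Python. The URL, selectors, and field names are made-up placeholders (not real AppSumo markup, and nothing to do with how browse.ai works internally); the point is just that scrape #2 is driven by the output of scrape #1:

```python
# A generic illustration of "nested" scraping: the output of scrape #1
# (a list of product links) becomes the input of scrape #2 (detail pages).
# The URL and CSS selectors are placeholders, not real AppSumo markup.
import requests
from bs4 import BeautifulSoup


def scrape_product_links(archive_url):
    """Scrape #1: collect product-page links from an archive/listing page."""
    html = requests.get(archive_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.select("a.product-link")]


def scrape_product_details(product_url):
    """Scrape #2: pull details from one product page."""
    html = requests.get(product_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.select_one("h1")
    price = soup.select_one(".price")
    return {
        "url": product_url,
        "title": title.get_text(strip=True) if title else None,
        "price": price.get_text(strip=True) if price else None,
    }


if __name__ == "__main__":
    links = scrape_product_links("https://example.com/archive")  # placeholder URL
    details = [scrape_product_details(url) for url in links]     # scrape #2 fed by scrape #1
    print(details)
```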
Overall, I like the tool and the team. The founder is honest and consistent throughout his communication. It's clear that the original LTD buyers from the Product Hunt sale are happy and came back to AppSumo to buy more.
I'd recommend the team keep at it and make this scraping tool awesome in every possible way: reach out to customers, improve the product, discover use cases, create tutorials for them, and help us create a win-win situation.

Ardy_BrowseAI
May 9, 2024
Thank you for the thorough review and encouraging words ❤️
Let me share my thoughts on the cons you mentioned:
1) I understand and I'm sorry you feel that way. Let me give you our perspective: We launched on AppSumo because we wanted engaged power users who give us feedback and help us improve the product for them and everyone else. Given the limited number of codes we allocated to this launch, we wanted to avoid those who buy an LTD and never use it just to sell it for 100x the price 2 years later. And our research showed this is probably the best way to minimize those inactive accounts. I believe this will benefit everyone, including our LTD customers, in the long run.
2) It seems you've seen my explanation for this, but just in case other readers haven't, here it is: We would love to offer more credits! But because we run tasks in a browser, emulate human interactions (scroll, click, type, pause, ...), and use expensive residential proxies, we just can't do it yet. One of our priorities for the next 6 months is to add more optimizations to reduce our infrastructure costs by 50-90%. Once we do that, we may be able to increase credits across the board, but we can't make any promises yet. Not before we implement some of those optimizations and measure their results in production.
3) Right now, there are three no-code ways to pass the results of one robot (a) to another (b):
A. Download robot (a)'s captured data as CSV and bulk run robot (b) using it.
B. Use an integration tool like Zapier/Make/Pabbly and create a workflow that listens to robot (a)'s finished tasks and uses their data to run a task with robot (b).
C. Integrate robot (a) with Google Sheets, and then use the Google Sheets add-on to run robot (b) with robot (a)'s captured data as input, in a few clicks.
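(For developers reading this: the same chaining can also be scripted against our API. The sketch below is only a rough illustration; the endpoint path, payload fields, and header names are approximations, so please check the API docs for the exact ones.)

```python
# Rough illustration only: run robot (b) once per row exported by robot (a).
# The endpoint path, payload fields, and header are approximations;
# check the Browse AI API docs for the exact names.
import csv

import requests

API_BASE = "https://api.browse.ai/v2"               # assumed base URL
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}  # placeholder key
ROBOT_B_ID = "robot-b-id"                           # placeholder robot id


def run_robot_b(input_parameters):
    """Queue one task on robot (b) with the given input parameters."""
    resp = requests.post(
        f"{API_BASE}/robots/{ROBOT_B_ID}/tasks",
        headers=HEADERS,
        json={"inputParameters": input_parameters},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


# Feed robot (a)'s exported CSV into robot (b), row by row.
with open("robot_a_results.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # "originUrl" is a placeholder for whatever column robot (a) captured.
        run_robot_b({"originUrl": row["originUrl"]})
```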
But that's not all! We have a "Workflows" feature in private beta that lets you connect two robots together and avoid all the workarounds above. The backend works well, and we're now working on creating the easiest possible interface for it.
All the things you mentioned (reaching out to customers, improving the product, discovering use cases, and creating tutorials for them) are things we're actively working on. I think you're going to be very excited to see everything we release over the next few months 🚀 ❤️