Q: Hi.
I am interested in your tool for a specific need. I run an aggregator website and I need to monitor specific pages on our providers' websites so I can automatically change the prices on our WordPress website when the providers change their public prices on their websites. Can I do it with your tool? Thanks
Ardy_BrowseAI
May 15, 2024
A: Hi there. Yes, you should be able to do that. That is a pretty common use case actually (supplier monitoring, for example). You'd just need a third-party tool like Zapier, Pabbly, or Make.com to connect Browse AI to your WordPress site and update your database.
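If you'd rather wire it up with custom code instead of Zapier/Make, here is a minimal sketch of what the WordPress side could look like, assuming a WooCommerce store and its standard wc/v3 REST API. The webhook payload shape, product mapping, and credentials are placeholders, not Browse AI's actual output schema.

```python
# Hypothetical sketch: receive a scraped price (e.g. forwarded by Zapier/Make via a
# webhook) and push it into a WooCommerce product. Payload shape, product mapping,
# and credentials below are assumptions for illustration only.
from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

WC_BASE = "https://your-store.example.com/wp-json/wc/v3"  # assumed store URL
WC_AUTH = ("ck_xxx", "cs_xxx")                            # WooCommerce consumer key/secret

# Assumed mapping from the monitored provider page to your own product ID.
PRODUCT_MAP = {"https://provider.example.com/widget": 1234}

@app.route("/price-update", methods=["POST"])
def price_update():
    data = request.get_json(force=True)  # assumed shape: {"origin_url": ..., "price": "19.99"}
    product_id = PRODUCT_MAP.get(data["origin_url"])
    if product_id is None:
        return jsonify(error="unknown provider page"), 404
    resp = requests.put(
        f"{WC_BASE}/products/{product_id}",
        auth=WC_AUTH,
        json={"regular_price": str(data["price"])},  # WooCommerce expects prices as strings
        timeout=30,
    )
    resp.raise_for_status()
    return jsonify(status="updated", product=product_id)

if __name__ == "__main__":
    app.run(port=5000)
```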
Q: Hi, I did not fully understand whether your Zapier integration can do the following:
- Zapier sends a URL to Browse AI (Zap 1)
- Browse AI logs in to this website (always the same login but different subsites)
- Browse AI scrapes the content
- Whenever new content from this website is captured by Browse AI, it provides the content to Zapier (Zap 2)
This would be my use case, as I need to collect a lot of information on sites with my login for my research.
Thanks
Olivier
Ardy_BrowseAI
May 15, 2024
A: Hi Olivier,
What you described sounds straightforward with Browse AI:
- Use the Execute Task action on Zapier to have Browse AI log into the site and scrape the content.
- Use the Executed task trigger on Zapier to receive the content that Browse AI captures.
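For the "Zap 2" side, if you finish the Zap with a Webhooks by Zapier POST step pointing at your own endpoint, a minimal sketch of a receiver that files the captured rows away for research could look like this. The payload field names ("captured_rows", etc.) are assumptions, not the actual trigger output schema.

```python
# Sketch of the receiving end of "Zap 2": a webhook step forwards whatever the
# Executed Task trigger captured, and the rows are appended to a CSV for analysis.
import csv
from pathlib import Path

from flask import Flask, request, jsonify

app = Flask(__name__)
OUT_FILE = Path("captured_data.csv")

@app.route("/capture", methods=["POST"])
def capture():
    payload = request.get_json(force=True)
    rows = payload.get("captured_rows", [])  # assumed: list of dicts, one per scraped item
    if rows:
        write_header = not OUT_FILE.exists()
        with OUT_FILE.open("a", newline="") as f:
            # Column names come from the first row; extra keys in later rows are ignored.
            writer = csv.DictWriter(f, fieldnames=sorted(rows[0].keys()), extrasaction="ignore")
            if write_header:
                writer.writeheader()
            writer.writerows(rows)
    return jsonify(saved=len(rows))

if __name__ == "__main__":
    app.run(port=5001)
```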
Does that answer your question?
Q: I currently have the 3-code package but want to expand to 4 or 5 codes.
How should I do that? It seems that codes aren't stackable but I also don't want to buy the full package to have two accounts. Please advise.
Ardy_BrowseAI
May 15, 2024
A: Hi there. You can stack up to 8 codes on one account. What made you think you can't stack more than 3 codes? 🤔
Q: Hi there, Congrats on the software.
Will the unused credits be carried forward for future usage?
If not, will you consider adding it to this deal?
Thank you.
Ardy_BrowseAI
May 15, 2024
A: Hi there,
Unfortunately not; credits reset every month and do not carry over. This is because, as a startup, we need our infrastructure expenses to be predictable. We have also used the historical average credit usage rate to offer more credits on these plans.
Q: Good day!
I want iiiit!
Can it auto-log me into websites? Because some sites I wanna scrape automatically log me out every hour or something 😪 (If not, any advice for a workaround?)
Thanks!
~Lorgen
Ardy_BrowseAI
May 15, 2024
A: Absolutely! We have a guide on this:
https://help.browse.ai/can-i-login-to-a-website-while-recording-my-task
btw, at least one site I know doesn't have robot detection (it's my company profile).
I want it to send me my attendance, etc. data (need to use external solutions because our system sucks 😂).
Thanks!
Q: Hi, sorry if I missed it.
How do you charge for the monitoring? Does it take any credits and if yes how is it done?
Ardy_BrowseAI
May 15, 2024
A: Hi Justin. We have a guide on this topic along with examples here:
https://help.browse.ai/how-are-credits-calculated
(see example #3)
Q: Can you look into the Better Business Bureau (BBB) website?
They have a popup at first that says "Show me search results for X or Y" with two radio button options and then you click a continue button to get search results.
When your new window opens for BrowseAI to record, your purple "Okay Understood" button won't let me click it to start recording.
It's like it's frozen, but only happens with that website. Others work fine so far. I'm guessing it's a bug or maybe they implemented some sort of anti-scraping type thing since they're big enough.
Thoughts?
Ardy_BrowseAI
May 15, 2024
A: I think I know what issue you're referring to – on some sites, their modals prevent the user from clicking on anything else and that's why you can't interact with the robot.
The good news is that with the new robot studio we're developing this issue (and a lot more) will be solved as we will render the robot on a different layer so it does not conflict with the page in any way. We're going to release this new robot studio in Q4.
Q: Hi, do you have a list of the Premium Automations that are not part of this deal?
Ardy_BrowseAI
May 15, 2024
A: Hi there. Not at the moment. We only have "premium sites", which are listed here and cost extra credits due to the additional cost of running automations on them:
https://help.browse.ai/how-are-credits-calculated
There are no premium "credits".
Q: How does this compare to Robomotion, UiPath, Hexomatic, etc.?
Ardy_BrowseAI
May 14, 2024
A: For Hexomatic, we have a comparison page here:
https://www.browse.ai/vs/hexomatic
From my perspective, Hexomatic is more focused on a pre-defined set of automations while Browse AI is more focused on giving you easy tools to set up any automation you're looking for on any website.
Hexomatic requires you to purchase "Premium credits" to perform some actions while with Browse AI all credits are the same, you just have to pay extra credits for some premium sites: https://help.browse.ai/how-are-credits-calculated
What I've heard from a lot of users is that UiPath is extremely hard to set up and maintain for *web* automations. The ease of use is not even comparable to Browse AI, I've been told. Browse AI also comes with residential proxies, captcha solving, and a lot more that UiPath does not offer, AFAIK.
I haven't used Robomotion myself and haven't heard much about it, so can't comment on that one.
Did I answer your question?
Q: Can I program this to do what pricezag.com does? It will scrape an eCommerce site to check if its prices have changed compared to my site and then send me a notification.
Ardy_BrowseAI
May 15, 2024
A: E-commerce product availability/price monitoring is one of our most popular use cases!
You can set up a robot to monitor products on any shop. Then you can connect that to your site using Zapier/Pabbly/Make.com or custom code and take custom actions based on it. Is that what you're looking for?
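If you go the custom-code route for the "custom actions" part, here is a rough sketch of the comparison step: it checks the price a robot captured against your own price and alerts you when they drift apart. The SMTP server, addresses, and 2% threshold are placeholders; the price values would come from the robot's output via Zapier/Make or a webhook.

```python
# Sketch of a custom action after monitoring: alert yourself when a competitor's
# price drifts away from yours. All credentials and thresholds are placeholders.
import smtplib
from email.message import EmailMessage

ALERT_THRESHOLD = 0.02  # alert when prices differ by more than 2%

def check_price(product_name: str, competitor_price: float, my_price: float) -> None:
    drift = abs(competitor_price - my_price) / my_price
    if drift <= ALERT_THRESHOLD:
        return
    msg = EmailMessage()
    msg["Subject"] = f"Price drift on {product_name}: {drift:.1%}"
    msg["From"] = "alerts@example.com"
    msg["To"] = "me@example.com"
    msg.set_content(
        f"Competitor now charges {competitor_price:.2f}, you charge {my_price:.2f}."
    )
    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("alerts@example.com", "app-password")  # placeholder credentials
        smtp.send_message(msg)

# Example values, as they might be passed in from Zapier/Make or a webhook:
check_price("Blue Widget", competitor_price=18.49, my_price=19.99)
```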
Q: Is it possible to overwrite the existing cell value and not create a new one below?
I was talking about Google Sheets when integrated. I noticed that it creates a new value below every time the monitor is triggered.
Ardy_BrowseAI
May 15, 2024
A: We're releasing a new experimental feature in our Google Sheets add-on which will delete duplicate/old values if you need it to. That should help you achieve what you're looking for, am I right?
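Until that feature ships, one interim workaround outside the add-on is a short script with the gspread library that keeps only the latest row per key. The spreadsheet key, worksheet name, and key column below are assumptions for illustration.

```python
# Hedged interim sketch: deduplicate a populated sheet, keeping the most recent
# row per key column (later rows win). Spreadsheet key and worksheet are placeholders.
import gspread

gc = gspread.service_account(filename="service_account.json")
ws = gc.open_by_key("YOUR_SPREADSHEET_KEY").worksheet("Sheet1")

rows = ws.get_all_values()          # assumes the sheet already has a header row
header, body = rows[0], rows[1:]

KEY_COL = 0                         # assumed: first column identifies the monitored item
latest = {}
for row in body:                    # later rows overwrite earlier ones -> newest wins
    latest[row[KEY_COL]] = row

ws.clear()
ws.append_rows([header] + list(latest.values()))
```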
Q: I was interested in using a few of your prebuilt robots, but all the robots that I want say, "This prebuilt robot is currently in private beta."
There is a form to request access. How often do you grant access? Would I have to wait long, and what type of criteria are required? Thanks!
Ardy_BrowseAI
May 15, 2024
A: Hi Ben, we only have a handful of prebuilt robots in "private beta". I'm guessing it's one of the Google Maps ones?
We're looking to release them all in September. Does that work for you?
Q: Just bought and starting to get twitchy seeing 5 days and no responses to good questions that seem to make a lot of sense. What gives? Where is the product owner/rep?
Ardy_BrowseAI
May 15, 2024
A: Hi there! On our team, I've been responsible for replying to most questions on AppSumo and I was away for a few days moving houses and dealing with a flu/cold! I just started replying again. Sorry about the delay!
Q: Good morning and congratulations on the platform and the software.
I am currently a user of a competitor and I must say that I am very satisfied. The only drawback is that it is mandatory to buy credits to get what I need, that is, translation with the Google or DeepL engines. I have read the roadmap carefully and don't seem to have seen translation automation for scraped content, and I wanted to know whether this can be implemented in the future and included in the roadmap. That would be important for my investment decision in your company for my use case. If, as I hope, the answer is yes, is there any ETA and credit-consumption estimate?
thank you
Keep up the good work
Ardy_BrowseAI
May 14, 2024
A: Hi there. Thank you for the positive words 🙏
At this time, we don't have "built-in translation automation for scraped content" on our requested features. You can request that here:
https://www.browse.ai/product-roadmap
Technically, you could build a robot right now that enters a text in Google Translate, for example, and captures the result. With the "robot chaining" feature that's under development right now, you will be able to chain any robot with this robot to translate texts as soon as they are scraped.
We could also add a prebuilt robot for Google Translate if that would make this easier for a lot of users. 🤔
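Until robot chaining or a prebuilt translation robot is available, one interim approach is to translate the scraped text in a small post-processing step outside Browse AI, for example with the third-party deep-translator package (Google Translate backend). The CSV file names and the "description" column below are assumptions.

```python
# Interim sketch: translate one column of scraped CSV output to English using the
# deep-translator package. File names and column names are placeholders.
import csv
from deep_translator import GoogleTranslator

translator = GoogleTranslator(source="auto", target="en")

with open("scraped.csv", newline="") as src, open("translated.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["description"] = translator.translate(row["description"])  # assumed column name
        writer.writerow(row)
```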
Hope that answers your question.
Ok thanks a lot for the answer.
I will invest to better support you guys and future development.
If a robot could be pre-trained to translate (Google Translate and DeepL), it would be a game changer for me (and I guess other users); in the meantime, maybe a video tutorial on how to chain two bots for translation as soon as the function is out of beta.
Wish all the best
Q: It's a bummer all the tiers are tied to the Starter plan only, which means Sumolings are most likely to be abandoned sooner (in a few months or so).
Ardy_BrowseAI
May 15, 2024
A: I understand where that thought comes from. But if you scroll down, you can see our ProductHunt LTD owners who purchased a year ago and have now purchased the AppSumo LTD as well (they asked questions about whether they could stack the two deals). A few months ago, they all received over 2x bonus additional lifetime credits and got plenty more features than we were offering originally, because we truly appreciate early supporters and rely on their word of mouth to grow.
We have been very transparent about the plan differences and the reasons behind them if you look at other questions on this topic here. We will ensure that our LTD owners are happy with their plans long term. It just does not make financial sense to offer features like Teams (which is only needed for large teams) in the LTD. LTD owners need the business to be thriving so the product keeps getting developed and more mature. That's why we left out a couple of features (like Teams) that are aimed at larger teams and scales; keeping those in paid plans helps us fund product development.
Q: Hi, a lot of cloud scrapers use page credits (e.g. https://webscraper.io/), but you use record credits, which is more expensive. Can you justify or explain why you chose this model? Thanks
Ardy_BrowseAI
May 14, 2024
A: Hey Justin,
We thoroughly evaluated 3 different models a few months ago:
- Task credits
- Page credits
- Record credits
We landed on Record credits for many reasons, some of which are:
- It's much easier for the users we've talked to to estimate the number of records they're going to extract than the number of pages, so it is clearer what you would get with each plan.
- The number of pages you scrape can also change over time outside of your control. If a site changes its maximum list items per page from 30 to 20, your page-credit cost would suddenly go 50% higher (see the worked example after this list). We wanted the cost to be as predictable as possible so organizations can easily budget for at least a year.
- (This one is more of a secret for now...!) We're working on adding more layers to the application, for example a data layer that will let you enter a URL to scrape and see the data you need appear next to it right away, essentially abstracting away the scraping part and just focusing on the input/output data. In that scenario, there will be no "pages" for the user; it will just be records of data.
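To make the page-credit point concrete, here is the arithmetic behind the "50% higher" figure, using a hypothetical 600-record job:

```python
# Worked example: same 600 records, but the site shrinks its page size from 30 to 20 items.
records = 600
pages_before = records / 30   # 20 pages
pages_after = records / 20    # 30 pages
increase = pages_after / pages_before - 1
print(f"{pages_before:.0f} -> {pages_after:.0f} pages ({increase:.0%} more page credits)")
# Output: 20 -> 30 pages (50% more page credits)
```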
Hope that answers your question.
Thanks a lot for your feedback. Appreciated.
Q: Hi, we're looking to extract information about companies and the teams within them, for which we have a list (sheet), and we use ZoomInfo to find the information we're looking for. This requires us to toggle between tabs (the sheet and ZoomInfo) to complete this successfully. I'd love to know if there is any way we can have your bot do this for us. From what I understand, we can only record or train the bot to perform tasks within one site.
I'm sorry if my query left you puzzled. It'd be great to get a response, even if this is confusing. I will try to explain better or will get someone from the team to do the explaining.
Ardy_BrowseAI
May 15, 2024
A: Hi Anisha,
I replied to your email this morning!
Your workflow seems pretty straightforward to automate using Browse AI. Even if it involves multiple sites, you can create multiple robots and orchestrate them to aggregate all the data you need at the same time.
Q: What do I get when I buy, source code or a license?
Ardy_BrowseAI
May 15, 2024
A: Hi there, we are a Software-as-a-Service (SaaS) business. You will get lifetime access to the software as a user, with certain limits based on your tier.
Q: Hi there, we want to scrape a public educational directory which contains some 45,000 programs (each with its own details page).
So if I get the highest tier, it will take 3 months to scrape. Is that correct?
Another question: what type of support do you provide to LTD users?
Thanks in advance,
Dr.Mo
Thanks
Ardy_BrowseAI
May 15, 2024
A: Hi there,
If it's 45,000 list items along with a detail page for each, that would require 90,000 credits, which with the highest tier (20,000 credits/mo) would take about 4.5 months.
With the Starter plan (including the LTD), we provide email support with a response time of under 2 business days, and more often than not it's less than an hour.
Our support team, located in 3 different timezones, can look into any issues you run into and help you create the robot you need.
Did I answer your questions?
Ardy
Q: I've sent an email to your support alias and waiting to hear back.
Would you please respond to my queries via email?
Ardy_BrowseAI
May 15, 2024
A: Hi Nik,
Seems like our team missed your billing-related question. Sorry about that! I just replied to your ticket.
Ardy