Mac-Friendly, DOM-Rendering Spidering Tool for Multiple Users
-
Hello!
I am looking for a spidering tool that:
- Is Mac-friendly
- Can render the DOM and find JS links
- Can spider password-protected sites (prompts for password and then continues spider, etc.)
- Has competitive pricing for 8+ users.
Screaming Frog is amazing - and maybe we're just going to have to bite the bullet there. But if anyone has any other ideas, I'd love to hear them. Thanks!
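For context on why the DOM-rendering requirement matters: a crawler that only fetches raw HTML misses links that JavaScript injects at runtime, which is exactly what a rendering crawler catches. Here's a minimal, stdlib-only sketch (the HTML snippets are made-up examples, not from a real site):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Raw HTML as a plain HTTP fetch sees it: one static link, plus a
# script that would inject a second link only when a browser runs it.
raw_html = """
<html><body>
  <a href="/about">About</a>
  <script>
    var a = document.createElement('a');
    a.href = '/pricing';
    document.body.appendChild(a);
  </script>
</body></html>
"""

# The same page after a browser has executed the script.
rendered_html = """
<html><body>
  <a href="/about">About</a>
  <a href="/pricing">Pricing</a>
</body></html>
"""

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

print(extract_links(raw_html))       # only the static link: ['/about']
print(extract_links(rendered_html))  # ['/about', '/pricing']
```

A non-rendering spider only ever sees the first result, so the `/pricing` page would never be discovered; tools like Screaming Frog (with JavaScript rendering enabled) effectively crawl the second version.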
-
So - after digging around a lot and reading and re-reading every article that popped up for "screaming frog alternative", I've come to the conclusion that for the price, there really is nothing better than Screaming Frog right now.
I was impressed, however, with the incredibly helpful team from Deep Crawl. This enterprise tool is designed for larger websites - whereas Screaming Frog can crap out if your local machine runs out of memory. Because it's a more powerful tool, it's more expensive than Screaming Frog - but if you need an enterprise solution, it's definitely worth looking into. Another big differentiator is that Deep Crawl has no limit on the number of users, which is our primary pain point with Screaming Frog.
-
Right now we're updating SEOSpyder ( http://www.mobiliodevelopment.com/seospyder/ ) to render pages, but I can't give you a timeframe for when it will be done.
So far the memory requirements aren't too high; it crawled a 250k-page site on a machine with 8 GB of RAM.
-
Oh, something I just realized: Screaming Frog can potentially do what you want and provide access for 8 users, but the setup is complicated. You would need to run it in a large virtual machine on AWS or Google Cloud Platform. That way you can scale the machine so it won't time out, and everybody will still have access to it.
Back to your question: I've worked with Deepcrawl, a bit with Ryte, and more with Botify. They're all great tools that can crawl your site, but you've probably already looked into some of them.
-
Oh, interesting - can you help me understand more about the cloud solution you're using? Thanks!
-
Going to follow this, as I've been looking for something too. But we went with a cloud service, as there is nothing I came across that can otherwise fulfill all these needs.