Thinking aloud - what if WE could run rogerbot from our desktops?
-
Total, total noob question, I know - but is rogerbot performance-bound because of bandwidth and processing capacity? I understand if it is, but for those of us with very large sites, I am wondering if we could offload the burden on SEOmoz's resources by running our own local, licensed version of rogerbot, crawling the sites we want, and then uploading the data to SEOmoz for analysis.
If this were possible, would we get more immediate results?
-
On the topic of a private crawl (or distributed crawl): these are cool ideas, but not something we currently have in our plans. Having the crawl centralized allows us to store historic data and ensure polite crawling. This may take a little extra time (we are indeed doing a lot of crawls, as well as processing them and retrieving link data for each of them), but we are actively working on our infrastructure to reduce our crawling and processing time.
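For anyone wondering what "polite crawling" actually involves, the core of it is honoring robots.txt and pacing requests to each host. Below is a minimal sketch of that idea, assuming a hypothetical user agent and delay; it is not Rogerbot's actual code.

```python
import time
import urllib.robotparser
from urllib.request import Request, urlopen

USER_AGENT = "example-local-crawler"  # placeholder UA, not rogerbot's
DEFAULT_DELAY = 10                    # seconds between requests; placeholder value

def polite_fetch(urls, robots_url):
    """Fetch each URL only if robots.txt allows it, pausing between requests."""
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(robots_url)
    rp.read()

    # Honor the site's Crawl-delay directive if it declares one.
    delay = rp.crawl_delay(USER_AGENT) or DEFAULT_DELAY

    pages = {}
    for url in urls:
        if not rp.can_fetch(USER_AGENT, url):
            continue  # skip paths the site has disallowed
        req = Request(url, headers={"User-Agent": USER_AGENT})
        with urlopen(req) as resp:
            pages[url] = resp.read()
        time.sleep(delay)  # never hammer the host
    return pages

# Usage (hypothetical site):
# polite_fetch(["https://example.com/"], "https://example.com/robots.txt")
```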
While the first crawl does take a number of days, subsequent crawls are started on the same day each week, and should take roughly the same amount of time to complete, controlling for external factors. So in general you should have fresh crawl data right around weekly, give or take a day or two.
As for your specific crawls, I'd be happy to look into them for you. I'll send you a separate email to discuss.
-
Still waiting for my custom crawl launched on April 28th (it's May 2nd) to complete... seriously? If SEOmoz is overwhelmed, first of all, congratulations on being so popular, but I am not getting any timely data at all.
-
Or, instead of running Rogerbot locally, could we run some distributed processing (e.g. BOINC) to help offload some of the pressure from your cloud?
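To make the BOINC-style idea a bit more concrete: a coordinator would split the crawl list into work units, hand them out to volunteer clients, and collect the results centrally. A rough sketch under that assumption (all names and the transport are hypothetical):

```python
from itertools import islice

def make_work_units(urls, unit_size=100):
    """Split a crawl list into fixed-size work units for volunteer clients."""
    it = iter(urls)
    unit_id = 0
    while True:
        chunk = list(islice(it, unit_size))
        if not chunk:
            break
        yield {"unit_id": unit_id, "urls": chunk}
        unit_id += 1

# Each volunteer client would crawl its unit (politely, as sketched above) and
# upload the results to the central service, which still does the processing
# and keeps the historic data in one place.
# for unit in make_work_units(site_urls):
#     dispatch_to_client(unit)  # hypothetical transport: HTTP, a queue, etc.
```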
Related Questions
-
Running 2 different domains
Hi guys, something I'd appreciate any opinion on... We own a local domain, www.goodycard.co.nz, but we're expanding overseas. I've purchased www.goodyhq.com as we'd like to drop the 'card' from the URL; we can't get www.goodycard.com. The local site has a decent ranking, but obviously the newly created HQ site doesn't. What's the best way to tie these in (or even ditch the .co.nz) but retain rank? And what's the best way for users to discover the HQ site? Any insight would help, as the two sites are somewhat different: one has a local member base and the other just sells software, so it really depends on the region of search. Cheers!
Technical SEO | | r.moss0 -
PWA for Desktop Site (Ecommerce)
Hi folks, I need guidance about using a PWA on a desktop site. As I understand it, a PWA is mostly used on mobile sites to engage visitors and let them browse your site like an app. Would it be good SEO practice to use a PWA on a desktop (e-commerce) site by serving everything through JavaScript and letting the Google crawler cache only the site logo while hiding everything else?
Technical SEO | | Rajesh.Prajapati1 -
GWT Fetch & Render displays desktop version of site as mobile
Hi team, I noticed that when I request a desktop rendering in GWT using Fetch and Render, pages render as the mobile version (screenshot attached). As far as I'm aware, it's related to the vh units in our CSS. Does anyone know what the implications of this may be? Does it mean Googlebot can only see the mobile version of our website? Any help is appreciated. Jake
Technical SEO | | Jacobsheehan0 -
Do you think have to re-submit my site to search engines after I made improvements?
Some time ago I started doing SEO for a one-page website and didn't get any positive results: no traffic, no filled-in online booking forms (another, multi-page website offering the same service yielded multiple filled-in "schedule an appointment" forms). I found out my one-page website was considered to be "keyword spamming" and converted it to a multi-page one. Its domain authority went up, but it still doesn't bring any traffic. I am thinking maybe I have to let the search engines know that it has been updated so they stop penalizing it? Do you think that might help, and if so, what exactly should I do? I would be very thankful for any suggestions!
Technical SEO | | kirupa0 -
What domain name do you think is better for SEO: sirocco-webdesign.com or sirocco-web-design.com?
Hello, I would appreciate it very much if you shared your thoughts on which domain name I should pick for productive SEO: sirocco-webdesign.com or sirocco-web-design.com. I know hyphens are not good, but the second domain looks better, I think.
Technical SEO | | kirupa0 -
Pages with 301 redirects showing as 200 when crawled using RogerBot
Hi guys, I recently did an audit for a client and ran a crawl on the site using RogerBot. We quickly noticed that all but one page was showing as status code 200, but we knew that there were a lot of 301 redirects in place. When our developers checked it, they saw the pages as 301s, as did the Moz toolbar. If page A redirected to page B, our developers and the Moz toolbar saw page A as 301 and page B as 200. However the crawl showed both page A and page B as 200. Does anyone have any idea why the crawl may have been showing the status codes as 200? We've checked and the redirect is definitely in place for the user, but our worry is that there could be an issue with duplicate content if a crawler isn't picking up on the 301 redirect. Thanks!
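One way to see the raw status code for yourself, independent of any crawler or toolbar, is to request the page without following redirects: a 301 then shows up as a 301 instead of the 200 of its destination. A quick sketch with Python's standard library (the URLs are placeholders):

```python
import http.client
from urllib.parse import urlparse

def raw_status(url):
    """Return the first response's status code without following redirects."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

# print(raw_status("https://example.com/old-page"))
# -> (301, "https://example.com/new-page") if the redirect is really in place
```

If a crawl reports the status of the final URL after following the redirect chain, both page A and page B would show as 200, which would be consistent with what is described above; checking the first response directly removes that ambiguity.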
Technical SEO | | Welford-Media0 -
I'm thinking I might need to canonicalize back to the home site and combine some content, what do you think?
I have a site that is mostly just podcasts with transcripts, and it has both audio and video versions of the podcasts. I also contribute to a blog that links back to the video/transcript pages of these podcasts. That blog has the exact same content (the podcast, both audio and video, but no transcript), split into separate audio and video posts. Each post has some content that is technically unique, but I'm not sure it's unique enough. So my question is: should I canonicalize the posts on that blog back to the original video/transcript page of the podcast, and combine the video and audio posts? Thanks!
Technical SEO | | ThridHour0 -
Why is either Rogerbot or (if it is the case) Googlebots not recognizing keyword usage in my body text?
I have a client that does liposuction as one of their main services. They have been ranked in the top 1-5 for their keywords ("sarasota liposuction" and variations of those words) for a long time, and have suddenly dropped about 10-12 places, down to #15 in the engine. I went to investigate this and came to the on-page analysis tool in SEOmoz Pro, where oddly enough it says that there is no mention of the target keyword in the body content (on-page analysis tool screenshot attached). I didn't quite understand why it would not recognize the obvious keywords in the body text, so I went back to the page and inspected further. The keywords are wrapped in a featured link that points to an internally hosted keyword glossary with definitions of terms people might not know; the definitions pop up in a lightbox when you click the keyword (lightbox screenshots attached). I have no idea why Google would not recognize these words, since the text is right there inside the link, yet if there is something wrong with the code syntax it might possibly hinder the engine from seeing the body text of the link? Any help would be greatly appreciated! Thank you so much!
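One quick diagnostic here is to pull the raw HTML and confirm the keyword really appears as text (anchor text inside a link still counts as body text); if the words are only injected when JavaScript builds the lightbox, a tool that reads the raw HTML would never see them. A rough sketch with Python's standard library (the URL and keyword below are placeholders):

```python
import re
from urllib.request import Request, urlopen

def keyword_in_raw_html(url, keyword):
    """Check whether a keyword appears as text in the raw HTML,
    i.e. what a crawler sees before any JavaScript runs."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (compatible; check)"})
    html = urlopen(req).read().decode("utf-8", errors="replace")
    # Drop scripts/styles, then strip tags so only visible text remains.
    text = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    text = " ".join(text.split())  # normalize whitespace across removed tags
    return keyword.lower() in text.lower()

# keyword_in_raw_html("https://example.com/liposuction", "sarasota liposuction")
```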
Technical SEO | | jbster130