Thinking aloud - what if WE could run rogerbot from our desktops?
-
Total, total noob question, I know - but is rogerbot performance-bound by bandwidth and processing capacity? I understand if it is, but for those of us with very large sites, I'm wondering if we could offload the burden from SEOmoz's resources by running our own locally licensed version of rogerbot, crawling the sites we want, and then uploading the data to SEOmoz for analysis.
If this were possible, would we get more immediate results?
-
On the topic of a private crawl (or distributed crawl): these are cool ideas, but not something we currently have in our plans. Keeping the crawl centralized allows us to store historical data and ensure polite crawling. This may take a little extra time (we are indeed doing a lot of crawls, as well as processing them and retrieving link data for each), but we are actively working on our infrastructure to reduce our crawling and processing time.
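(Rogerbot's actual implementation isn't public, so this is just an illustrative sketch: one core ingredient of "polite crawling" is enforcing a minimum delay between requests to the same host, which a centralized crawler can coordinate globally.)

```python
import time
from urllib.parse import urlparse

class PoliteScheduler:
    """Tracks the last fetch time per host and enforces a minimum
    delay between requests to the same host."""

    def __init__(self, delay_seconds=10.0):
        self.delay = delay_seconds
        self.last_fetch = {}  # host -> timestamp of last request

    def wait_time(self, url, now=None):
        """Seconds to wait before this URL may be fetched politely."""
        now = time.time() if now is None else now
        host = urlparse(url).netloc
        last = self.last_fetch.get(host)
        if last is None:
            return 0.0  # host never fetched: go immediately
        return max(0.0, self.delay - (now - last))

    def record_fetch(self, url, now=None):
        """Mark this URL's host as just fetched."""
        now = time.time() if now is None else now
        self.last_fetch[urlparse(url).netloc] = now
```

A distributed or desktop-run crawler would have to share this per-host state across every participant to stay polite, which is part of why centralizing it is simpler.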
While the first crawl does take a number of days, subsequent crawls are started on the same day each week, and should take roughly the same amount of time to complete, controlling for external factors. So in general you should have fresh crawl data right around weekly, give or take a day or two.
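(To make the cadence concrete, here's a hypothetical helper - not anything from SEOmoz's actual scheduler - that computes the next weekly crawl date given the day your first crawl started.)

```python
from datetime import date, timedelta

def next_crawl_date(first_crawl: date, today: date) -> date:
    """Next occurrence of first_crawl's weekday on or after today,
    assuming crawls recur weekly on the same day."""
    days_ahead = (first_crawl.weekday() - today.weekday()) % 7
    return today + timedelta(days=days_ahead)
```

For example, a crawl that kicked off on a Monday would next be scheduled for the coming Monday, regardless of when you check.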
As for your specific crawls, I'd be happy to look into them for you. I'll send you a separate email to discuss.
-
Still waiting for my custom crawl, launched on April 28th (it's May 2nd), to complete... seriously? If SEOmoz is overwhelmed, then first of all, congratulations on being so popular - but I am not getting any timely data at all.
-
Or, instead of running rogerbot locally, we could run some distributed processing (e.g., BOINC) to help offload some of the pressure from your cloud?