A suggestion to help with Linkscape crawling and data processing
-
Since you guys are understandably struggling to crawl and process the sheer number of URLs and links, I came up with this idea:
In a similar way to how SETI@home works (is that still a thing? Google says yes: http://setiathome.ssl.berkeley.edu/), could SEOmoz use distributed computing among SEOmoz users to help with the data processing? Would people be happy to offer up their idle processor time and (optionally) internet connections to get more accurate, broader data?
Are there enough users of the data to make distributed computing worthwhile?
Perhaps those who crunched the most data each month could receive moz points or a free month of Pro.
I have submitted this as a suggestion here:
http://seomoz.zendesk.com/entries/20458998-crowd-source-linkscape-data-processing-and-crawling-in-a-similar-way-to-seti-home -
Sean - I share Rand's sentiments, thanks so much for the suggestion!
We have considered distributed crawling in the past (and even distributed rank checking, since that would run in each user's locale), but it comes with a whole different set of challenges. For example, you have to handle all the edge cases: what if a user's computer isn't on, or loses connectivity? What if we crawl too fast and the user gets blocked from a site? How do you write all that data back securely?
Of course, all of these concerns can be overcome, but right now we feel we have a good handle on the problems, and it will be much faster for us to just fix what we have.
That said, all of us are so appreciative of the ideas and support, and we will have something really great soon!
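As an illustration of the first edge case mentioned above (a volunteer's machine that goes offline mid-task), distributed work queues commonly hand out URLs on a lease and re-issue them if the worker never reports back before the lease expires. This is a toy sketch of that idea, not Moz's actual design; all class and method names are hypothetical:

```python
import time
from collections import deque

class LeaseQueue:
    """Toy work queue showing how a distributed crawler could
    re-issue URLs when a volunteer's machine goes offline.
    Hypothetical illustration, not any real Moz component."""

    def __init__(self, lease_seconds=30.0, clock=time.monotonic):
        self.lease_seconds = lease_seconds
        self.clock = clock          # injectable for testing
        self.pending = deque()      # URLs nobody is working on
        self.leased = {}            # url -> lease expiry time
        self.done = set()           # URLs whose results came back

    def add(self, url):
        if url not in self.done:
            self.pending.append(url)

    def checkout(self):
        """Hand a URL to a worker; reclaim expired leases first."""
        now = self.clock()
        for url, expiry in list(self.leased.items()):
            if expiry <= now:       # worker vanished: put it back
                del self.leased[url]
                self.pending.append(url)
        if not self.pending:
            return None
        url = self.pending.popleft()
        self.leased[url] = now + self.lease_seconds
        return url

    def complete(self, url):
        """Worker reported results before its lease expired."""
        if url in self.leased:
            del self.leased[url]
            self.done.add(url)
```

The same lease table is a natural place to hang the other concerns the reply raises, such as per-domain politeness delays so no single volunteer hammers one site.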
-
Thanks a ton, Sean! We have considered distributed computing as a way to help crawl, index, process, etc. It's flattering and humbling to hear that you'd be willing to help out, and that the community would, too.
For now, we believe we can get to the index size/quality/freshness using our hosted system, but the engineering team will certainly be encouraged to hear that folks in our community might contribute to this. Distributed systems present their own challenges, and we'd have to write that code from scratch, but if we find that we can't do what we want with our existing network, we might reach out.
BTW - I wanted to let folks know that the team here feels very confident that come December/January, we're going to be producing indices that meet an exceptional quality bar. The problems we face are largely known, and we now have the team and the solutions to tackle them, so we're pretty excited.
Related Questions
-
Site Crawl 4xx Errors?
Hello! When I check our website's critical crawler issues with Moz Site Crawler, I'm seeing over 1000 pages with a 4xx error. All of the pages showing a 4xx error appear to be the brand and product pages on our website, but with /URL at the end of each permalink. For example, we have a page on our site for a brand called Davinci. The URL is https://kannakart.com/davinci/. In the site crawler, I'm seeing the 4xx for this URL: https://kannakart.com/davinci/URL. Could a plugin on our site be generating these URLs? If they're going to be an issue, I'd like to remove them. However, I'm not sure exactly where to begin. Thanks in advance for the help, -Andrew
Moz Pro | mostcg
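One way to confirm a systematic pattern like this (rather than a thousand unrelated broken links) is to tally the final path segment of every 4xx URL in the crawl export: if a single segment such as URL dominates, a theme or plugin template is almost certainly appending it to every permalink. A minimal stdlib-only sketch, a hypothetical helper rather than any official Moz tool:

```python
from collections import Counter
from urllib.parse import urlparse

def suffix_histogram(error_urls):
    """Tally the final path segment of each erroring URL.

    One dominant segment (e.g. 'URL') points to a template or
    plugin appending it to every permalink, rather than a pile
    of unrelated broken links."""
    counts = Counter()
    for url in error_urls:
        path = urlparse(url).path.rstrip("/")
        last = path.rsplit("/", 1)[-1] or "(root)"
        counts[last] += 1
    return counts
```

Feeding this the 4xx column from the crawl export should make a pattern like /URL stand out immediately, which then tells you which template to inspect.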
Lag time between MOZ crawl and report notification?
I did a lot of work to one of my sites last week and eagerly awaited this week's MOZ report to confirm that I had achieved what I was trying to do, but alas I still see the same errors and warnings in the latest report. This was supposedly generated five days AFTER I made the changes, so why are they not apparent in the new report? I am mainly referring to missing metadata, long page titles, duplicate content and duplicate title errors (due to crawl and URL issues). Why would the new crawl not have picked up that these have been corrected? Does it rely on some other crawl having updated (e.g. Google or Bing)?
Moz Pro | Gavin.Atkinson
Problems with Crawl Diagnostics/New Campaign
Hi all I added a new domain to my campaigns yesterday and got the usual message saying we will have a small set of results ready in a couple of hours, and the rest will be done in a week or so. 24 hours later, this same message is still showing. Anyone else experiencing this or is it just related to my domain? Many thanks. Carl
Moz Pro | GrumpyCarl
Crawl diagnostic Notices for rel Canonical increased
Hello, We just signed up for SEOmoz and are reviewing the results of our second web crawl. Our Errors and Warnings summaries have been reduced, but our Notices for Rel Canonical have skyrocketed from 300 to over 5,500. We are using WP with the Headway theme, and our pages already have rel=canonical along with rel=author. Any ideas why this number would go up so much in one week? Thank you, Michael
Moz Pro | MKaloud
This Rookie needs help! Duplicate content pages dropped significantly.
So I am pretty new to SEOmoz. I have an e-commerce site and recently did a website redesign, though not without several mistakes and issues. That said, when SEOmoz crawled my site, the results showed A LOT of duplicate content pages due to my having one item in many variations. It was almost overwhelming, and because the number of pages was so high, I have been researching ways to correct it quickly. The latest crawl from yesterday shows a drastic drop in the number of duplicate content pages and a slight increase in pages with too-long page titles (which is fixable). I am embarrassed to give the number of duplicate pages that were showing, but just know it's been reduced to a third of the amount. I am just wondering if I missed something, and should I be happy or concerned? Has there been a change that could have caused this? Thanks for helping this rookie out!
Moz Pro | AvenueSeo
Using the 'Organic Traffic Data' feature
I don't have access to the client's Google Analytics password. Can you set it up without being redirected to Google? Is there a way to insert a simple code snippet into the website and then have SEOmoz verify the code? Thanks so much
Moz Pro | Robin_Jennings
MOZ Crawler only crawling one page per campaign
We set up some new campaigns, and now for the last two weekly crawls, the crawler is only accessing one page per campaign. Any ideas why this is happening? PS - two weeks back we did "upgrade" the account. Could this have been an issue?
Moz Pro | AllaO
Why does my crawl report show just one page result?
I just ran a crawl report on my site, http://dozoco.com. The report shows results for just one page, the home page, and no other pages. It doesn't indicate any errors or "do not follows," so I'm unclear on the issue, although I suspect user error - mine.
Moz Pro | b1lyon
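When a crawl stops at the homepage, a common first diagnostic is to check whether that page actually exposes any followable internal links (robots.txt and a meta robots nofollow being the other usual suspects). The sketch below is a hypothetical helper, not something from this thread: it extracts same-site links and skips rel="nofollow" anchors using only the standard library:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect followable same-site <a href> targets from a page.
    If a homepage yields none, a crawler has nowhere to go past
    page one, which matches the single-page report symptom."""

    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.host = urlparse(base_url).netloc
        self.internal = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href or "nofollow" in (a.get("rel") or ""):
            return                      # missing href or nofollowed
        absolute = urljoin(self.base, href)
        if urlparse(absolute).netloc == self.host:
            self.internal.append(absolute)

def internal_links(base_url, html):
    parser = LinkCollector(base_url)
    parser.feed(html)
    return parser.internal
```

Run against the homepage's HTML source: an empty result means the links are injected by JavaScript or nofollowed, either of which would explain a one-page crawl.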