What kind of data storage and processing is needed
-
Hi,
So after reading a few posts here I have realised it is a big deal to crawl the web and index all the links.
For that, I appreciate seomoz.org's efforts.
I was wondering what kind of infrastructure they might need to get this done?
cheers,
Vishal
-
Thank you so much Kate for the explanation. It is quite helpful to better understand the process.
-
Hi vishalkhialani!
I thought I would answer your question with some detail that might satisfy your curiosity (although I know more detailed blog posts are in the works).
For Linkscape:
At the heart of our architecture is our own column-oriented data store - much like Vertica, although far more specialized for our use case - particularly in terms of the optimizations around compression and speed.
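Column stores lean heavily on encoding tricks to get that compression and speed. As a purely illustrative sketch (SEOmoz's actual store is C++ and far more sophisticated), here is one such trick in Python - delta-encoding a sorted column of integer IDs so the values become small and highly compressible:

```python
def delta_encode(sorted_ids):
    """Encode a sorted column of integer IDs as the first value
    followed by successive deltas; small deltas compress far
    better than the raw values."""
    if not sorted_ids:
        return []
    out = [sorted_ids[0]]
    out.extend(b - a for a, b in zip(sorted_ids, sorted_ids[1:]))
    return out

def delta_decode(deltas):
    """Reverse the encoding by accumulating the deltas."""
    out = []
    total = 0
    for d in deltas:
        total += d
        out.append(total)
    return out
```

A general-purpose compressor (or a varint encoding) would then be applied to the small delta values.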
Each month we crawl between 1-2 petabytes of data, strip out the parts we care about (links, page attributes, etc.), and then compute a link graph of how all those sites link to one another (typically between 40-90 billion URLs). We then calculate our metrics from those results. Once we have all of that, we precompute lots of views of the data, which is what gets displayed in Open Site Explorer or retrieved via the Linkscape API. The resulting views of the data come to over 12 terabytes (and this is all compressed raw text, so it is a LOT of information). Making this fast and scalable is certainly a challenge.
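To make the link-graph step concrete, here is a toy Python sketch of building an adjacency list from extracted (source, target) links and running a PageRank-style power iteration over it. This is a generic illustration of the idea, not SEOmoz's actual metrics code, which operates at a scale where none of this would fit in memory:

```python
from collections import defaultdict

def build_link_graph(edges):
    """Build an adjacency list from (source_url, target_url) pairs
    extracted from crawled pages."""
    graph = defaultdict(set)
    for src, dst in edges:
        graph[src].add(dst)
    return graph

def simple_rank(graph, iterations=20, damping=0.85):
    """Toy PageRank-style power iteration over the link graph."""
    nodes = set(graph) | {d for dsts in graph.values() for d in dsts}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1 - damping) / n for node in nodes}
        # each page splits its rank evenly among its outlinks
        for src, dsts in graph.items():
            if dsts:
                share = damping * rank[src] / len(dsts)
                for d in dsts:
                    new_rank[d] += share
        # rank from dangling pages (no outlinks) is spread evenly
        dangling = sum(rank[node] for node in nodes if not graph.get(node))
        for node in nodes:
            new_rank[node] += damping * dangling / n
        rank = new_rank
    return rank
```

At 40-90 billion URLs, the same computation has to be sharded across many machines and streamed from disk, which is where the 40-60 processing instances come in.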
For the crawling, we operate 10-20 boxes that crawl all the time.
For processing, we spin up between 40-60 instances to create the link graph, metrics and views.
And the API serves the index from S3 (Amazon's cloud storage) with 150-200 instances (this was only 10 a year ago, so we are seeing a lot of growth). All of this is Linux and C++ (with some Python thrown in here and there).
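As a rough illustration of serving precomputed views out of object storage, here is a minimal Python sketch where a dict stands in for S3 and each view is stored as a compressed JSON blob. The key layout and class names are invented for the example; the real API is described above only as serving the index from S3:

```python
import json
import zlib

class ViewStore:
    """Toy stand-in for an object store (like S3) holding
    precomputed, compressed per-URL views."""

    def __init__(self):
        self._objects = {}  # key -> compressed bytes

    def put_view(self, url, view):
        # store each precomputed view under a predictable key
        key = f"views/{url}"
        payload = json.dumps(view).encode("utf-8")
        self._objects[key] = zlib.compress(payload)

    def get_view(self, url):
        # API lookup: fetch the blob, decompress, parse
        blob = self._objects.get(f"views/{url}")
        if blob is None:
            return None
        return json.loads(zlib.decompress(blob).decode("utf-8"))
```

The appeal of this pattern is that the heavy lifting happens once, at precompute time; the serving tier only does cheap key lookups, which is why it scales by simply adding instances.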
For custom crawl:
We use similar crawling algorithms to Linkscape, only we keep the crawls per site and also compute issues (like which pages are duplicates of one another). Each of those crawls is then processed and precomputed so it can be served quickly and easily within the web app (so calculating the aggregates and deltas you see in the overview sections).
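One common, generic way to detect near-duplicate pages is to compare word-shingle overlap between pages. The sketch below is an illustration of that idea in Python, not SEOmoz's actual algorithm (at scale this is usually done with hashing/MinHash rather than exact set comparison):

```python
def shingles(text, k=5):
    """Return the set of k-word shingles of a page's visible text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k])
            for i in range(max(1, len(words) - k + 1))}

def find_duplicates(pages, threshold=0.9):
    """Flag page pairs whose shingle sets overlap above a
    Jaccard-similarity threshold. `pages` maps url -> text."""
    items = [(url, shingles(text)) for url, text in pages.items()]
    dupes = []
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            (u1, s1), (u2, s2) = items[i], items[j]
            union = s1 | s2
            if union and len(s1 & s2) / len(union) >= threshold:
                dupes.append((u1, u2))
    return dupes
```

The pairwise comparison here is O(n²); production systems replace the exact sets with compact hash signatures so candidate pairs can be found without comparing every page against every other.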
We use S3 for archiving all old crawls, Cassandra for some of the details you see in detailed views, and the web app's database serves a lot of the overviews and aggregates.
Most of the code here is Ruby, except for the crawling and issue processing which is C++. All of it runs on Linux.
Hope that helps explain! Definitely let me know if you have more questions though!
Kate -
It is nowhere near that many. I attached an image of when I saw Rand moving the server to the new building. I think this may be the reason why there have been so many issues with the Linkscape crawl recently.
-
@keri and @Ryan
I will ask them. My guess is around a thousand server instances.
-
Good answer from Ryan, though I caution that even then you may not get a direct answer. It might be similar to asking Google just how many servers they have. SEOmoz is fairly open with information, but that may be a bit beyond the scope of what they are willing to answer.
-
A question of this nature would probably be best as your one private question per month. That way you will be sure to receive a direct reply from an SEOmoz staff member. You could also try the help desk, but it may be a stretch.
All I can say is that it takes tremendous amounts of resources. Google does it very well, but we all know they generate over $30 billion in revenue annually.
There are numerous crawl programs available, but the problem is the server hardware to run them.
I am only responding because I think your question may otherwise go unanswered and I wanted to point you in a direction where you can receive some info.