What kind of data storage and processing is needed
-
Hi,
So after reading a few posts here I have realised it is a big deal to crawl the web and index all the links.
For that, I appreciate seomoz.org's efforts.
I was wondering what kind of infrastructure they might need to get this done?
cheers,
Vishal
-
Thank you so much Kate for the explanation. It is quite helpful to better understand the process.
-
Hi vishalkhialani!
I thought I would answer your question with some detail that might satisfy your curiosity (although I know more detailed blog posts are in the works).
For Linkscape:
At the heart of our architecture is our own column-oriented data store - much like Vertica, although far more specialized for our use case - particularly in terms of the optimizations around compression and speed.
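To give a flavour of the kind of compression trick column stores lean on (a toy sketch only, in Python for readability - the real store is C++ and far more involved): if a column of integer IDs is kept sorted, you can store the small gaps between consecutive values instead of the raw values, and those gaps compress far better.

```python
# Toy illustration only (not Linkscape's actual code): delta-encoding a
# sorted column of integer IDs. Small gaps compress far better than the
# raw 64-bit values, which is one reason column stores get good ratios.

def delta_encode(sorted_ids):
    """Replace each value with the gap from its predecessor."""
    deltas, prev = [], 0
    for v in sorted_ids:
        deltas.append(v - prev)
        prev = v
    return deltas

def delta_decode(deltas):
    """Rebuild the original values by running-summing the gaps."""
    out, total = [], 0
    for d in deltas:
        total += d
        out.append(total)
    return out
```

A general-purpose compressor (gzip, LZ4, etc.) applied on top of the delta stream then does the heavy lifting.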
Each month we crawl between 1 and 2 petabytes of data, strip out the parts we care about (links, page attributes, etc.), compute a link graph of how all those sites link to one another (typically between 40 and 90 billion URLs), and then calculate our metrics from those results. Once we have all of that, we precompute lots of views of the data, which is what gets displayed in Open Site Explorer or retrieved via the Linkscape API. The resulting views of the data total over 12 terabytes (and that is all compressed raw text, so it is a LOT of information). Making this fast and scalable is certainly a challenge.
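To make the "calculate our metrics" step concrete, here is a hypothetical miniature of the family of computation involved: a PageRank-style iteration over a tiny in-memory link graph. This is not mozRank's actual formula or code, just the general shape of a link-graph metric, in Python for readability.

```python
# Hypothetical sketch of a link-graph metric: classic damped PageRank
# iteration over an adjacency-list graph {node: [outlinked nodes]}.

def pagerank(graph, damping=0.85, iters=50):
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, outlinks in graph.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all nodes.
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
            else:
                share = damping * rank[n] / len(outlinks)
                for m in outlinks:
                    new[m] += share
        rank = new
    return rank
```

At Linkscape scale the same idea runs as a distributed batch job over tens of billions of URLs rather than a Python dict, but the iterate-until-stable structure is the same.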
For the crawling, we operate 10 to 20 boxes that crawl all the time.
For processing, we spin up between 40 and 60 instances to create the link graph, metrics, and views.
And the API serves the index from S3 (Amazon's cloud storage) with 150 to 200 instances (this was only 10 a year ago, so we are seeing a lot of growth). All of this is Linux and C++ (with some Python thrown in here and there).
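The per-box crawl work itself boils down to a frontier loop. A hypothetical minimal sketch (the fetch function is injected so the example runs offline; a real crawler would do HTTP GETs here and add politeness delays, robots.txt checks, and retries):

```python
# Hypothetical sketch of a crawler's core loop: breadth-first traversal
# with a seen-set so each URL is fetched at most once.
from collections import deque

def crawl(seed, fetch_links, max_pages=100):
    """fetch_links(url) -> list of outlinked URLs (injected for testability)."""
    seen, frontier, order = {seed}, deque([seed]), []
    while frontier and len(order) < max_pages:
        url = frontier.popleft()
        order.append(url)  # in a real crawler: fetch, parse, store the page
        for link in fetch_links(url):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return order
```

The crawler fleet is essentially many instances of this loop, sharded across the URL space, feeding pages into the processing pipeline.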
For custom crawl:
We use crawling algorithms similar to Linkscape's, only we keep the crawls per site and also compute issues (like which pages are duplicates of one another). Each of those crawls is then processed and precomputed so it can be served quickly and easily within the web app (that is where the aggregates and deltas you see in the overview sections get calculated).
We use S3 to archive all old crawls, Cassandra for some of the details you see in the detailed views, and the web app's database to serve most of the overviews and aggregates.
Most of the code here is Ruby, except for the crawling and issue processing which is C++. All of it runs on Linux.
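As a rough illustration of the duplicate computation, here is a hypothetical sketch that catches exact duplicates by hashing normalized page text (Python rather than the real C++; real issue processing would also catch near-duplicates with techniques like shingling or simhash):

```python
# Hypothetical sketch: group URLs whose normalized body text hashes
# identically. Catches exact duplicates only.
import hashlib
from collections import defaultdict

def duplicate_groups(pages):
    """pages: {url: text}. Returns lists of URLs with identical content."""
    buckets = defaultdict(list)
    for url, text in pages.items():
        # Collapse whitespace and case so trivial differences don't matter.
        normalized = " ".join(text.split()).lower()
        digest = hashlib.sha1(normalized.encode()).hexdigest()
        buckets[digest].append(url)
    return [sorted(group) for group in buckets.values() if len(group) > 1]
```

Hashing keeps the comparison linear in the number of pages instead of quadratic, which matters once a site crawl runs to millions of URLs.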
Hope that helps explain! Definitely let me know if you have more questions though!
Kate -
It is nowhere near that many. I attached an image from when I saw Rand moving the server to the new building. I think this may be the reason why there have been so many issues with the Linkscape crawl recently.
-
@keri and @Ryan
I will ask them. My guess is around a thousand server instances.
-
Good answer from Ryan, and I caution that even then you may not get a direct answer. It might be similar to asking Google just how many servers they have. SEOmoz is fairly open with information, but that may be a bit beyond the scope of what they are willing to answer.
-
A question of this nature would probably be best used as your one private question per month. That way you will be sure to receive a direct reply from a SEOmoz staff member. You could also try the help desk, but it may be a stretch.
All I can say is that it takes a tremendous amount of resources. Google does it very well, but we all know they generate over $30 billion in revenue annually.
There are numerous crawl programs available; the problem is the server hardware to run them.
I am only responding because I think your question may otherwise go unanswered and I wanted to point you in a direction where you can receive some info.
Related Questions
-
Need your Opinion on Bounce Rate Analysis
I'm currently doing a bounce rate analysis for our resource pages. These are information article pages - a mix of plain text and those containing images, infographics, videos, or even podcasts. By the way, I did search for bounce rate topics here, but I felt like I still needed to post this. Unless I've overlooked a similar post, my apologies. It's a first for me to do an in-depth BR analysis, so I need to clarify a few things. What is a good or bad bounce rate range? Is there even a range comparison? Like, when can you say a bounce rate is high for an information-type page? I've read some stuff online but it's confusing. What other Analytics factors should I consider looking at together with bounce rate? For pages (which purposely educate visitors) with a high bounce rate, can you suggest tips to improve it? I would appreciate and value any advice. Thanks a lot!
Reporting & Analytics | | ktrich1 -
What's more accurate? GA queries data or Moz/SEMRush keyword data for rankings
What do you guys think? What's more accurate? GA queries data or Moz/SEMRush keyword data for rankings? Any thoughts appreciated.
Reporting & Analytics | | znotes0 -
Need advice on setting up primary domain and shopify site analytics to work best together
Hello, I have a client that I have been working on their primary site for the last year or so. In the last month they decided to have one of their internal employees setup a small shopify store. Now they are asking for the analytics tracking codes for it. My question for you is what would be the best way for me to set that up? variables: primary domain and shopify domain, google and bing analytics Have been looking at how cross domain tracking works (https://support.google.com/tagmanager/answer/6106951), and the instructions for setting up ecommerce in analytics for shopify (https://help.shopify.com/manual/reports-and-analytics/google-analytics/google-analytics-setup). But am still not 100% which route would be the best, any input would be greatly appreciated! thank you, Dustin
Reporting & Analytics | | pastedtoast1 -
How long does Google Analytics store data?
Hello All, How long does Analytics keep the data for one website? At least two years? At least 25 months? Other? I guess that they guarantee at least 25 months, but it might be more. Does anyone have any other suggestions? Thanks,
Reporting & Analytics | | CommercePundit0 -
Transferring of analytic data
Hey SEOMoz community, Question, we just purchased a business and while we didn't keep their website content we acquired their URLs and will be rebuilding the site. We've asked for and been denied any historical analytic data e.g. they wont transfer admin rights over to us. Is there anyway to access historical data without being made an admin? or do we start from ground zero. One of the reasons we're being given for not being allowed access is that "Google analytics and the associated keywords are the vendor's intellectual property" - given that we brought the brand doesn't that IP transfer over to us anyways? Thanks, PC
Reporting & Analytics | | PC-QSG
(Long time forum stalker, first time poster)0 -
No data available for example.com in WMT. What to do?
Hi, Our problem is simple: we have statistics data for www.example.com, but some data is missing for example.com (e.g. "links to your site", "structured data", "html improvements"). However, "search queries", "index status" and some other data is available for example.com. The problem is that we have over 5000 subdomains and we see no information about them (especially links pointing to them). We followed every piece of advice given by Google, but it doesn't seem to work: -Adding www.example.com and example.com in WMT -Setting www.example.com as the preferred domain -Using DNS verification to verify our site What do we have to do? Thank you, Axello
Reporting & Analytics | | axello0 -
Transfer Google Analytics data from one user to another?
Long story short, two weeks of analytics data are stored in a different profile from the rest of the year's data. Is there a way to export all data from a date range and then import it into another account? Thanks for your help.
Reporting & Analytics | | AmericanOutlets0 -
Analytics giving crazy impossible data?
When I look at my Analytics using any of my segments, they don't work. A segment shows zero visits until April 30th, then its visitors shoot up above the number for all visits! Anyone else experiencing this bizarre data?
Reporting & Analytics | | mascotmike0