What kind of data storage and processing is needed
-
Hi,
So after reading a few posts here I have realised it is a big deal to crawl the web and index all the links.
For that, I appreciate seomoz.org's efforts.
I was wondering what kind of infrastructure they might need to get this done?
cheers,
Vishal
-
Thank you so much, Kate, for the explanation. It is quite helpful for better understanding the process.
-
Hi vishalkhialani!
I thought I would answer your question with some detail that might satisfy your curiosity (although I know more detailed blog posts are in the works).
For Linkscape:
At the heart of our architecture is our own column-oriented data store - much like Vertica, although far more specialized for our use case, particularly in terms of the optimizations around compression and speed.
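To make the column-oriented idea concrete, here is a toy Python sketch (illustrative only - our actual store is custom C++, and the record layout here is invented). Each column is packed and compressed on its own, so a query that needs one attribute decompresses far less data than a row store would:

```python
import struct
import zlib

# Hypothetical records: (url_id, link_count) pairs.
rows = [(1, 42), (2, 7), (3, 99)]

# Column-oriented layout: pack each attribute separately, then compress
# that column as a unit.
url_id_col = zlib.compress(b"".join(struct.pack("<Q", u) for u, _ in rows))
link_count_col = zlib.compress(b"".join(struct.pack("<I", c) for _, c in rows))

# A query that only needs link counts touches only that column's bytes.
raw = zlib.decompress(link_count_col)
counts = [struct.unpack_from("<I", raw, i * 4)[0] for i in range(len(rows))]
print(counts)  # [42, 7, 99]
```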
Each month we crawl between 1 and 2 petabytes of data, strip out the parts we care about (links, page attributes, etc.), compute a link graph of how all those sites link to one another (typically between 40 and 90 billion URLs), and then calculate our metrics using those results. Once we have all of that, we precompute lots of views of the data, which is what gets displayed in Open Site Explorer or retrieved via the Linkscape API. The resulting views of the data total over 12 terabytes (and this is all compressed raw text, so it is a LOT of information). Making this fast and scalable is certainly a challenge.
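As a rough illustration of the metrics step, here is a toy PageRank-style power iteration in Python. The graph and parameters are made up, and the production computation is distributed across many machines, but link metrics like these belong to this general family of algorithms:

```python
# Hypothetical link graph: page -> pages it links to.
graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
damping = 0.85
rank = {page: 1.0 / len(graph) for page in graph}

# Power iteration: each page passes a damped share of its rank
# along its outgoing links until the scores settle.
for _ in range(50):
    new_rank = {page: (1 - damping) / len(graph) for page in graph}
    for page, outlinks in graph.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

print({page: round(score, 3) for page, score in rank.items()})
```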
For the crawling, we operate 10-20 boxes that crawl all the time.
For processing, we spin up between 40-60 instances to create the link graph, metrics and views.
And the API serves the index from S3 (Amazon's cloud storage) with 150-200 instances (this was only 10 a year ago, so we are seeing a lot of growth). All of this is Linux and C++ (with some Python thrown in here and there).
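For a sense of what serving a precomputed view out of S3 looks like, here is a minimal Python sketch using boto3. The bucket name and key-sharding scheme are hypothetical, and the real API servers are C++, so this only illustrates the pattern of fetching precomputed blobs by key:

```python
import boto3

s3 = boto3.client("s3")

def fetch_view(url_id: int) -> bytes:
    # Hypothetical layout: views sharded by a two-hex-digit url_id prefix.
    key = f"views/{url_id % 256:02x}/{url_id}.bin"
    obj = s3.get_object(Bucket="example-linkscape-views", Key=key)
    return obj["Body"].read()
```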
For custom crawl:
We use similar crawling algorithms to Linkscape, only we keep the crawls per site, and we also compute issues (like which pages are duplicates of one another). Then each of those crawls is processed and precomputed to be served quickly and easily within the web app (calculating the aggregates and deltas you see in the overview sections).
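As a simple illustration of duplicate detection (the data below is made up, and this exact-hash approach is just one common technique, not necessarily ours): hash the normalized page text and group any URLs whose digests collide:

```python
import hashlib
from collections import defaultdict

# Hypothetical crawled pages: url -> extracted text.
pages = {
    "http://example.com/a": "Welcome to our store",
    "http://example.com/a?ref=1": "Welcome  to our store",
    "http://example.com/b": "Contact us",
}

groups = defaultdict(list)
for url, text in pages.items():
    # Normalize whitespace and case so trivial variants hash identically.
    normalized = " ".join(text.lower().split())
    groups[hashlib.sha1(normalized.encode()).hexdigest()].append(url)

duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # the two /a variants are grouped as duplicates
```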
We use S3 for archiving all old crawls, Cassandra for some of the details you see in detailed views, and the web app database to serve a lot of the overviews and aggregates.
Most of the code here is Ruby, except for the crawling and issue processing, which are C++. All of it runs on Linux.
Hope that helps explain! Definitely let me know if you have more questions though!
Kate -
It is nowhere near that many. I attached an image of when I saw Rand moving the server to the new building. I think this may be the reason why there have been so many issues with the Linkscape crawl recently.
-
@keri and @Ryan
Will ask them. My guess is around a thousand server instances.
-
Good answer from Ryan, though I'd caution that even then you may not get a direct answer. It might be similar to asking Google just how many servers they have. SEOmoz is fairly open with information, but that may be a bit beyond the scope of what they are willing to answer.
-
A question of this nature would probably be best as your one private question per month. That way you will be sure to receive a direct reply from a SEOmoz staff member. You could also try the help desk, but it may be a stretch.
All I can say is it takes tremendous amounts of resources. Google does it very well, but we all know they generate over $30 billion in revenue annually.
There are numerous crawl programs available, but the problem is the server hardware needed to run them.
I am only responding because I think your question may otherwise go unanswered and I wanted to point you in a direction where you can receive some info.