Siloing Architecture with Large Inventories
-
Hi!
Question...
How do you maintain a siloed site architecture while working with very large inventories/a huge selection of products? I am trying to keep things somewhat close to the root domain, but with all the products I find myself building sub-folders sometimes four levels off the root. For instance, I am working on a motorcycle niche and my structure planning is coming out a little like this...
www.motorcyclesite.com/honda/cbr1000rr/2004-2005/product1.html
This is as far as it goes, but I've always been of the opinion that making Google crawl deeper and deeper to finally reach a product is a major no-no. Keep it organized, siloed, and as close to the root as possible (i.e. motorcyclesite.com/honda/product1.html).
Any suggestions for larger inventories like mine?
Thanks for all that you do!
Mike
-
If your homepage links to 100 category pages, each category page links to 100 subcategory pages, and each subcategory page links to 100 products, that gives you 100 × 100 × 100 = 1,000,000 products, none deeper than three clicks from home.
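The general arithmetic behind this: with b links per page and d levels of hierarchy, up to b^d products sit within d clicks of home, so the minimum depth needed for an inventory of N products is

d_{\min} = \lceil \log_{b} N \rceil

Even a conservative 50 links per level reaches 125,000 products in three clicks and over six million in four, so a structure four levels deep like yours is well within a comfortable crawl depth.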
Related Questions
-
Google & Site Architecture
Hi, I've been reading the following article about Google's quality signals: https://searchenginewatch.com/2016/10/10/guide-to-google-ranking-signals-part-6-trust-authority-and-expertise/?utm_source=Search+Engine+Watch&utm_campaign=464594db7c-11_10_2016_NL&utm_medium=email&utm_term=0_e118661359-464594db7c-17828341 It mentions: "3) All your categories should be accessible from the main menu. All your web pages should be labelled with the relevant categories." Is this every category? We have some that are, say, three levels deep, and they aren't all in the menu. I'd like them to be, so it would be good to make a case for it. Thank you
Algorithm Updates | BeckyKey
-
Quickest way to deindex a large number of pages
Our site was recently hacked by spammers posting fake content and bringing down our servers, etc. After a few months, we finally figured out what was going on and fixed the issue. However, it turns out that Google has indexed 26K+ spammy pages and we've lost page rank and search engine rankings as a result. What is the best and fastest way to get these pages out of Google's index?
Algorithm Updates | powpowteam
-
Is it better to build one large site that covers many verticals or many sites, each dedicated to one vertical?
Just wondering, from an SEO perspective: is it better to build one large site that covers many verticals, or to build out many sites, one for each vertical?
Algorithm Updates | tlhseo
-
Google Panda - large domain benefits
Hi, A bit of a general question, but has anyone noticed an improvement in rankings for large domains, i.e. well-known, large sites such as Tesco and Amazon? From what I've seen, the latest Panda update seems to favour the larger sites as opposed to smaller, niche sites. Just wondered if anyone else has noticed this too? Thanks
Algorithm Updates | Digirank
-
Large number of thin-content pages indexed: does it affect overall site performance?
Hello Community, I have a question on the negative impact of having many virtually identical calendar pages indexed. We have a site for a B2B software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry-related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed-pages number shown in Webmaster Tools to about 54,000. Since then, we "nofollowed" the links on the calendar pages that allow you to view future months, and added noindex meta tags to all future-month pages (beyond 6 months out). Our number of pages indexed seems to be dropping, and is now down to 26,000. When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear. Very few calendar pages show up in this report. So, the question that has been raised is: does a large number of very thin-content pages (basically blank calendar months) in the search index hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites, these pages would cause the performance of this site to drop as well. Thanks for your feedback. Chris
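For what it's worth, the noindex directive described above can also be sent as an HTTP header rather than a meta tag, which is useful when page templates are hard to edit. A minimal Apache sketch, assuming mod_setenvif and mod_headers are enabled; the /calendar/ path is a hypothetical stand-in for the real webinar-schedule URLs, and the six-months-out cutoff would still need application logic:

# Flag requests for the calendar section (hypothetical URL pattern)
SetEnvIf Request_URI "^/calendar/" CALENDAR_PAGE
# Send the same directive as the noindex meta tag described above
Header set X-Robots-Tag "noindex, follow" env=CALENDAR_PAGE

Google treats X-Robots-Tag the same as the equivalent robots meta tag, so either mechanism should keep the blank calendar months out of the index.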
Algorithm Updates | cogbox
-
Long-term plan for a large .htaccess file with 301 redirects
We set up a pretty large .htaccess file in February for a site, involving over 2,000 lines of 301 redirects from old product URLs to new ones. The old URLs still get a lot of traffic from product review sites and other pretty good sites, which we can't change. We are now trying to reduce page load times, and we're ticking all of the boxes apart from the size of the .htaccess file, which seems to be causing a considerable hang on load times. The file is currently 410KB! My question is: what should I do in terms of a long-term strategy, and has anyone come across a similar problem? At the moment I am inclined to remove the 2,000 lines of individual redirects and put in a catch-all whereby anything from the old site goes to the new site's homepage. Example code:

RedirectMatch 301 /acatalog/Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
RedirectMatch 301 /acatalog/Manbi_Wrist_Guards.html /manbi-wrist-guards.html

There is no consistency between the old URLs and the new ones, apart from the fact that they all sit in the subfolder /acatalog/.
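A long-term alternative that keeps every individual redirect, but avoids evaluating 2,000 rules on every request, is mod_rewrite's RewriteMap, which turns the list into a single keyed lookup. A minimal sketch, assuming Apache with mod_rewrite and access to the server or virtual-host config (RewriteMap cannot be declared inside .htaccess); the map file path is a placeholder:

# In httpd.conf / the vhost config, not .htaccess
RewriteEngine On
# The map file holds one "old-file new-url" pair per line, e.g.:
#   Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
#   Manbi_Wrist_Guards.html /manbi-wrist-guards.html
RewriteMap legacyproducts "txt:/etc/apache2/acatalog-redirects.txt"
# Redirect only when the requested /acatalog/ file exists in the map
RewriteCond ${legacyproducts:$1} !=""
RewriteRule ^/acatalog/(.+)$ ${legacyproducts:$1} [R=301,L]

For a map this size, Apache's httxt2dbm utility can compile the text file into a dbm: map for constant-time lookups, and the valuable review-site links keep resolving to the right products instead of all being dumped on the homepage.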
Algorithm Updates | gavinhoman
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not completely against the rules here or anything. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question... "Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them. The site is very large and utilizes faceted navigation to help visitors sift through results. For many months now I have implemented rel=canonical so that each page URL created by the faceted nav filters points back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawler time on all these pages when I am trying to do what they ask and tell them to ignore the filtered URLs and find the content on page x. So at this point I am thinking about using the robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
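If robots.txt does win out here, the usual pattern is to disallow the facet parameters themselves rather than listing individual URLs. A rough sketch with hypothetical parameter names (swap in the real filter keys); one caveat worth weighing: once a URL is blocked, Googlebot can no longer fetch it to see its rel=canonical, so the two mechanisms don't combine on the same pages:

User-agent: *
# Hypothetical faceted-navigation parameters
Disallow: /*?*color=
Disallow: /*?*size=
Disallow: /*?*sort=

This trades the canonical's link-equity consolidation for crawl-budget savings, which is usually the right trade only when the crawler genuinely can't keep up.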
Algorithm Updates | PeteGregory
-
Are the latest ranking reports counting the new large-format sitelinks as positions?
Received my weekly ranking report this morning and noticed that a specific keyword I've been ranking in the 3rd or 4th spot for has dropped a significant number of positions. I tested the results myself, and it appears the manufacturer's sitelinks are being counted as positions. My keyword has me in the 3rd position (although it is much lower on the physical page now because of the new format). I'm really wondering how this will affect organic listings going forward; this new format could be a game changer.
Algorithm Updates | longbeachjamie