Is Siloing still effective in 2018?
-
I've been advised to use siloing (site structure), but I'm now getting conflicting advice saying it's an outdated practice. What is the 2018 verdict?
-
So by siloing content you mean creating categories and subcategories like in a library (reference, fiction, magazines and so on), each with its own subsections. The old way to 'rank' for something was to target keywords with those smaller sections and subsections.
To use the example of dentistry: I might have adult dentistry and paediatric dentistry pages, then under adults I'll have implants, braces and veneers. Then under veneers I might have an article about veneer prices, one about the veneer procedure and one about veneer risks. All of these would link to each other and link upwards in the architecture, and hey presto, I'd eventually rank for one of the top categories like adult dentistry.
The problem with this is that it creates internal competition and conflict. Google doesn't want users having to hop around highly granular subtopics for answers; it would rather serve the answer to a query all in the same place. So instead I now have one single page with everything people need to know about veneers (price, risks, procedure and so on), all in one place.
Now there are further difficulties, because Google will sometimes treat two related things as different 'topics' answering different questions. So I do have a page covering everything about veneers and also a separate page about veneer costs. In the case of veneers, everyone wants to know the cost (it's cost, cost, cost), so cost is its own topic and gets its own page. But for something like root canals, nobody cares how much they cost, they just want to get out of pain, so the root canal cost section sits on the main root canal page because it's included in the topic of 'root canals'.
It's now more about searcher intent (https://moz.com/blog/how-google-gives-us-insight-into-searcher-intent-through-the-results-whiteboard-friday), possibly 'searcher task accomplishment', and also how link equity flows: https://moz.com/blog/harnessing-link-equity
Also read this: https://www.cs.cornell.edu/home/kleinber/pcm.pdf. It's tough going, but just skip over what you don't understand and press on with reading it all; you'll learn a great amount about how Google functions.
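To give a feel for what Kleinberg-style link analysis actually computes: one well-known idea from his work is the hubs-and-authorities (HITS) iteration, where a page's authority comes from being linked to by good hubs, and vice versa. Here's a toy sketch of that computation in Python; the link graph (and the page names) are invented purely for illustration, not taken from any real site.

```python
# Toy hubs-and-authorities (HITS) iteration on a made-up internal link graph.
# Each key is a page; its value is the list of pages it links out to.
links = {
    "home":         ["braces"],
    "braces":       ["braces-cost", "braces-risks"],
    "braces-cost":  ["braces"],
    "braces-risks": ["braces"],
}
pages = list(links)

hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):
    # A page's authority is the sum of the hub scores of pages linking to it.
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # A page's hub score is the sum of the authority scores it links out to.
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    # Normalise both vectors so the scores converge instead of blowing up.
    a_norm = sum(v * v for v in auth.values()) ** 0.5
    h_norm = sum(v * v for v in hub.values()) ** 0.5
    auth = {p: v / a_norm for p, v in auth.items()}
    hub = {p: v / h_norm for p, v in hub.items()}

print(max(auth, key=auth.get))  # → 'braces', the page everything links into
```

The point of the toy: the page that the rest of the structure links into accumulates the authority, which is why a sensible internal linking architecture still matters even if 'siloing' as a keyword tactic doesn't.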
So to answer the question: you still need a solid site structure, but I'd say 'siloing' is likely to dilute the potential power of each page. You'll end up with 30 pages all about sub-subtopics that should be rethought and consolidated, using Google as your research tool. Always use Google as your research tool. To do anything else is like training for a sprint race by going swimming every day.
'Siloing' for me also created a ton of duplicate content and duplicate headlines, and I even think I got stung by the Maccabees update for having a cluster of pages about all the different aspects of implant dentistry. They are now all consolidated into one 'super page', and it's ranking #1 locally and really well nationally too. Page one.
Imagine four pages whose H1s are 'braces cost', 'braces procedure', 'braces on finance' and 'braces risks'. Google is going to struggle, in my view, to rank me for any of those because they all have an H1 containing the word 'braces'. What would be better is a single 'braces' page where all those sub-subtopics are H2s, followed by an FAQ where the Google Suggest terms are H2s and the 'People also ask' questions fill out the FAQ.
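As a sketch, the consolidated page I'm describing might be outlined like this (the headings and questions here are illustrative examples, not a template):

```html
<!-- One consolidated 'braces' page instead of four thin ones -->
<h1>Braces</h1>

<h2>How much do braces cost?</h2>
<h2>The braces procedure</h2>
<h2>Braces on finance</h2>
<h2>What are the risks of braces?</h2>

<!-- FAQ: questions drawn from Google Suggest and 'People also ask' -->
<h2>Do braces hurt?</h2>
<h2>How long do braces take to work?</h2>
```

One H1 carries the topic, and every former sub-page becomes a section competing with nobody but itself.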
Hope this helps. This is my interpretation from running my small local business here in the UK, so other users here may have more relevant information; for example, IA, cannibalisation, internal conflict and so on are much bigger issues in shopping and information businesses than in services businesses.
And of course this classic: https://moz.com/blog/optimizing-for-rankbrain-whiteboard-friday thanks to @miriam ellis for that one.