Is Siloing still effective in 2018?
-
I've been advised to use siloing (site structure), but I'm now getting conflicting advice saying it's an outdated practice. What's the 2018 verdict?
-
So by siloing content you mean creating categories and subcategories, like in a library: reference, fiction, magazines etc., each with its own subsections. The old way to 'rank' for something was to pick out keywords for the smaller sections and subsections.
To use the example of dentistry: I might have adult dentistry and paediatric dentistry pages, then under adults I'll have implants, braces and veneers. Then under veneers I might have an article about veneer prices, the veneer procedure and veneer risks. All of these would link to each other and link up through the architecture, and hey presto, I'd eventually rank for one of the 'top categories' like adult dentistry.
The problem with this is that it creates internal competition and conflict. Google doesn't want users having to hop around highly granular subtopics for answers; it would rather have the answer to a query all in the same place. So instead I'll now have one single page with everything people need to know about veneers: price, risks, procedure etc., all in one place.
Now there are further difficulties, because Google will sometimes consider two related things as different 'topics', answering different questions. So I do have a page for everything about veneers and also a separate page about veneer cost. In the case of veneers, everyone wants to know the cost; it's all cost, cost, cost, so cost is its own topic with its own page. But for something like root canals, nobody cares how much they cost; they just want to get out of pain. So the root canal cost section sits on the main root canal page, because it's included in the topic of 'root canals'.
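To make that judgement concrete, here's a minimal sketch of the decision rule: give a subtopic its own page only when standalone searcher demand for it is high. The volume numbers and threshold are made up for illustration, not real keyword data.

```python
# Hypothetical demand data: how often people search the subtopic on its
# own (e.g. "veneers cost"). These figures are invented for the example.
DEMAND = {
    ("veneers", "cost"): 5400,       # huge standalone interest
    ("root canal", "cost"): 90,      # barely searched on its own
}

def own_page(topic, subtopic, threshold=1000):
    """True if the subtopic deserves its own page rather than a section."""
    return DEMAND.get((topic, subtopic), 0) >= threshold

print(own_page("veneers", "cost"))     # True  -> separate veneers-cost page
print(own_page("root canal", "cost"))  # False -> section on the root canal page
```

In practice the "demand" signal is whatever your keyword research shows; the point is simply that the split/merge decision follows the searcher, not the site architecture.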
It's now more about searcher intent (https://moz.com/blog/how-google-gives-us-insight-into-searcher-intent-through-the-results-whiteboard-friday), possibly 'searcher task accomplishment', and also how link equity flows: https://moz.com/blog/harnessing-link-equity
Also read this: https://www.cs.cornell.edu/home/kleinber/pcm.pdf. It's tough going, but just ignore what you don't understand and press on with reading it all; you'll learn a great amount about how Google functions.
So to answer the question: you still need a solid site structure, but I'd say 'siloing' is likely to dilute the potential power of each page. You'll end up with 30 pages all about sub-sub-topics that should be rethought and consolidated, using Google as your research tool. Always use Google as your research tool; to do anything else is like training for a sprint race by going swimming every day.
'Siloing' for me also created a ton of duplicate content and duplicate headlines, and I even think I got stung by the Maccabees update for having pages about all the different aspects of implant dentistry. They are now all consolidated into a 'super page', and it's ranking #1 locally and really well nationally too: page one.
Imagine four pages whose H1s are 'braces cost', 'braces procedure', 'braces on finance' and 'braces risks'. Google is going to struggle, in my view, to rank me for any of those, because they all have an H1 containing the word 'braces'. What would be better is a single 'braces' page where the H2s cover all those sub-subtopics, plus an FAQ using the Google Suggest phrases and the 'People also ask' questions as H2s.
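As a quick sketch of that consolidation, here's an illustrative (hypothetical) helper that folds the thin siloed pages into one page outline: the head term becomes the H1 and the old page H1s become H2 sections.

```python
# The siloed structure: one thin page per sub-subtopic, every H1
# repeating the word "braces" and competing with its siblings.
SILOED = {
    "/braces-cost/": "Braces cost",
    "/braces-procedure/": "Braces procedure",
    "/braces-on-finance/": "Braces on finance",
    "/braces-risks/": "Braces risks",
}

def consolidate(pages, head_term):
    """Fold thin pages into one page: head term as H1, old H1s as H2s."""
    h2s = sorted(pages.values())
    return {"url": f"/{head_term}/", "h1": head_term.title(), "h2s": h2s}

page = consolidate(SILOED, "braces")
print(page["h1"])        # Braces
print(len(page["h2s"]))  # 4 H2 sections instead of 4 competing pages
```

One H1 targeting the head term, with the subtopics demoted to H2s, is exactly the 'super page' shape described above.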
Hope this helps. This is my interpretation from my small local business here in the UK, so other users here may have more relevant information. For example, information architecture, cannibalisation, internal conflict etc. are much bigger issues for shopping and information businesses than for services businesses.
And of course this classic: https://moz.com/blog/optimizing-for-rankbrain-whiteboard-friday thanks to @miriam ellis for that one.
Related Questions
-
Home Page Disappears From Google - But Rest of Site Still Ranked
As the title suggests, we are running into a serious issue: the home page disappearing from Google search results while the rest of the site still remains. We search for it naturally and cannot find a trace, then use a "site:" command in Google and the home page still does not come up. We go into Webmaster Tools and inspect the home page, and Google even states that the page is indexable. We then run "Request Indexing" and the site comes back on Google. This is having a damaging effect and we would like to understand why it is happening. Please note this is not happening on just one of our sites; it has happened to three, which are all located on the same server. One of our brands with the issue is: www.henweekends.co.uk
Intermediate & Advanced SEO | JH_OffLimits
-
SEO effect of URL with subfolder versus parameters?
I'll make this quick and simple. Let's say you have a business located in several cities, and you've built individual pages for each city (linked to from a master list of your locations). For SEO purposes, is it better for the URL to be a subfolder or a parameter off the home page URL: https://www.mysite.com/dallas (which is essentially https://www.mysite.com/dallas/index.php) or http://www.mysite.com/?city=dallas (which is essentially https://www.mysite.com/index.php?city=dallas)?
Intermediate & Advanced SEO | Searchout
-
Silo not ranking for main silo page - what can I do?
Hi everyone, I set up a silo for my page http://werkzeug-kasten.com/. Unfortunately, only the silo's inner pages rank very well. For example, http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/keyword-analyse/ ranks for "Keywordanalyse SEO Freiburg" and http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/onpage-seo/ for "Onpage SEO Freiburg", but the silo's main page http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/ does not rank for "SEO Freiburg". Do you have any idea why that might be? Cheers, Marc
Intermediate & Advanced SEO | RWW
-
Recovered from Manual Penalty but rankings still suck
Hi All, We got a penalty last March 2014 (site-wide unnatural links), which we recovered from quickly; this changed to a partial match penalty (impacting links), which we recovered from back in December 2014. Our site's link profile has been cleaned up, but our rankings still suck for some of our main keywords (500+). Our traffic and local rankings are also still poor in some cases. From an SEO point of view our site is pretty good; we've done everything Google has recommended, including schema.org markup, mobile responsiveness, and unique content (which we write regularly), and we only have a few duplicate pages. Our domain authority is better than our competitors', but our rankings and traffic are still nowhere near as good as theirs. Does anyone know if recovering from an impacting-links penalty takes longer than 4 months? I know Google says it discounts those links, but I get the feeling Google may be looking at an old dataset, since Panda & Penguin haven't re-run since our penalty was removed, and this may be what's affecting things. Does anyone have any ideas? I am more than happy to post my URL if someone fancies taking a quick look to see if anything is obvious. Thanks, Pete
Intermediate & Advanced SEO | PeteC12
-
Moz metrics are better than top10 competitors but still no progress
Hi Moz friends, Once in a while I encounter a challenge I can't figure out, so I thought maybe you could help. I would like to enter the top 10 in Google.nl for a specific keyword, and Moz's OSE is telling me all my metrics are better than most of my competitors'. My on-page grade is also top level (A), but I'm missing something... somewhere. The awkward thing is that the competition level is very easy. Hope you guys can help, Cheers, Mark
Intermediate & Advanced SEO | newtraffic
-
All Thin Content removed and duplicate content replaced. But still no success?
Good morning, Over the last three months I have gone about replacing and removing all the duplicate content (1000+ pages) from our site top4office.co.uk. It has now been just under 2 months since we made all the changes, and we are still not showing any improvements in the SERPs. Can anyone tell me why we aren't making any progress, or spot something we are not doing correctly? Another problem: although we have removed 3000+ pages using the removal tool, searching site:top4office.co.uk still shows 2800 pages indexed (before there were 3500). Look forward to your responses!
Intermediate & Advanced SEO | apogeecorp
-
How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
We have about 30,000 pages that are variations of "&lt;product-type&gt; prices/&lt;type-of-thing&gt;/&lt;city&gt;-&lt;state&gt;". These pages are bringing us lots of free conversions, because when somebody searches for this exact phrase for their city/state, they are pretty low-funnel. The problem we are running into is that the pages are showing up as duplicate content. One solution we discussed is to 301-redirect or canonical all the city-state pages back to just the "&lt;type-of-thing&gt;" level, and then create really solid unique content for the few hundred pages we would have at that point. My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms they nearly always include the city-state, so the search is some variation of "&lt;product-type&gt; &lt;type-of-thing&gt; &lt;city&gt; &lt;state&gt;". One thing we thought about doing is dynamically changing the metadata and headers to add the city-state info there. Are there other potential solutions to this?
Intermediate & Advanced SEO | editabletext
-
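The consolidation option described in the question can be sketched as a mapping from each city-state URL back to its type-level page, which could then feed a rel=canonical tag or a 301 redirect map. The URL pattern and product names below are hypothetical stand-ins for the real 30,000 pages.

```python
def canonical_for(url):
    """Collapse /prices/<type-of-thing>/<city>-<state>/ to /prices/<type-of-thing>/."""
    parts = [p for p in url.split("/") if p]
    # Assumed pattern: prices / type-of-thing / city-state
    return "/" + "/".join(parts[:2]) + "/"

# Hypothetical city-state variation URLs.
urls = [
    "/prices/metal-roof/dallas-tx/",
    "/prices/metal-roof/austin-tx/",
    "/prices/shingle-roof/dallas-tx/",
]
redirect_map = {u: canonical_for(u) for u in urls}
print(redirect_map["/prices/metal-roof/dallas-tx/"])  # /prices/metal-roof/
```

The same mapping works whether you emit 301s or canonical tags; the trade-off is that a 301 removes the city-state page from the index entirely, while a canonical at least keeps the URL crawlable while consolidating signals.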
Old pages still crawled by SE returning 404s. Better to put 301 or block with robots.txt ?
Hello guys, A client of ours has thousands of pages returning 404, visible in Google Webmaster Tools. These are all old pages which don't exist anymore, but Google keeps detecting them. They belong to sections of the site which no longer exist, are not linked externally, and didn't provide much value even when they existed. What do you suggest we do: (a) do nothing, (b) redirect all these URLs/folders to the homepage through a 301, or (c) block these pages through robots.txt? Are we inappropriately using part of the crawl budget set by search engines by not doing anything? Thanks
Intermediate & Advanced SEO | H-FARM
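One common way to triage these options can be sketched as a simple rule (an assumed policy, not an official Google guideline): 301 only where a genuinely equivalent replacement page exists, and otherwise let the URL keep returning 404/410 so it drops out of the index, rather than mass-redirecting everything to the homepage (which tends to be treated as a soft 404).

```python
def action_for(url, replacement=None):
    """Decide how to handle an old, dead URL.

    replacement: the equivalent live page, if one exists (else None).
    """
    if replacement:
        return f"301 -> {replacement}"
    # No equivalent page: a clean 404/410 lets search engines drop the
    # URL naturally. Blocking via robots.txt would only hide the status
    # code and keep the URL lingering in the index.
    return "serve 404/410, leave crawlable"

print(action_for("/old-section/page-1/"))        # serve 404/410, leave crawlable
print(action_for("/old-shoes/", "/shoes/"))      # 301 -> /shoes/
```

Under this rule, option (a) is usually fine for valueless pages, (b) is reserved for URLs with a true equivalent, and (c) is avoided because robots.txt prevents Google from ever seeing the 404.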