Silo vs breadcrumbs in 2015
-
Hi, I've heard silos mentioned in the past as a way to help with rankings. Does this still apply?
And what about breadcrumbs? Do I use them with the silo technique or instead of it? Which do you think is better, or should I not be using these anymore after the recent Google updates?
-
Great, thanks. I'll give that a go.
-
It's been a while since I've used WP, but if you use posts (or posts and pages), you will have a major silo and duplicate content problem with blog category pages.
The way to solve this is to go to the section where you set up your post categories and set the slug to be identical to your category page's slug. For example, if you have a page with the slug "blue-widgets", set the post category slug to "blue-widgets" as well. This makes the category page the parent for posts in that category.
There are also some adjustments you will need to make to your URLs, removing "/category/" from them. I've done it, and it's pretty easy. Maybe another poster could give you the specifics.
-
Great, thanks, very informative reply. I've started using WordPress for most of my sites now. Is siloing easy enough to do in WordPress?
-
Silos will always work. It's not some trick - it's how Google works. Here's a very simplified explanation as to why...
Let's say that I have an eCommerce site, and I sell lawnmowers and plywood. Let's also say that the Lawnmowers category page has a theoretical 100 points of link juice, and that the site sells two lawnmowers - the Fubar 2000 and the Toecutter 300. If the Lawnmowers category page only links to the Fubar 2000 and Toecutter 300 pages, it will push 45 points of link juice to each page (pages can pass on roughly 90% of their link juice, and 90/2 = 45).
Both pages will receive almost the full 45-point benefit because they are relevant to the category page.
If the Lawnmowers category page instead had only one link, pointing to the Plywood page, it would push all 90 points of link juice to the Plywood page. But the Plywood page would not receive the full benefit of those 90 points, because lawnmowers and plywood don't share much relevance. In this case, Google would heavily discount the 90 points, so the Plywood page might only get the benefit of 30 points. Think of it as a leaky hose.
What happens to the other 60 points of link juice? They get dumped on the floor, and the site loses their ranking power.
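To make the arithmetic concrete, here is a minimal Python sketch of this toy model. All the numbers in it (the 90% pass-through and the relevance discounts) are the illustrative values from the example above, not anything Google has published:

```python
def juice_received(source_points, outbound_links, relevance, damping=0.9):
    """Toy model of the example above: juice a linked page actually benefits from.

    source_points  -- link juice sitting on the linking page
    outbound_links -- number of links the page splits its juice across
    relevance      -- 1.0 for a topically relevant link; lower values model
                      Google discounting an off-topic link (made-up figure)
    damping        -- pages pass on roughly 90% of their juice
    """
    passed = source_points * damping / outbound_links
    received = passed * relevance
    spilled = passed - received  # the juice "dumped on the floor"
    return received, spilled

# Lawnmowers category (100 pts) linking to its two relevant mower pages:
print(juice_received(100, 2, relevance=1.0))  # -> (45.0, 0.0) per page

# The same category linking only to the irrelevant Plywood page:
print(juice_received(100, 1, relevance=1 / 3))  # -> roughly (30, 60)
```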
Keep in mind that this is all theoretical, and that link juice comes in different flavors like apple, orange, and prune, representing the different ranking factors (Trust, Authority, Topical Authority, Social Signals, etc.). Orange might discount 90% while prune might only discount 10%. So is there really a 67% link juice hit in this case? Damned if I know, but I had to pick a number... What I do know is that link juice loss between pages that aren't relevant is dramatic, and that it is very possible to determine how your internal pages rank based on your internal link structure and on link placement on the page.
By siloing a website, I have seen rankings jump dramatically. Most websites hemorrhage link juice; think of siloing as link juice reclamation. The tighter you build your silos, the less link juice gets dumped on the floor, and by reclaiming the spilled juice and putting it in the right places, you can dramatically increase your rankings. BTW, inbound links work in a similar fashion: if the Lawnmowers page were on an external site and linked to the Plywood page, the same discounts would apply. That's why it pays to get niche-relevant backlinks for maximum benefit.
This in no way accounts for usability, and linking between silos can make sense to benefit end users. Again, this model is probably overly simplified and doesn't take Block Level Analysis into account, but the logic is sound. You can build spreadsheet models of link juice distribution factoring in block-level weighting, discounts, etc. It's by no means accurate, but it can give you a pretty good idea of where your link juice is going. You can base such a model on the old (and increasingly irrelevant) PageRank algorithm. PageRank is logarithmic: it takes 8-9x as much link juice to move up each level, so if it takes 100 points of link juice to become a PR1, it takes 800-900 points to become a PR2. Generally speaking, a PR2 page, via links, can create roughly 7 to 75 PR1 pages, depending on how close the PR2 is to becoming a PR3 (the sketch below runs those numbers).
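As a rough illustration of that last claim, here is a hedged Python sketch of the logarithmic model. The 100-point base, the 8-9x step per PR level, and the 90% pass-through are the illustrative figures from this post, not published Google values:

```python
def points_for_pr(pr, base=100.0, step=9.0):
    """Toy model: link juice needed to reach a toolbar PR on a logarithmic scale."""
    return base * step ** (pr - 1)

def pr1_pages_supported(points, damping=0.9, pr1_cost=100.0):
    """How many PR1 pages a page holding `points` of juice could create via links."""
    return int(points * damping / pr1_cost)

# A page that has only just reached PR2 (~800 points):
print(pr1_pages_supported(800))  # -> 7

# A PR2 page sitting just under the PR3 threshold (~100 * 9^2 = 8100 points):
print(pr1_pages_supported(points_for_pr(3)))  # -> 72, i.e. the top of the "7 to 75" range
```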
-
Using both is the way to go. Silos are essentially a way of structuring your pages so that, per topic, there is one master article and multiple supporting articles that link back to it. Pages within the topic link only to pages relevant to that topic, not to other sections of the site.
You can use breadcrumbs in conjunction with a silo, since the silo structure suits them: the breadcrumb trail simply mirrors the silo hierarchy.
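For instance, here is a quick Python sketch (the topic and article names are hypothetical) showing how a breadcrumb trail can be derived directly from a silo hierarchy, which is why the two techniques fit together:

```python
# Hypothetical silo: one master (category) page per topic,
# each with supporting articles that link back to it.
silo = {
    "lawnmowers": ["fubar-2000-review", "toecutter-300-review"],
    "plywood": ["plywood-grades", "choosing-plywood-thickness"],
}

def breadcrumb(topic, article=None):
    """Build a breadcrumb trail that mirrors the silo hierarchy."""
    trail = ["Home", topic]
    if article is not None:
        trail.append(article)
    return " > ".join(trail)

print(breadcrumb("lawnmowers"))                       # Home > lawnmowers
print(breadcrumb("lawnmowers", "fubar-2000-review"))  # Home > lawnmowers > fubar-2000-review
```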