Silo vs breadcrumbs in 2015
-
Hi, I've heard silos mentioned in the past as a way to help with rankings. Does this still apply?
And what about breadcrumbs? Do I use them with the silo technique, or instead of it? Which do you think is better, or should I not be using either anymore after the recent Google updates?
-
Great, thanks. I'll give that a go.
-
It's been a while since I've used WP, but if you use posts (or posts and pages), you will have a major silo and duplicate content problem with blog category pages.
The way to solve this is to go to the section where you set up your post categories and set the slug to be identical to your category page's slug. For example, if you have a page with the slug "blue-widgets", set the post category slug to "blue-widgets" as well. This makes the category page the parent for posts in that category.
There are also some adjustments you will need to make to your URLs, removing "/category/" from them. I've done it, and it's pretty easy. Maybe another poster could give you the specifics.
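To make the parent/child idea concrete, here is a toy URL builder (nothing WordPress-specific; the site and slugs are invented for illustration) showing how matching slugs let posts nest directly under the category page instead of under a separate /category/ archive:

```python
def category_url(site, category_slug):
    """URL of the category (parent) page."""
    return f"{site}/{category_slug}"

def post_url(site, category_slug, post_slug):
    """With the post category slug matching the page slug, post URLs
    nest under the category page, so there is no duplicate
    /category/ archive competing with it."""
    return f"{category_url(site, category_slug)}/{post_slug}"

print(post_url("https://example.com", "blue-widgets", "widget-review"))
# https://example.com/blue-widgets/widget-review
```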
-
Great, thanks, a very informative reply. I've started using WordPress for most of my sites now. Is siloing easy enough to do in WordPress?
-
Silos will always work. It's not some trick - it's how Google works. Here's a very simplified explanation as to why...
Let's say that I have an eCommerce site, and I sell lawnmowers and plywood. Let's also say that the Lawnmowers category page has a theoretical 100 points of link juice, and that the site sells two lawnmowers: the Fubar 2000 and the Toecutter 300. If the Lawnmowers category page only links to the Fubar 2000 and the Toecutter 300 pages, the category page will push 45 points of link juice to each page (pages can pass on roughly 90% of their link juice, and 90/2=45).
Both pages will receive almost the full 45 point benefit because the pages are relevant to the category page.
If the Lawnmower category page instead only has 1 link to the Plywood page, the Lawnmower category page would push 90 points of link juice to the plywood page. But, the Plywood page would not receive the full benefit of the 90 points, because Lawnmowers and Plywood don't share much relevance. In this case, Google would heavily discount the 90 points, so that the Plywood page might only get the benefit of 30 points. Think of it as a leaky hose.
What happens to the other 60 Points of Link Juice? It gets dumped on the floor, and the site loses the ranking power of those 60 points.
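The arithmetic above can be sketched in a few lines. Note that the 90% pass-through rate and the relevance discount are the hypothetical figures from this example, not real Google values:

```python
# Toy model of link juice distribution with a relevance discount.
PASSABLE = 0.90  # assume a page passes on roughly 90% of its juice

def juice_passed(page_points, num_links, relevance):
    """Return (per_link, received, lost) for each linked page.

    relevance is a 0..1 factor; links to irrelevant pages are
    heavily discounted, and the difference is simply lost
    ("dumped on the floor").
    """
    per_link = (page_points * PASSABLE) / num_links
    received = per_link * relevance
    lost = per_link - received
    return per_link, received, lost

# Lawnmowers category (100 pts) linking to its 2 relevant mower pages:
print(juice_passed(100, 2, relevance=1.0))     # 45 points each, nothing lost

# Same category linking only to the irrelevant Plywood page,
# where the discount leaves only 30 of the 90 points:
print(juice_passed(100, 1, relevance=30 / 90))  # 90 sent, 30 received, 60 lost
```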
Keep in mind that this is all theoretical, and that link juice comes in different flavors like apple, orange and prune, representing the different ranking factors (Trust, Authority, Topical Authority, Social Signals, etc.). Orange might discount 90% while prune might only discount 10%. In this case, is there really a 67% link juice hit? Damned if I know, but I had to pick a number. This is all theoretical. I do know that link juice loss between pages that aren't relevant is dramatic. I also know that it is very possible to determine how your internal pages rank based on your internal link structure and link placement on the page.
By siloing a website, I have seen rankings jump dramatically. Most websites hemorrhage link juice. Think of it as Link Juice Reclamation. The tighter you can build your silos, the less link juice gets dumped on the floor. By reclaiming the spilled link juice and putting it in the right places, you can dramatically increase your rankings. BTW, inbound links work in a similar fashion. If the Lawnmower page was an external site and linked to the Plywood page, the same discounts would apply. That's why it pays to get niche relevant backlinks for maximum benefit.
This in no way accounts for usability, and linking between silos can make sense to benefit end users. Again, this model is probably overly simplified and doesn't take into account Block Level Analysis, but the logic is sound. You can build spreadsheet models for link juice distribution, factoring in block level, discounts, etc. It's by no means accurate, but it can give you a pretty good idea of where your link juice is going. You can model this on the old (and increasingly irrelevant) PageRank algorithm. PageRank is logarithmic, and it takes 8-9x as much link juice to move up in PR. If it takes 100 points of link juice to become a PR1, it takes 800-900 points to become a PR2. Generally speaking, a PR2 page, via links, can create roughly 7 to 75 PR1 pages, depending on how close the PR2 is to becoming a PR3.
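The logarithmic-PR idea above can be modeled the same way. Again, the 100-point PR1 cost and the 8x multiplier are the poster's illustrative figures (the low end of the stated 8-9x range), not Google's actual scale:

```python
# Toy model of logarithmic toolbar PageRank.
PR1_COST = 100  # hypothetical juice needed to reach PR1
GROWTH = 8      # each PR level costs ~8x the previous (low end of 8-9x)
PASSABLE = 0.90 # a page passes on roughly 90% of its juice

def juice_needed(pr):
    """Juice required to reach a given PR level under this toy model."""
    return PR1_COST * GROWTH ** (pr - 1)

def pr1_pages_creatable(page_juice):
    """How many PR1 pages a page could create by linking out."""
    return int(page_juice * PASSABLE // juice_needed(1))

print(juice_needed(2))           # 800 points to become a PR2
print(pr1_pages_creatable(800))  # a fresh PR2 can create 7 PR1 pages
```

A PR2 sitting just under the PR3 threshold holds several times more juice, which is where the "roughly 7 to 75" spread in the post comes from.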
-
Both is the way to go. A silo is essentially structuring your pages so that, per topic, there is one master article and multiple supporting articles that link back to the master article. The topic only links to pages relevant to that topic, not to other sections of the site.
You can use breadcrumbs in conjunction with a silo as the structure is suitable for them.
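As a sketch of why the two fit together: if the silo is a tree (master topics with their supporting pages), the breadcrumb trail for any page is just the path from the homepage down to it. Page names here are invented for illustration:

```python
# A silo represented as a tree: each page maps to its child pages.
silo = {
    "home": ["lawnmowers", "plywood"],              # top-level silos
    "lawnmowers": ["fubar-2000", "toecutter-300"],  # supporting pages
    "plywood": [],
}

def breadcrumb(page, tree, trail=("home",)):
    """Depth-first search for the page; the path found is its breadcrumb."""
    if trail[-1] == page:
        return list(trail)
    for child in tree.get(trail[-1], []):
        found = breadcrumb(page, tree, trail + (child,))
        if found:
            return found
    return None

print(breadcrumb("fubar-2000", silo))  # ['home', 'lawnmowers', 'fubar-2000']
```

Because every breadcrumb link points up within the same silo, the markup reinforces the structure instead of leaking links across topics.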