Silo architecture and PR dilution! What's real?
Hi all,
Today I came across this "silo" concept, where you build second-level hierarchy pages, and then lower-level pages beneath those, to rank well for terms related to your main keyword(s). But I wonder: is the so-called silo structure real? Google may consider it a trick if we create multiple pages (doorway pages) targeting the same keyword. Yet one of my competitors has a great many second-level pages built against this silo structure, and you would expect even their homepage's ranking to be diluted by spreading authority across so many pages. In reality, though, their pages rank well for the keywords they chose by creating multiple landing pages. These two observations contradict each other. How does it work in practice?
Thanks
Related Questions
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), the bots cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So there is no solution to all three of my problems.

Now I have come up with a custom-coded solution: dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. Which portion, and for which bots, can be calculated dynamically at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? Yes, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
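For illustration, here is a minimal sketch of the kind of dynamic 503 rule described above, assuming a Python/Flask front end; the framework, load thresholds, and bot list are all assumptions for the sketch, not the poster's actual custom CMS code:

```python
import os
import random

from flask import Flask, Response, request

app = Flask(__name__)

# Bots we are willing to throttle; human traffic is never throttled.
# (Illustrative list -- match whatever crawlers show up in your logs.)
THROTTLED_BOTS = ("bingbot", "ahrefsbot", "googlebot")

# 1-minute load average where we start shedding bot requests, and the
# level at which we reject essentially all of them (assumed values).
LOAD_SOFT_LIMIT = 4.0
LOAD_HARD_LIMIT = 8.0


def bot_rejection_probability() -> float:
    """Map the current machine-wide load to a 0..1 chance of rejecting a bot."""
    load1, _, _ = os.getloadavg()  # Unix only
    if load1 <= LOAD_SOFT_LIMIT:
        return 0.0
    if load1 >= LOAD_HARD_LIMIT:
        return 1.0
    return (load1 - LOAD_SOFT_LIMIT) / (LOAD_HARD_LIMIT - LOAD_SOFT_LIMIT)


@app.before_request
def shed_bot_traffic():
    ua = (request.user_agent.string or "").lower()
    if not any(bot in ua for bot in THROTTLED_BOTS):
        return None  # user traffic: always served
    if random.random() < bot_rejection_probability():
        # A 503 with Retry-After tells well-behaved crawlers to back off
        # and come back later; the remaining requests get normal 200s.
        return Response("Temporarily throttled", status=503,
                        headers={"Retry-After": "300"})
    return None


@app.route("/")
def index():
    return "Normal content"
```

Because the check reads the machine-wide load average and matches every listed bot at once, it covers points 2) and 3) above while leaving user traffic untouched. One caveat: Google treats a 503 as temporary, but its documentation warns that URLs returning 503 for extended periods can eventually be dropped from the index, so the throttle should only bite during genuine load spikes.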
White Hat / Black Hat SEO | internetwerkNU
Can I get updated opinions on PR Web?
I saw Moz has discussed PR Web in earlier posts, but they are mostly months to years old. I'm wondering whether PR Web is a good service. A lot of my competitors use it, but it seems like just a paid link to me. If, for whatever reason, PR Web is an approved loophole, does anyone have suggestions on which plan to purchase? Thanks, Ruben
White Hat / Black Hat SEO | KempRugeLawGroup
HELP - Site architecture of E-Commerce Mega Menu - Linkjuice flow
Hi everyone, I hope you have a couple of minutes to give me your opinion. Our e-commerce site has around 2,000 products, in English and Spanish, and only around 70 hits per day, if that. We have done a lot of optimisation on the site: page titles, URLs, content, H1s, etc. Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts.

Once someone arrives on the site they are language-detected and 302-redirected to either domain.com/EN or domain.com/ES, depending on their preferred language. Then on the homepage we have the big MEGA MENU:

CAT 1
  SubCat 1
    SubsubCat 1
    SubsubCat 2
    SubsubCat 3

Overall, there are 145 "categories", plus links to some CMS pages, like Home, Delivery Terms, etc. Each main category contains the products of everything related to that category, so for example:

KITCHENWARE
  COOKWARE: SAUCEPANS, FRYING PANS
  BAKINGWARE: BOWLS

Kitchenware contains ALL the products of the subcategories below it (cookware items, saucepans, frying pans, bakingware, etc.), plus links to those categories through breadcrumbs and a left-hand nav, in addition to the mega menu above. So once the bots hit the site, this is the structure they immediately have to deal with. Here is what the stats look like:

Domain Authority: 18

www.domain.com/EN/ - PA: 27, mR: 3.99, mT: 4.90
www.domain.com/EN/CAT 1 - PA: 15, mR: 3.05, mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1 - PA: 15, mR: 3.05, mT: 4.54

Product pages themselves have a PA of 1 and no mR or mT.

I really need some other opinions here. I am thinking of:

- Removing links in the nav menu so it only contains CAT1 and SUBCAT1, deleting the SUBSUBCATs, which represent around 80 links.
- Removing products from the CAT1 page: the CAT1 page would "tile" graphical links to its subcategories but not display the products themselves, so products are only reachable at the lowest level of the (shortened) chain.

But I am willing to hear any other ideas please; maybe another alternative is to start building links to boost DA and link juice? Thanks all, Ben
White Hat / Black Hat SEO | bjs2010
Will aggregating external content hurt my domain's SERP performance?
Hi, We operate a website that helps parents find babysitters. As a small add-on, we currently run a small blog on the topics of childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm the domain's general SERP performance? And if so, how can this effect be avoided? It isn't important to us that these "featured" articles rank in SERPs, so we could potentially noindex them or point rel=canonical at the original author. Any thoughts, anyone? Thx! Daan
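For reference, both options mentioned above (noindex and rel=canonical to the original author) can be delivered as HTTP headers as well as in-page markup; Google supports both forms. A minimal sketch of the header approach, assuming a Flask-served blog; the route, the slug-to-source lookup, and the rendering helper are hypothetical stand-ins:

```python
from flask import Flask, Response

app = Flask(__name__)

# Hypothetical lookup mapping our re-blogged article slugs to the
# original author's URL; a real CMS would store this with the post.
ORIGINAL_SOURCES = {
    "choosing-a-babysitter": "https://example-parenting-blog.com/choosing-a-babysitter",
}


def render_featured_article(slug: str) -> str:
    # Stand-in for real template rendering.
    return f"<html><body><h1>{slug}</h1></body></html>"


@app.route("/blog/featured/<slug>")
def featured_article(slug: str):
    resp = Response(render_featured_article(slug))
    original = ORIGINAL_SOURCES.get(slug)
    if original:
        # Option 1: point the canonical at the original author, so Google
        # consolidates ranking signals there instead of flagging duplication.
        resp.headers["Link"] = f'<{original}>; rel="canonical"'
    else:
        # Option 2: keep the re-blogged copy out of the index entirely.
        resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```

The equivalent in-page markup would be <link rel="canonical" href="..."> or <meta name="robots" content="noindex"> in each featured article's head; since these pages don't need to rank, either route avoids the duplicate-content risk.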
White Hat / Black Hat SEO | daan.loening
I'm worried my client is asking me to post duplicate content, am I just being paranoid?
Hi SEOMozzers, I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries. My client believes Google might like us a bit more if we had more text content, so my client has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for media). My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web, and I'm worried Google could penalize us for posting content that already exists. I know that conventionally there are ways around this (you can tell crawlers that the content shouldn't be indexed), but in my case we are specifically trying to produce crawlable content. Do you think I should advise my client to hire some bloggers to produce original content, or am I just being paranoid? Thanks everyone. This is my first post to the Moz community 🙂
White Hat / Black Hat SEO | steve_benjamins
Very high PR blogs
Hi there, I was wondering how a simple blog with 1-2 pages, and not very old (<1 year), can get a PR of 7 or 8. Take a look: Mit4.info, PR 6, 6 months old... http://odubai.info/, PR 7; esenderlink.info, PR 8!!! webstreamingsmania.com/, PR 9! Very strange! Can someone explain how they got this PR (Google PageRank), and whether they would pass anything along if we posted links there? Thanks in advance!
White Hat / Black Hat SEO | willyg
How to run SEO tests you don't want to be associated with
A client has a competitor who is ranking above them for a highly competitive term they shouldn't really be able to rank for. I think I know how the site got there, and I think I can replicate it myself with a quick test, but it's definitely grey hat, if not black hat, to do so. I do not want my own sites and company to be damaged by the test, but I'd like to let the client know for sure, and I'd also love to know myself. The test should take about a week to run; there is no hacking involved, no password stealing, nothing damaging to anyone else. How would you run such a test? I'm dubious about using my own server/site for it, but would a week really matter? Tom
White Hat / Black Hat SEO | lethal0r
Massive rank drop for 'unnatural links'. Help!
Hi everyone, I work for a company called Danbro - www.danbro.co.uk. Recently a massive penalty led to a huge drop across all keywords in Google, including the brand name. Since then we have conducted a massive clean-up (requesting competitors remove duplicate content, removing some poor-quality links, etc.), but we still have not seen any improvement whatsoever, nor has Google responded. Has anyone ever received a positive response from Google? Since we sent a reconsideration request, our rankings have actually got worse!! Any advice would be great.
White Hat / Black Hat SEO | Townpages