Best way to address duplicate news sections within site
-
A client has a news section at www.clientsite.com/news and also at subdomain.clientsite.com/news. The stories within each section are identical:
www.clientsite.com/news/story-11-5-2011
subdomain.clientsite.com/news/story-11-5-2011
What's the best way to avoid a duplicate content issue within the site? A 301 redirect doesn't seem appropriate from the user experience point of view.
Is applying a rel=canonical tag pointing to www.clientsite.com/news/story-a-b-c on each story within the subdomain news section the best option? They have hundreds of stories; is there an easier way?
Also, the news pages list the story headline and the first 3 lines of copy. Do these summaries present duplicate content issues with the full story page?
Thank you!
-
Alan, I appreciate your effort here. These are the sources I already shared.
A complete summary of everything shared in those articles you quote:
1. It doesn't make a difference to Google which method is used. All the information and analysis indicates Google will index the content either way; how well that content ranks in Google is a different topic. There are reasons to keep content separate, such as when covering topics unrelated to the main site, in which case a subdomain would be best.
2. Matt uses the directory approach, and he recommends that others do the same.
AT BEST, that information shows it is close to even, with a slight preference towards subfolders.
Rand offers outstanding analysis as to why subfolders are the superior choice. His analysis is from 2009, two years after the original articles quoted from Matt: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
The bottom line: it's up to you how much you care about your site and its performance. Personally, I am a fighter. I also micro-manage website architecture because in many respects it is a one-time, set-it-and-forget-it type of thing. Whether to use subdomains vs. subfolders, whether to use underscores vs. dashes in URLs, etc. are decisions you make once, and then they are automated forever.
A detailed list of reasons supporting the subfolder approach has been offered. The DA, time, costs, etc. all support subfolders. If you wish to ignore all those strong, positive benefits and go with a subdomain then that is your choice.
Good luck.
-
The originals
http://googlewebmastercentral.blogspot.com/2008_01_01_archive.html
http://www.mattcutts.com/blog/subdomains-and-subdirectories/
Here is a better example from Matt:
Deb (December 11, 2007 at 1:01 am):
"Matt, thanks for your reply, just a query (if you don't mind): if I add content in mattcutts.com/blog, it affects SEO because I add content directly in the domain mattcutts.com, but if I add content in blog.mattcutts.com, is the effect the same? I don't think so, because this is a subdomain not directly related with the domain? If I disturb you, please don't mind. Thanks, Deb"
Matt Cutts (December 10, 2007 at 10:55 am):
"Deb, it really is a pretty personal choice. For something small like a blog, it probably won't matter terribly much. I used a subdirectory because it's easier to manage everything in one file storage space for me. However, if you think that someday you might want to use a hosted blog service to power your blog, then you might want to go with blog.example.com just because you could set up a CNAME or DNS alias so that blog.example.com pointed to your hosted blog service."
I was trying to find the video Matt made where he makes a similar claim, but I have to get back to work.
-
Alan,
We will have to agree to disagree on this one.
There is a ton of what can only be referred to as "SEO bullshit" published. When I quote a source it will usually be Matt Cutts directly, or Google, or a highly respected SEO who shares an opinion on a topic AND who offers very solid research to back up that opinion. In short, credibility is everything when quoting a source to support a given position.
You are quoting a site I have never heard of, alexander.holbreich.org. Is it just me? Do others know and recognize this site as a reputable source of SEO information?
The author's About page is a total of 4 lines of text. Line 1 is his name; lines 3 and 4 are where he lives. Line 2 says he has a degree in "Business Information" but doesn't even state where or when he received it. This web page is a solid example of a page with absolutely zero trust on SEO.
I think it is great that you read various sources of SEO for ideas, but that is very different from depending on those sources as credible information.
If you want to quote, try the main source article. Doing so would lend higher credibility to your position. I can agree there is a lot of confusion on this topic, but it is propagated mostly by pages like the one you linked, which should probably never be read.
Using the source you quoted and some common ground I would share the following:
-
Matt Cutts stated he uses folders: "My personal preference on subdomains vs. subdirectories is that I usually prefer the convenience of subdirectories for most of my content. A subdomain can be useful to separate out content that is completely different."
-
Matt Cutts recommended that others use folders: "If you’re a newer webmaster or SEO, I’d recommend using subdirectories until you start to feel pretty confident with the architecture of your site."
-
Matt shared a specific example of when a subdomain would be appropriate, and it is an example I had shared as well in response to the original question: "A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example."
The above aside, one site is easier to maintain than two. There are lower costs all around (software, trust badges, SSL, etc.), and less time involved as well. All that time and money can be put into other aspects of SEO, such as link building and creating great content.
Further, by combining your content into one site, all your content benefits from the higher DA of your site.
I hope you take the information I am sharing the right way, Alan. My professional experience leads me to almost always use a folder unless there is a clear and specific reason to use a subdomain, such as separating out content which is not related to the main site. The difference is strong enough that I would recommend most clients who have a subdomain remove it and move to the subfolder structure.
If you find a differing opinion, I would love to hear it. All I ask is for it to be from a highly credible SEO source who preferably shares detailed examples or logic to support the position.
Best Regards,
-
-
"With respect to the general subfolder vs domain discussion, as far as I have seen most of the "debate" ended with subfolders being the winner."
For what reasons is it the winner? I use subdomains a lot; that's why I have looked for evidence, and Matt Cutts has stated it makes no difference.
Rand states it is his personal belief, but Google and Matt Cutts have stated many times it makes no difference to rankings.
http://alexander.holbreich.org/2008/01/subdomains-vs-subdirectories/
"otherwise irrelevant change during this discussion only serves to confuse an otherwise muddy topic"
I don't think it's confusion; it is information clearly stated (not to do with rankings) for one to consider. It is an indication of Google's thinking. It is stated correctly, and all information should be considered. One could say that stating Rand's personal belief is confusing.
-
I take a different view on this topic than Alan.
As Alan mentioned, the recent Google change's sole effect is how links to subdomains from the root domain visually appear in Google WMT. It has absolutely no ranking weight difference. Bringing up that otherwise irrelevant change during this discussion only serves to confuse an otherwise muddy topic.
With respect to the general subfolder vs. subdomain discussion, as far as I have seen, most of the "debate" ended with subfolders as the winner.
There are a couple situations where a subdomain would be preferable to a folder. One example is when a different, unrelated topic or product is being offered. Keith, you brought up the example of Google Maps. A few comments I would share:
-
Google Maps is a different product than Google search. Really, the main thing they have in common is that they are offered by the same company. The idea of providing satellite images and driving directions is quite different from providing the best search results. These two products happen to be offered by the same company, but if you think about it, they are really very distinct products. It would be the same idea if Ford created their own version of Sirius radio: yes, the radios would be offered in Ford cars, but the product is truly distinct from the cars and can stand completely alone.
-
Google's site was set up years ago before this topic was analyzed to this depth. Many changes have been made over the years.
A couple great discussions on this topic:
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
A quote Rand shared in a different article: "99.9% of the time, if a subfolder will work, it's the best choice for all parties." I agree; there are some corner cases, but for the overwhelming majority of cases, a subfolder is the preferred approach.
-
-
Subdomains vs. folders is an old debating point, but Matt Cutts has said it makes no difference.
I have also noticed that Google includes subdomain links in its sitelinks, and Google WMT now shows subdomain links as internal. (I know this is separate from ranking, but together with the other evidence it gives weight to what Matt Cutts stated.)
-
Good catch on the subdomains! That is a separate issue, and I am recommending they move everything to a clientsite.com/folder setup. The sub-domains do have unique content (except for the news) and they set it up that way because they've seen other sites, like Google, set up sub-domains for maps and their other products.
What's a good explanation to the client for why other large sites like Google set up different content sections as subdomains vs. the folder approach I am recommending?
-
the news pages list the story headline and the first 3 lines of copy. Do these summaries present duplicate content issues with the full story page?
No
With respect to the subdomain, what is the purpose of having it? The best course of action is likely to merge any unique content from the subdomain into the main site, then remove the subdomain. Your articles would benefit from the (presumably) stronger DA of the main site, and your effort would be reduced by allowing you to fully focus on one site rather than maintaining two.
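If you do remove the subdomain, the usual way to do it cleanly is to 301-redirect every old subdomain URL to the matching path on the main site. A minimal sketch of that mapping (the hostnames are the example ones from the question, and the function name is mine; in practice this rule would live in your server or CMS redirect config):

```python
from urllib.parse import urlparse

def redirect_for(url):
    """301-map any URL on the retired subdomain to the same path on the
    main site; everything else passes through untouched."""
    parts = urlparse(url)
    if parts.netloc == "subdomain.clientsite.com":
        return 301, "http://www.clientsite.com" + parts.path
    return 200, url

print(redirect_for("http://subdomain.clientsite.com/news/story-11-5-2011"))
# -> (301, 'http://www.clientsite.com/news/story-11-5-2011')
```

Because the paths are identical on both hosts, one pattern-based rule covers all of the hundreds of stories at once; no per-story work is needed.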
How does this subdomain benefit anyone?
If you insisted on keeping the subdomain, then yes, the canonical link tag would work.
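And since the asker has hundreds of stories, the canonical tag should be generated from the request path rather than hand-edited into each page. A rough sketch of the idea (hostnames from the question; the helper name is mine, and any templating layer could do the same thing):

```python
from urllib.parse import urlparse

def canonical_tag(url, canonical_host="www.clientsite.com"):
    """Build the canonical <link> element for a story page, pointing it at
    the main-site copy regardless of which host actually served it."""
    path = urlparse(url).path
    return '<link rel="canonical" href="http://%s%s" />' % (canonical_host, path)

print(canonical_tag("http://subdomain.clientsite.com/news/story-11-5-2011"))
# -> <link rel="canonical" href="http://www.clientsite.com/news/story-11-5-2011" />
```

Dropping that one line into the shared page template covers every story, current and future, in one change.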
-
Canonical would be best here, but you would want to do it with code, or use rewrite outbound rules on the server.
I would not worry about the summary problem.