Duplicate Content based on www.www
-
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www and not just www.
I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing it, and we have no hostname records set up that are www.www.
Does anyone know of any other things that may cause this and how we can go about remedying it?
-
Just a thanks again! I modded ISAPI Rewrite and resolved the problem thanks to your tip.
FYI in case anyone else needs it, here is the mod I did:
# Redirecting www.www.domainname.com to www.domainname.com; also catches the .net and .org variants
# (in ISAPI_Rewrite, $1 is the TLD captured by the condition, $2 the path captured by the rule)
RewriteCond Host: www\.www\.domainname\.(com|net|org)
RewriteRule (.*) http://www.domainname.com$2 [R]
-
Thanks Keri - yes I have ISAPI Rewrite installed.
I will go that route and give it a shot.
-
I haven't done this in a long time, but I used ISAPI Rewrite, and did what is talked about in this thread at http://www.webmasterworld.com/microsoft_asp_net/3951672.htm. An alternate method is discussed at http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/#microsoft. Hope this helps you get started!
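Before touching the live config, it can help to sanity-check the intended mapping outside of IIS. The sketch below mirrors what a host-based doubled-www rewrite is meant to do; `domainname.com` and the paths are placeholders, not values from this site:

```python
import re

# Placeholder pattern for the doubled-www hostnames (dots escaped, TLD variants grouped).
HOST_PATTERN = re.compile(r"^www\.www\.domainname\.(com|net|org)$", re.IGNORECASE)

def redirect_target(host, path):
    """Return the redirect target for a doubled-www host, or None if no rewrite applies."""
    if HOST_PATTERN.match(host) is None:
        return None
    # Collapse every doubled-www variant onto the canonical www host, keeping the path.
    return "http://www.domainname.com" + path

print(redirect_target("www.www.domainname.com", "/page.asp"))
# -> http://www.domainname.com/page.asp
print(redirect_target("www.domainname.com", "/page.asp"))
# -> None
```

Once the mapping looks right, the same pattern translates into the ISAPI Rewrite condition and rule.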
-
Hi Keri - it is IIS6
-
Are you running IIS6 or IIS7? Things are a little different for each of them. If you let us know what version of IIS you're running, we'll look up the proper configuration for you.
-
Hi Sanket,
The person asking the question is on IIS, so an .htaccess file won't help.
-
Thanks Sanket, your reply makes total sense, and I would apply it if I could figure out how the extra www (www.www) is even happening. Wouldn't I have to know that first in order to implement the redirect?
In IIS I don't see where the extra www.www is set up (in which directory).
If I could find it, it seems I could just create a .htaccess file with the code below:
Options +FollowSymlinks
RewriteEngine on
rewritecond %{HTTP_HOST} ^www\.www\.domain\.com [nc]
rewriterule ^(.*)$ http://www.domain.com/$1 [r=301,nc]
Would that code be right?
I'm still digging into where the extra forward to www.www is happening.
-
Hi
If your site opens both with and without www, Google considers the two versions different sites and flags the duplicate content. You can address this by setting rel="canonical", but a 301 redirect is the better option because it transfers your backlinks, PageRank, and link juice, so it is the safe way to redirect a URL. Uploading an .htaccess file to your server via FTP will solve this problem.
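For readers wanting the concrete form of the rel="canonical" suggestion: the tag sits in the head of each duplicate page and names the preferred URL (a sketch; domain.com and the path are placeholders). Note that on this particular site, the 301 alternative would need to be configured in ISAPI Rewrite or IIS rather than via .htaccess.

```html
<!-- In the <head> of the duplicate http://www.www.domain.com/page.html -->
<link rel="canonical" href="http://www.domain.com/page.html" />
```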
Related Questions
-
Tricky Duplicate Content Issue
Hi Moz community, I'm hoping you guys can help me with this. Recently our site switched our landing pages to include a 180-item and a 60-item version of each category page. They are creating duplicate content problems, with the two examples below showing up as duplicates of the original page:
http://www.uncommongoods.com/fun/wine-dine/beer-gifts?view=all&n=180&p=1
http://www.uncommongoods.com/fun/wine-dine/beer-gifts?view=all&n=60&p=1
The original page is http://www.uncommongoods.com/fun/wine-dine/beer-gifts
I was just going to add a rel=canonical from these two 180-item and 60-item pages to the original landing page, but then I remembered that some of these landing pages have page 1, page 2, page 3, etc. I told our tech department to use rel=next and rel=prev for those pages. Is there anything else I need to be aware of when I apply the canonical tag for the two duplicate versions if they also have page 2 and page 3 with rel=next and rel=prev? Thanks
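To make the combination concrete, here is a sketch using the URLs from the question; the p= pagination parameter on the original listing is an assumption about how page 2 and page 3 are exposed:

```html
<!-- On the 180-item and 60-item variants: canonical to the original page -->
<link rel="canonical" href="http://www.uncommongoods.com/fun/wine-dine/beer-gifts" />

<!-- On page 2 of the original listing: point at pages 1 and 3 -->
<link rel="prev" href="http://www.uncommongoods.com/fun/wine-dine/beer-gifts" />
<link rel="next" href="http://www.uncommongoods.com/fun/wine-dine/beer-gifts?p=3" />
```

The main thing to avoid is canonicalizing page 2 and beyond of a rel=prev/next series back to page 1, which would hide the deeper pages' products from crawlers.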
Technical SEO | znotes
-
Categories VS Tag Duplicate Content
Hello Moz community, I have a question about categories and tags. Our customer www.elshow.pe just had a redesign of its website. We kept the same categories as before; the only change was that two sub-categories were added (these sub-categories were popular tags before). So now I have two URLs covering the same content.
The first is the URL of the sub-category: www.elshow.pe/realitys/combate/
The second is the URL generated by the tag "combate": www.elshow.pe/noticias/combate/
I have the same with the second sub-category, "Esto es guerra":
www.elshow.pe/realitys/esto-es-guerra/
www.elshow.pe/noticias/esto-es-guerra/
The problem is that when I search the keyword "combate" in my country (Perú), the URL that ranks, on page 1, is the tag URL. But when I search for "esto es guerra", the URL that ranks, on page 2, is the sub-category. I also checked both links in OSE, and the sub-categories do better than the tags. So what do you guys recommend for this? A 301 redirect? Canonicals? Any comment is welcome. Thanks a lot for your time. Italo,
Technical SEO | neoconsulting @italominano
-
Despite canonical duplicate content in WMT
Hi, 2 weeks ago we made big changes to titles and meta descriptions, to fix the missing titles and descriptions, and also set the right canonical. Now I see that in WMT, despite the canonical, it shows duplicates in meta descriptions and titles. I've set up the canonical like this:
1. url: www.domainname.com/category/listing-family/productname
2. url: www.domainname.com/category/listing-family/productname-more-info
The canonical on both pages is like this:
I'm aware of creating duplicate titles and descriptions, caused by the CMS we use and also by the wrong structure of categories/products (we'll solve that next year); that's why I wanted the canonical. But now it's not getting any better. Did I do something wrong with the canonical?
Technical SEO | Leonie-Kramer
-
Duplicate content. Wordpress and Website
Hi All, Will Google punish me for having duplicate blog posts on my website's blog and on WordPress? Thanks
Technical SEO | Mike.NW
-
Duplicate Content due to CMS
The biggest offender of our website's duplicate content is an event calendar generated by our CMS. It creates a page for every day of every year, up to the year 2100. I am considering some solutions:
1. Include code that stops search engines from indexing any of the calendar pages.
2. Keep the calendar but re-route any search engines to a more popular workshops page that contains better info. (The workshop page isn't duplicate content with the calendar page.)
Are these solutions possible? If so, how do the above affect SEO? Are there other solutions I should consider?
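For option 1, the usual mechanism is a robots meta tag emitted by the head template of every generated calendar page (a sketch; how you inject it depends on the CMS):

```html
<!-- In the <head> template used by the calendar day pages -->
<meta name="robots" content="noindex, follow" />
```

noindex keeps the day pages out of the index while follow still lets crawlers pass through any links on them; unlike a robots.txt block, it also removes pages that are already indexed once they are recrawled.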
Technical SEO | ycheung
-
Duplicate Content Issue
SEOmoz is giving me a number of duplicate content warnings related to pages that have an "email a friend" and/or "email me when back in stock" version of a page. I thought I had those blocked via my robots.txt file, which contains the following...
Disallow: /EmailaFriend.asp
Disallow: /Email_Me_When_Back_In_Stock.asp
I had thought that the robots.txt file would solve this issue. Anyone have any ideas?
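A quick way to see what those Disallow lines actually cover is the standard-library robots.txt parser (a sketch; it assumes the lines sit under a `User-agent: *` group, which they need in order to be valid, and example.com stands in for the real host). Two gotchas worth checking: robots.txt paths are case-sensitive, and blocking crawling does not remove URLs that were already indexed before the block.

```python
from urllib import robotparser

# Reproduce the quoted robots.txt, with an assumed User-agent group.
rules = [
    "User-agent: *",
    "Disallow: /EmailaFriend.asp",
    "Disallow: /Email_Me_When_Back_In_Stock.asp",
]
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://example.com/EmailaFriend.asp"))   # False: blocked
print(rp.can_fetch("*", "http://example.com/emailafriend.asp"))   # True: case differs, not blocked
```

If the site links to a lowercase (or otherwise differently-cased) variant of those URLs, crawlers can still reach them despite the Disallow lines.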
Technical SEO | WaterSkis.com
-
Magento and Duplicate content
I have been working with Magento over the last few weeks and I am becoming increasingly frustrated with the way it is set up. If you go to a product page and remove the sub-folders one by one, you can reach the same product page through several URLs, causing duplicate content. All Magento sites seem to have this weakness. To use this site as an example, because I know it is built on Magento: http://www.gio-goi.com/men/clothing/tees/throve-t-short.html?cid=756 As you remove the tees, then the clothing and men sub-folders, you can still reach the product page. My first question is how big an issue is this, and second, does anyone have any ideas of how to solve it? Also, I was wondering how Google treats question marks in URLs. Should you try to avoid them unless you are filtering? Thanks
Technical SEO | gregster1000
-
Duplicate content across multiple domains
I have come across a situation where we have discovered duplicate content across multiple domains. We have access to each domain, and within the past 2 weeks we added a 301 redirect to dynamically redirect each page to the proper page on the desired domain. My question relates to the removal of these pages. There are thousands of these duplicate pages. I have gone back and looked at a number of the cached pages in Google and found that they are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will the 301 redirect even be read by Google so that requests get redirected to the proper domain and page? If so, when will that happen? Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait 3 months for the content to be removed, since my assumption is that this content is competing with the main site. I suppose another option would be to include a no-cache meta tag for these pages. Any thoughts or comments would be appreciated.
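The dynamic part of such a redirect, mapping any page on a duplicate domain to the same path on the desired domain, can be sketched like this (all hostnames below are placeholders, not the actual sites):

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.maindomain.com"                       # placeholder: the desired domain
DUPLICATE_HOSTS = {"www.otherdomain.com", "olddomain.com"}  # placeholders: the duplicate domains

def redirect_location(url):
    """Return the 301 Location for a duplicate-domain URL, preserving path and query."""
    parts = urlsplit(url)
    if parts.netloc not in DUPLICATE_HOSTS:
        return None
    return urlunsplit((parts.scheme, CANONICAL_HOST, parts.path, parts.query, parts.fragment))

print(redirect_location("http://olddomain.com/products/widget?id=7"))
# -> http://www.maindomain.com/products/widget?id=7
```

As long as each duplicate URL answers with a 301 to its counterpart like this, crawlers that revisit the old pages will pick up the redirect; the stale cached copies linger only until recrawl.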
Technical SEO | jmsobe