Duplicate Content based on www.www
-
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www rather than just www.
I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing it, and we have no hostname records set up for www.www.
Does anyone know of any other things that may cause this and how we can go about remedying it?
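(For anyone checking the same thing: one quick test, with domain.com standing in for the real domain, is to see whether the extra hostname even resolves in DNS. If it does, a wildcard DNS record such as *.domain.com combined with an IIS binding that accepts any host header would explain how www.www ends up indexed.)
nslookup www.www.domain.com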
-
Just a thanks again! I modded ISAPI Rewrite and resolved the problem thanks to your tip.
FYI in case anyone else needs it, here is the mod I did:
# Redirecting www.www.domain.com to www.domain.com; also redirects the .net and .org versions
# ([RP] issues a permanent 301 redirect)
RewriteCond Host: www\.www\.domainname\.(com|net|org)
RewriteRule (.*) http\://www.domainname.$1$2 [I,RP]
-
Thanks Keri - yes I have ISAPI Rewrite installed.
I will go that route and give it a shot.
-
I haven't done this in a long time, but I used ISAPI Rewrite, and did what is talked about in this thread at http://www.webmasterworld.com/microsoft_asp_net/3951672.htm. An alternate method is discussed at http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/#microsoft. Hope this helps you get started!
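From memory, the gist of what those threads describe for ISAPI_Rewrite 2 on IIS6 is a host-header rule in httpd.ini roughly like this (just a sketch, with domain.com as a placeholder, so double-check it against the current docs):
[ISAPI_Rewrite]
# Force the www hostname: permanently redirect domain.com requests to www.domain.com
RewriteCond Host: ^domain\.com
RewriteRule (.*) http\://www.domain.com$1 [I,RP]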
-
Hi Keri - it is IIS6
-
Are you running IIS6 or IIS7? Things are a little different for each of them. If you let us know what version of IIS you're running, we'll look up the proper configuration for you.
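For reference, on IIS7 this is usually handled with the URL Rewrite module, where a web.config rule along these rough lines (domain.com is a placeholder, and this is untested here) would strip the extra www; IIS6 needs a third-party filter such as ISAPI Rewrite instead:
<system.webServer>
  <rewrite>
    <rules>
      <!-- Permanently redirect www.www.* host headers to www.* -->
      <rule name="Remove extra www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^www\.www\.(.+)$" />
        </conditions>
        <action type="Redirect" url="http://www.{C:1}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>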
-
Hi Sanket,
The person asking the question is on IIS, so an .htaccess file won't help.
-
Thanks Sanket, your reply makes total sense and I would apply it if I could figure out how the extra www (www.www) is even happening. Wouldn't I have to know that first in order to implement the redirect?
In IIS I don't see where the extra www.www is set up (in what directory).
If I could find it, it seems I could just create an .htaccess file with the code below:
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.www\.domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,NC]
Would that be right?
I'm still digging into where the extra forward to www.www is happening.
-
Hi,
If your site can be opened both with and without www, Google can treat the two versions as different sites and see duplicate content. You can address this by setting rel="canonical", but a 301 redirect is the best option because it passes your backlinks, PageRank, and link juice, so it is the safest way to redirect the URL. Uploading an .htaccess file to your server via FTP will help you solve this problem.
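For example (using domain.com purely as a placeholder, and only as a rough sketch): the canonical tag goes in the head of each duplicate page,
<link rel="canonical" href="http://www.domain.com/" />
and an Apache .htaccess rule that 301s the non-www version to the www version would look something like this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]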
-
Related Questions
-
Duplicate Content from WordPress Template
Hi, wondering if anyone can help. My site has been flagged for duplicate content on almost every page. I think this is because the person who set up the site created a lot of template pages that use the same code but have slightly different features. How would I go about resolving this? Would I need to recode every template page they created?
Technical SEO | Alix_SEO
-
Duplicate Content Question
I have a client that operates a local service-based business. They are thinking of expanding that business to another geographic area (a drive several hours away in an affluent summer vacation area). The name of the existing business contains the name of the city, so it would not be well-suited to market 'City X' business in 'City Y'. My initial thought was to (for the most part) 'duplicate' the existing site onto a new site (brand new root domain). Much of the content would be the exact same. We could re-word some things so there aren't entire lengthy paragraphs of identical info, but it seems pointless to completely reinvent the wheel. We'll get as creative as possible, but certain things just wouldn't change. This seems like the most pragmatic thing to do given their goals, but I'm worried about duplicate content. It doesn't feel as though this is spammy though, so I'm not sure if there's cause for concern.
Technical SEO | stevefidelity
-
Duplicate Content
Hello guys, after fixing the rel tag on similar pages on the site, I thought the duplicate content issues were resolved. I checked HTML Improvements in GWT and, instead of going down as I expected, the count went up. The duplicate issues affect identical product pages that differ from each other by just one detail, say length or colour. I could write different meta tags, since the duplicate element is the meta description, and I did that for some products, but it still had no effect and they are still showing as duplicates. What could the problem be? Cheers
Technical SEO | PremioOscar
-
How to solve Parameter Issue causing Duplicate Content
Hi everyone, my site's home page comes up in the SERPs with the following URL: www.sitename/?referer=indiagrid. My question is: should I disallow it using robots.txt, or 301 redirect it to the home page? The other issue is that I have a few dynamically generated URLs for a form, such as http://www.www.sitename/career-form.php?position=SEO Executive. I am using the parameter "position" in URL Parameters in GWT, but those pages are still being indexed, which is leading to duplicate page content. Please help me out.
Technical SEO | himanshu301989
-
Duplicate Content Question (E-Commerce Site)
Hi All, I have a page that ranks well for the keyword “refurbished Xbox 360”. The ranking page is an eCommerce product details page for a particular XBOX 360 system that we do not currently have in stock (currently, we do not remove a product details page from the website, even if it sells out – as we bring similar items into inventory, e.g. more XBOX 360s, new additional pages are created for them). Long story short, given this way of doing things, we have now accumulated 79 “refurbished XBOX 360” product details pages across the website that currently, or at some point in time, reflected some version of a refurbished XBOX 360 in our inventory. From an SEO standpoint, it’s clear that we have a serious duplicate content problem with all of these nearly identical XBOX 360 product pages. Management is beginning to question why our latest, in-stock, XBOX 360 product pages aren't ranking and why this stale, out-of-stock, XBOX 360 product page still is. We are in obvious need of a better process for retiring old, irrelevant (product) content and eliminating duplicate content, but the question remains, how exactly is Google choosing to rank this one versus the others since they are primarily duplicate pages? Has Google simply determined this one to be the original? What would be the best practice approach to solving a problem like this from an SEO standpoint – 301 redirect all out of stock pages to in stock pages, remove the irrelevant page? Any thoughts or recommendations would be greatly appreciated. Justin
Technical SEO | JustinGeeks
-
Need help with Joomla duplicate content issues
One of my campaigns is for a Joomla site (http://genesisstudios.com), and when my full crawl was done and I reviewed the report, I had significant duplicate content issues. They seem to come from the automatic creation of /rss pages. For example, http://www.genesisstudios.com/loose is the page, but the duplicate content shows up as http://www.genesisstudios.com/loose/rss. It appears that Joomla creates feeds for every page automatically, and I'm not sure how to address the problem they create. I have been chasing down duplicate content issues for some time and thought they were gone, but now I have about 40 more instances of this type. It also appears that even though a canonicalization plugin is present and enabled, the crawl report shows 'false' for rel=canonical tags. Anyone got any ideas? Thanks so much... Scott
Technical SEO | sdennison
-
Canonical usage and duplicate content
Hi, we have a lot of pages about areas like "Mallorca" (domain.com/Spain/Mallorca), with tabbed pages like "excursions" (domain.com/spain/Mallorca/excursions) and "car rental" (domain.com/Spain/Mallorca/car-rental), etc. The text on, for example, the "car rental" page is very similar for Mallorca and Rhodos, and SEOmoz marks these as duplicate content. This happens on "car rental", "map", "weather", etc., which don't have a lot of text, just images and embedded Google Maps. Could I use rel=next/prev/canonical to gather the information from the tabbed pages? That could show Google that the Rhodos map page is related to Rhodos and not Mallorca. Is that all wrong, and/or is there a better way to do this? Thanks, Alsvik
Technical SEO | alsvik
-
SEO with duplicate content for 3 geographies
The client would like us to do SEO for these three sites: http://www.cablecalc.com/, http://www.solutionselectrical.com.au and http://www.calculatecablesizes.co.uk/. The sites have to be targeted at the US, Australia, and the UK respectively. All of the above sites have identical content. Will Google penalise the sites? Should we change the content completely? How do we approach this issue?
Technical SEO | seoug_2005