Duplicate Content based on www.www
-
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www instead of just www.
I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing it, and we have no host header records set up for www.www.
Does anyone know of any other things that may cause this and how we can go about remedying it?
-
Just a thanks again! I modded ISAPI Rewrite and resolved the problem thanks to your tip.
FYI in case anyone else needs it, here is the mod I did:
#Redirect www.www.domainname.com to www.domainname.com (also redirects the .net and .org variants)
RewriteCond Host: www\.www\.domainname\.(com|net|org)
RewriteRule (.*) http://www.domainname.$1$2 [R]
-
Thanks Keri - yes I have ISAPI Rewrite installed.
I will go that route and give it a shot.
-
I haven't done this in a long time, but I used ISAPI Rewrite, and did what is talked about in this thread at http://www.webmasterworld.com/microsoft_asp_net/3951672.htm. An alternate method is discussed at http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/#microsoft. Hope this helps you get started!
-
Hi Keri - it is IIS6
-
Are you running IIS6 or IIS7? Things are a little different for each of them. If you let us know what version of IIS you're running, we'll look up the proper configuration for you.
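For anyone landing on this thread who is on IIS7, the same canonicalization can be done with Microsoft's URL Rewrite module in web.config. A minimal sketch, with placeholder rule and domain names:

```xml
<!-- Hypothetical sketch for IIS7 with the URL Rewrite module installed;
     the rule name and domain are placeholders. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect www.www to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^www\.www\.domain\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.domain.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The `redirectType="Permanent"` attribute makes this a 301, which preserves the link equity of the www.www URLs.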
-
Hi Sanket,
The person asking the question is on IIS, so an .htaccess file won't help.
-
Thanks Sanket, your reply makes total sense and I would apply it if I could figure out how the extra www (www.www) is even happening. Wouldn't I have to know that first in order to implement the redirect?
In IIS I don't see where the extra www.www is set up (in what directory).
If I could find it, it seems I could just create a .htaccess file with the code below:
Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.www\.domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,NC]
Would that be right?
I'm still digging on where the extra forward to www.www is happening.
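For anyone tracing the same issue, the canonicalization those rewrite rules perform can be sketched as plain logic - a hypothetical helper with placeholder hostnames, not tied to any particular server:

```python
# Hypothetical sketch of the host canonicalization a www.www redirect
# performs: collapse any stacked leading "www." labels down to one.
def canonical_host(host: str) -> str:
    host = host.lower()
    # Strip one extra "www." at a time until only a single one remains.
    while host.startswith("www.www."):
        host = host[len("www."):]
    return host

print(canonical_host("www.www.domain.com"))  # -> www.domain.com
```

The rewrite rule then issues a 301 from the original host to this canonical one.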
-
Hi
If your site opens both with and without www, Google will consider them different sites and see duplicate content. This can be solved by setting rel="canonical", but a 301 redirect is the best option because it will transfer your backlinks, PageRank, and link juice, so it is the safe way to redirect a URL. Uploading an .htaccess file to your server via FTP will help you solve this problem.
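For reference, the rel="canonical" approach mentioned above is a single tag in each page's head section - a minimal sketch with a placeholder URL:

```html
<!-- Served on both the www.www and www versions of the page,
     pointing search engines at the preferred www URL (placeholder domain). -->
<link rel="canonical" href="http://www.domain.com/page.html" />
```

Unlike a 301, this leaves both URLs reachable; it only hints at which version should be indexed.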
Related Questions
-
ViewState and Duplicate Content
Our site keeps getting duplicate content flagged as an issue; however, the pages being grouped together have very little in common on-page. One area which does seem to recur across them is the ViewState. There's a minimum of 150 lines of it across the ones we've investigated. Could this be causing the reports?
Technical SEO | | RobLev0 -
Duplicate Content from Wordpress Template
Hi, wondering if anyone can help: my site has been flagged for duplicate content on almost every page. I think this is because the person who set up the site created a lot of template pages which use the same code but have slightly different features. How would I go about resolving this? Would I need to recode every template page they have created?
Technical SEO | | Alix_SEO0 -
Duplicate content issue with Wordpress tags?
Would Google really discount duplicate content created by Wordpress tags? I find it hard to believe, considering tags are on and indexed by default and the vast majority of users would not know to deindex them...
Technical SEO | | BlueLinkERP0 -
Duplicate content + wordpress tags
According to the SEOMoz platform, one of my Wordpress websites has duplicate content because of the tags I use. How should I fix it? Is it advisable to remove tag links from the post pages?
Technical SEO | | giankar0 -
Tags causing Duplicate page content?
I was looking through the 'Duplicate Page Content' and 'Too Many On-Page Links' errors, and they all seem to be linked to the 'Tags' on my blog pages. Is this really a problem, and if so, how should I be using tags properly to get the best SEO rewards?
Technical SEO | | zapprabbit1 -
Noindex duplicate content penalty?
We know that Google now penalizes a whole site if it finds content it doesn't like or duplicate content, but has anyone experienced a penalty from having duplicate content on their site which they have added noindex to? Would Google still apply the penalty to the overall quality of the site even though it has been told to basically ignore the duplicate bit? Reason for asking is that I am looking to add a forum to one of my websites, and no one likes a new forum. I have a script which can populate it with thousands of questions and answers pulled directly from Yahoo Answers. Obviously the forum will be 100% duplicate content, but I do not want it to rank for anything anyway, so if I noindex the forum pages hopefully it will not damage the rest of the site. In time, as the forum grows, all the duplicate posts will be deleted, but it's hard to get people to use an empty forum, so I need to 'trick' them into thinking the section is very busy.
Technical SEO | | Grumpy_Carl0 -
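A minimal sketch of the noindex approach described in that question - a robots meta tag served on each forum page (hypothetical snippet; placement and directive choice are assumptions):

```html
<!-- In the <head> of each forum page: keep the page out of the index
     while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```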
Query string in url - duplicate content?
Hi everyone, I would appreciate some advice on the following. I have a page which has some nice content on it, but it also has search functionality. When a search is run, a query string is appended, so I will get something like mypage.php?id=20. With many different potential URLs, will each query string be seen as a different page? If so, I don't want duplicate content, so am I best putting canonical tags in the head tags on mypage.php to avoid Google seeing potential duplicate content? Many thanks for all your advice.
Technical SEO | | pauledwards0 -
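The canonical-tag approach asked about above boils down to emitting one canonical URL regardless of the query string. A hypothetical sketch of that mapping (the URL and page name are placeholders):

```python
# Hypothetical sketch: derive the canonical URL for a page whose search
# feature appends query strings (e.g. mypage.php?id=20), by dropping the
# query and fragment so every variant maps to one canonical address.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    # Keep scheme, host, and path; discard query and fragment.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonical_url("http://www.domain.com/mypage.php?id=20"))
# -> http://www.domain.com/mypage.php
```

The resulting URL is what would go in the page's rel="canonical" link tag.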
Large Scale Ecommerce. How To Deal With Duplicate Content
Hi, one of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacked-on additions, which has left duplicate pages and weird dynamic parameters indexed. The inbound link structure is spread over duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters which need to be blocked; I'm just not entirely sure of the best method.
Dynamic Parameters
I can see the option in GWT to block these - is it that simple? (Do I need to ensure they are deindexed and 301'd?)
Duplicate Pages
Would the best approach be to mass 301 these pages, then apply a noindex tag and wait for them to be crawled?
Thanks for your help.
Technical SEO | | LukeyJamo0