Duplicate Content based on www.www
-
In trying to knock down the most common errors on our site, we've noticed we have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www instead of just www.
I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing it, and we have no hostname records set up for www.www.
Does anyone know of anything else that might cause this, and how we can go about remedying it?
-
Just a thanks again! I modified ISAPI Rewrite and resolved the problem thanks to your tip.
FYI, in case anyone else needs it, here is the mod I made:
# Redirect www.www.domainname.com to www.domainname.com (also catches the .net and .org variants)
RewriteCond Host: www\.www\.domainname\.(com|net|org)
RewriteRule (.*) http://www.domainname.com$2 [R]
-
Thanks, Keri - yes, I have ISAPI Rewrite installed.
I will go that route and give it a shot.
-
I haven't done this in a long time, but I used ISAPI Rewrite and did what's described in this thread: http://www.webmasterworld.com/microsoft_asp_net/3951672.htm. An alternate method is discussed at http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/#microsoft. Hope this helps you get started!
-
Hi Keri - it is IIS6
-
Are you running IIS6 or IIS7? Things are a little different for each of them. If you let us know what version of IIS you're running, we'll look up the proper configuration for you.
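For what it's worth, on IIS7 the usual route is the URL Rewrite module with a rule in web.config; below is a rough sketch (the rule name and domain.com are placeholders, so treat it as an untested starting point rather than the exact config for your site). On IIS6 you would typically use a third-party filter such as ISAPI Rewrite instead.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Send the stray www.www host to the canonical www host with a 301 -->
        <rule name="Redirect www.www to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^www\.www\.domain\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.domain.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>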
-
Hi Sanket,
The person asking the question is on IIS, so an .htaccess file won't help.
-
Thanks Sanket, your reply makes total sense and I would apply it if I could figure out how the extra www ( www.www ) is even happening. Wouldn't I have to know that first in order to implement the redirect?
In IIS, I don't see where the extra www.www is set up (or in what directory).
If I could find it, it seems I could just create a .htaccess file with the code below:
Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.www\.domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
Would that code be right?
I'm still digging into where the extra forward to www.www is happening.
-
Hi,
If your site opens both with and without www, Google considers them two different sites and sees duplicate content. This can be solved by setting rel="canonical". A 301 redirect is the best option because it transfers your backlinks, PageRank, and link juice, so it is the safest way to redirect a URL. Uploading a .htaccess file to your server via FTP will help you solve this problem.
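To illustrate the two options Sanket mentions (the URL below is just a placeholder for whichever address you want indexed), the canonical approach puts a tag like this in the <head> of each duplicate page:
<link rel="canonical" href="http://www.domain.com/your-page/" />
The 301 approach is the rewrite-rule route discussed above; on IIS that means ISAPI Rewrite or the URL Rewrite module rather than a .htaccess file.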