Root domain not resolving to www. Duplicate content?
-
Hi,
I'm working with a domain that stays on the root domain if the www is not included.
But if the www is included, it stays with the www.
Like this:
example.com
or
www.example.com
Of course, they are identical and both go to the same IP.
Do search engines consider that to be duplicate content?
thanks,
michael -
Sorry, I forgot to remove the .au
-
Hey thanks Gyorgy. I'm going to try your fix.
I'm wondering about the .com.au/$1 segment.
Should that .au remain there? Is that Australia?
Much obliged,
michael -
Hey thanks everyone. Good info all around.
My client is hosting the site with a datarooms host, so I don't have access to the server settings. Tech support there seems clueless and "doesn't see a problem." I tried the DNS settings to no avail.
Is the only way to resolve this to get the dataroom host to modify the .htaccess file?
thanks,
michael -
Sometimes the webhosting company has a feature for this. I have this as an option in dreamhost.com when I am setting up hosting for a specific domain.
Leave it alone: Both http://www.compatibletonercartridge.net/ and http://compatibletonercartridge.net/ will work.
Add WWW: Make http://compatibletonercartridge.net/ redirect to http://www.compatibletonercartridge.net/
Remove WWW: Make http://www.compatibletonercartridge.net/ redirect to http://compatibletonercartridge.net/
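If your host doesn't offer a toggle like this, the "Add WWW" behaviour can usually be reproduced with a few lines of mod_rewrite in .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled and .htaccess overrides allowed (it sends any hostname that doesn't already start with "www." to its www counterpart):
RewriteEngine on
# If the requested host does not start with "www.", prepend it and 301-redirect
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [L,R=301]
-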
Using GWT is easy and a great help, but it doesn't affect the site itself when people come to the site from something other than Google, nor does it help the other search engines.
-
The most important and easiest step is to use Google Webmaster Tools. Go to Site Configuration, then Settings and set the Preferred Domain to either www or non-www.
Google:
"The preferred domain is the one that you would like used to index your site's pages. If you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we'll treat that link as if it was http://www.example.com."If you have access to .htaccess and not afraid to use it, then the following will help:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^yourdomain.com [NC]
RewriteRule (.*) http://www.yourdomain.com/$1 [L,R=301]
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://www.yourdomain.com.au/$1 [R=301,L]
This redirects the non-www domain to the www version, and also redirects the index.html to the www version. Edit accordingly.
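If you decide to standardize on the non-www version instead, the same approach works in reverse. A minimal sketch, again assuming Apache with mod_rewrite; substitute your own domain:
RewriteEngine on
# Redirect www.yourdomain.com to yourdomain.com
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
Whichever direction you choose, pick one version and redirect the other to it consistently.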
-
Hi Michael,
Let us know if you have questions about how to implement this. You will want to choose one version of your site, either www or non-www, and stick with that. Using Open Site Explorer to see whether www or non-www has the most backlinks can help with this decision.
Usually you use an .htaccess file to set this up, but sometimes you're on IIS or you don't have access to that file and need to work around it. If this is your case, let us know and we'll give you some help.
-
Yes, be careful with this.
Modify the .htaccess and redirect the example.com page to the www.example.com page.
More information: http://www.seomoz.org/learn-seo/redirection
Bye!
Related Questions
-
Who gets punished for duplicate content?
What happens if two domains have duplicate content? Do both domains get punished for it, or just one? If so, which one?
Technical SEO | Tobii-Dynavox -
How to deal with duplicated content on product pages?
Hi, I have a webshop with products in different sizes and colours. For each item I have a different URL with almost the same content (title tag, product descriptions, etc.). In order to prevent duplicate content, I'm wondering what is the best way to solve this problem, keeping in mind that it's impossible to create one page/URL per product with filters on colour and size, and impossible to rewrite the product descriptions to make them unique. I'm considering canonicalizing the rest of the colour/size variations, but the disadvantage is that if a product is not in stock it disappears from the website. Looking forward to your opinions and solutions. Jeroen
Technical SEO | Digital-DMG -
Effect of temporary subdomains on the root domain
Hi all, A client of ours tends to have a number of offers and competitions during the year and would like to host these competitions on separate sub-domains. Once a competition is over (generally these last for approximately a month) the sub-domain gets deleted and pointed back to the main site. I was wondering whether this would have any effects on the root domain. Can I get your opinion please?
Technical SEO | ICON_Malta -
Avoiding duplicate content on internal pages
Let's say I'm working on a decorator's website and they offer a list of residential and commercial services, some of which fall into both categories. For example, "Internal Decorating" would have a page under both Residential and Commercial, and probably even a third general category of Services too. The content inside the multiple instances of a given page (i.e. Internal Decorating) is at best going to be very similar, if not identical in some instances. I'm just a bit concerned that having three "Internal Decorating" pages could be detrimental to the website's overall SEO.
Technical SEO | jasonwdexter -
Duplicate Page Content Lists the same page twice?
When checking my crawl diagnostics this morning I see that I have the error Duplicate page content. It lists the exact same URL twice though and I don't understand how to fix this. It's also listed under duplicate page title. Personal Assistant | Virtual Assistant | Charlotte, NC http://charlottepersonalassistant.com/110 Personal Assistant | Virtual Assistant | Charlotte, NC http://charlottepersonalassistant.com/110 Does this have anything to do with a 301 redirect here? Why does it have http:// twice? Thanks all! | http://www.charlottepersonalassistant.com/ | http://http://charlottepersonalassistant.com/ |
Technical SEO | eidna22 -
Duplicate content by category name change
Hello friends, I have several problems with my website related to duplicate content. When we change any family name, for example "biodiversidad" to "cajas nido y biodiversidad", it creates duplicate content because mydomain.com/biodiversidad and mydomain.com/cajas-nido-y-biodiversidad have the same content. This happens every time I change the names of the categories or families. To avoid this, the first thing that comes to my mind is a 301 redirect from the old to the new URL, but I wonder if this can be done more automatically, maybe with a script? Any suggestions? Thank you
Technical SEO | pasape -
WordPress Duplicate Content Issues
Everyone knows that WordPress has some duplicate content issues with tags, archive pages, category pages, etc. My question is, how do you handle these issues? Is the smart strategy to use the robots meta tag and add nofollow/noindex to category pages, archive pages, tag pages, etc.? By doing this, are you missing out on the additional internal links to your important pages from your category pages and tag pages? I hope this makes sense. Regards, Bill
Technical SEO | wparlaman -
Duplicate content across multiple domains
I have come across a situation where we have discovered duplicate content between multiple domains. We have access to each domain and, within the past two weeks, added a 301 redirect to redirect each page dynamically to the proper page on the desired domain. My question relates to the removal of these pages. There are thousands of these duplicate pages. I have gone back and looked at a number of these cached pages in Google and have found that the cached pages are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will the 301 redirect even be read by Google so the pages are redirected to the proper domain and page? If so, when will that happen? Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait three months for the content to be removed, since my assumption is that this content is competing with the main site. I suppose another option would be to include a no-cache meta tag for these pages. Any thoughts or comments would be appreciated.
Technical SEO | jmsobe