Help With Preferred Domain Settings, 301 and Duplicate Content
-
I've seen some good threads on this topic in the Q&A archives, but I feel it deserves a fresh perspective, as many of those discussions are almost four years old.
My Webmaster Tools preferred domain setting is currently the non-www version. I didn't choose this; it was already set that way when I first started using WM Tools.
However, I have built the majority of my links with the www, which I've always viewed as part of the web address.
When I put my site into an SEOmoz campaign, it recognized the www version as a subdomain, which I thought was strange, but now I realize it's due to the www vs. non-www preferred domain distinction.
A look at site:mysite.com shows that Google is indexing both the www and non-www versions of the site. My site appears healthy in terms of traffic, but my sense is that a few technical SEO items are holding me back from a breakthrough.
QUESTION to the SEOmoz community:
What the hell should I do? Change the preferred domain setting? 301 redirect from the non-www domain to the www domain?
Google suggests this: "Once you've set your preferred domain, you may want to use a 301 redirect to redirect traffic from your non-preferred domain, so that other search engines and visitors know which version you prefer."
Any insight would be greatly appreciated.
-
The worst thing you can do is nothing.
Many different URLs can all lead to the same page: with and without the www, with and without a trailing slash, via /index.html, and so on. If you don't let Google know which version of the page is correct, you will suffer the consequences of duplicate content.
What happens is that Google doesn't know which page is correct. It will pick one of the non-www versions, because that is what your Google WMT is set up to prefer, while the other versions of the pages are still being linked and crawled.
You are sending your link juice to a page, but it is a complete waste, as that page is not the one Google is considering for the SERPs. You MUST resolve this issue if you care about SEO at all.
-
Thanks Ryan. So, if most of the links (including all internal links) are built with the www format, then is it wise to change the preferred domain setting to www and redirect the non-www domain to the www domain?
Am I likely to damage rankings/traffic by doing this? What happens if I just leave it as is?
-
You are welcome to do so. Go to Google WMT, change your current option to the www version, then adjust your .htaccess file as Steven suggested.
Also, canonicalize your pages to help ensure this issue can't happen again. Your .htaccess changes will work as long as the file is there, but things happen, so it's better to be covered.
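A minimal sketch of what canonicalizing a page looks like, assuming you settle on the www version (mysite.com and the path here are placeholders, not your real URLs):

```html
<!-- In the <head> of each page, pointing at the one true URL for that page -->
<link rel="canonical" href="http://www.mysite.com/some-page/">
```

Whatever version you pick, every page's canonical tag should point at that page's preferred URL, matching the destination of your 301s.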
-
Guys,
Thanks for the input. I just want to do what is best for traffic and the site. I don't want to do anything that is going to tank my rankings and visitors.
I don't get a lot of type-in traffic.
The www is the main way the links have been built, so why not just redirect those to the non-www version?
-
As Ryan said, make a decision. The easiest way to make sure either decision sticks is to use an .htaccess file and rewrite to your preferred version.
If using the www version:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^[0-9]+(\.[0-9]+){3} [OR]
RewriteCond %{HTTP_HOST} ^mydomain\.com [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [L,R=301]
If using the non-www version:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com [NC]
RewriteRule ^(.*)$ http://mydomain.com/$1 [L,R=301]
A few other questions to keep in mind:
Do you get a lot of type-in traffic?
Do they tend to type the www?
In the SERPs it is easier to read the domain name without the www if you're looking for a specific domain. Do you have a brand built where people just say your domain name?
-
You need to make a decision. Do you want your site address to be seen with or without the www?
Try to assess which version of your URL would require the least number of re-directs. You mentioned the links you built mostly include the www. Take a look at all of your links. You may have a higher number of organic links without the www. Evaluate all the links, then make a decision.
Once you make a decision, stick with it. Canonicalize all your pages with the correct version of the URL. Search your site for all internal links and standardize them.
While you are on this project, standardize whether you use a trailing "/" at the end of your URLs as well. www.mysite.com/page is not the same URL as www.mysite.com/page/. I make this suggestion because if you are going to go through the painful process of standardizing your site for the www issue, you should resolve all of these issues at once.
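If you decide to drop the trailing slash, a minimal .htaccess sketch of the idea (an assumption-laden example: it presumes Apache with mod_rewrite, and deliberately leaves real directories alone so Apache can still handle them normally):

```apache
RewriteEngine On
# Skip real directories so Apache's own directory handling is untouched
RewriteCond %{REQUEST_FILENAME} !-d
# Strip a single trailing slash with a permanent redirect
RewriteRule ^(.*)/$ /$1 [L,R=301]
```

The same pattern works in reverse if you prefer to enforce trailing slashes; the point is to pick one form and 301 everything else to it.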
Related Questions
-
Duplicate content on product pages
Hi, We are considering the impact when you want to deliver content directly on the product pages. If the products were manufactured in a specific way and it's the same process across 100 other products, you might want to tell your readers about it. If you believe the product page is the best place to deliver this information for your readers, then you could potentially be creating mass content duplication. Especially as the storytelling of the product could equate to 60% of the page content, this could really flag as duplication. Our options would appear to be:
1. Instead add the content as a link on each product page to one centralised URL and risk taking users away from the product page (not going to help with conversion rate or the designers' plans)
2. Put the content behind some JavaScript which requires interaction, hopefully deterring the search engine from crawling the content (doesn't fit the designers' plans, and users have to interact, which is a big ask)
3. Assign one product as the canonical and risk the other products not appearing in search for relevant searches
4. Leave the copy as crawlable and risk being marked down or de-indexed for duplicated content
It seems the search engines do not offer a way for us to serve this great content to our readers without being at risk of going against guidelines, or without the search engines being unable to crawl it. How would you suggest a site should go about this for optimal results?
Intermediate & Advanced SEO | | FashionLux2 -
Contextual FAQ and FAQ Page, is this duplicate content?
Hi Mozzers, On my website, I have an FAQ page (with the questions and responses for all the themes (prices, products, ...) of my website), and I would like to add some thematic FAQs to the pages of my website. For example: adding the FAQ about pricing to my pricing page. Is this duplicate content? Thank you for your help, regards. Jonathan
Intermediate & Advanced SEO | | JonathanLeplang0 -
Duplicate page content errors stemming from CMS
Hello! We've recently relaunched (and completely restructured) our website. All looks well except for some duplicate content issues. Our internal CMS (custom) adds a /content/ to each page. Our development team has also set up URLs to work without /content/. Is there a way I can tell Google that these are the same pages? I looked into the parameters tool, but that seemed more in line with ecommerce and the like. Am I missing anything else?
Intermediate & Advanced SEO | | taylor.craig0 -
Can a website be punished by panda if content scrapers have duplicated content?
I've noticed recently that a number of content scrapers are linking to one of our websites and have the duplicate content on their web pages. Can content scrapers affect the original website's ranking? I'm concerned that having duplicated content, even if hosted by scrapers, could be a bad signal to Google. What are the best ways to prevent this happening? I'd really appreciate any help as I can't find the answer online!
Intermediate & Advanced SEO | | RG_SEO0 -
Duplicate Content
Hi, So I have my great content (that contains a link to our site) that I want to distribute to high-quality, relevant sites in my niche as part of a link building campaign. Can I distribute this to lots of sites? The reason I ask is that those sites will then have content that duplicates all the other sites I distribute the content to, won't they? Is this duplication bad for them and/or us? Thanks
Intermediate & Advanced SEO | | Studio330 -
WMT Index Status - Possible Duplicate Content
Hi everyone. A little background: I have a website that is 3 years old. For a period of 8 months I was in the top 5 for my main targeted keyword. I seemed to have survived the man-eating Panda, but I'm not so sure about the bloodthirsty Penguin. Anyway; my homepage, along with other important pages, has been wiped off the face of Google's planet. First I got rid of some links that may not have been helping and disavowed them. When this didn't work I decided to do a complete redesign of my site with better content, cleaner design, removed ads (only had 1) and incorporated social integration. This has had no effect at all. I filed a reconsideration request and was told that I have NOT had any manual spam penalties made against me; by the way, I never received any warning messages in WMT. SO, what could be the problem? Maybe it's duplicate content? In WMT the Index Status indicates that there are 260 pages indexed. However, I have only 47 pages in my sitemap, and when I do a site: search on Google it only retrieves 44 pages. So what are all these other pages? Before I uploaded the redesign I removed all the current pages from the index and cache using the remove URL tool in WMT. I should mention that I have a blog on Blogger that is linked to a subdomain on my hosting account, i.e. http://blog.mydomain.co.uk. Are the blog posts counted as pages on my site or on Blogger's servers? Ahhhh this is too complicated lol Any help will be much appreciated! Many thanks, Mark.
Intermediate & Advanced SEO | | Nortski0 -
Duplicate Content/ Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings to my site. A new page is created as a new property comes to market, and then the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content, because anyone else using the same service here in Las Vegas will have thousands of the exact same property pages that I do. Any thoughts on this, and is there a way that I can have the search engines only index the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See link for example http://www.mylvcondosales.com/mandarin-las-vegas/
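If the IDX listing pages share a template you control, one possible approach (a sketch only; whether your IDX provider exposes the template is an assumption) is to add a robots meta tag to that listing template while leaving the core pages untouched:

```html
<!-- In the IDX listing template only: keep these pages out of the index
     but still let crawlers follow their links -->
<meta name="robots" content="noindex, follow">
```

The core 20 pages would simply carry no such tag, so they remain indexable. Blocking the listing paths in robots.txt is an alternative, but note that a blocked URL can still appear in results if it is linked externally, whereas noindex removes it once recrawled.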
Intermediate & Advanced SEO | | AnthonyLasVegas0 -
Multiple domains expiring that have 301 redirects to my primary domain. Am I in trouble?
I recently took on the SEO of a large website with http://example.com. My predecessor bought 40-plus domains for specific cities, like Jacksonvilleexample.com, Miamiexample.com, etc. ZERO of the additional domains linked to our main website. The domains that were bought basically had our exact same website in terms of content, links, etc., mirroring our main http://example.com. I added 301 redirects to help with problems that may be a result of this type of structure. Some of the additional domains were indexed and some were not, but all have 301s, and as far as traffic is concerned I'm not worried about losing short-term traffic. My question: All the domains are set to expire in June, and I don't want to continue to have them 301 redirected to my main domain (example.com). I'm not trying to avoid the additional cost of all the domains, but I don't see an advantage to having them. CAN letting all these domains expire hurt me from a long-term SEO position if I don't renew them?
Intermediate & Advanced SEO | | ballanrk0