Is the www vs. non-www issue really seen by Google as duplicate content?
-
I really don't understand how Google could possibly devalue a link just because the site displays the same content with www and without www. Has anybody recently seen a domain devalued because of this issue? I find it hard to believe, because it is standard when getting new web space that the new website displays the same content with and without www.
Is a redirect really necessary?
-
Google may be able to work out which version you want to go with, but is it the same one that Bing and other search engines will go with? And then you have the problem of www and non-www links: one will be redirected to the other somehow and will leak a bit of link juice. It's better that when someone copies your URL it is always the same.
I prefer the non-www version because www is unnecessary; I believe it's an old Unix convention that isn't needed today. If you have a long domain name, www just makes it that much more confusing.
-
Google is very good at figuring out that the www and non-www versions are the same site, so duplicate content will not be an issue (this happens too often for them not to handle it properly). One advantage you do have is consolidating your link juice toward the same canonical version and therefore achieving better results. Set your preference in Google Webmaster Tools and stick to it everywhere, even in your email signatures and printed material.
As far as www goes, we purposely dropped it and went with non-www. I personally think www is silly and meaningless; however, this means we have to police from time to time how webmasters write down and link our URL, and ask for the www to be removed if found. Not too hard if you monitor your brand via Google Alerts.
-
Better to have the www than to go without it. Uniformity has always been an issue.
-
Hi Michael,
Nowadays Google is really good at figuring out which version of the website you want to go with, but with that said, it isn't really that hard a thing to fix. I'd say that as long as all your internal links consistently point to the same version, you shouldn't have anything to worry about. In the long run, you won't see a huge bump in rankings from making the redirect, but it is a standard practice.
Casey
-
Better safe than sorry.
I looked around for some time to get the answer to this same question, and since no one could give a straight answer, and Google Webmaster Tools even has an option for www or non-www, I think it's better to set up the 301 redirect.
Anyway, it's just an opinion.
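For anyone setting up the 301 redirect recommended above: on an Apache server with mod_rewrite enabled it is usually just a few lines in the site's .htaccess file. A minimal sketch, using example.com as a placeholder for your own domain (swap the condition and target around if you prefer the www version):

```apache
# .htaccess — permanently (301) redirect all www requests to the non-www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent so search engines transfer link equity to the target, and the L flag stops further rule processing for the request.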
Related Questions
-
Does hreflang prevent my site from being penalized for duplicated content?
I am currently setting up a travel agency website. This site is going to target both American and Mexican customers. I will be working with an /es subdirectory. Besides showing the matching language version in the SERPs, would hreflang prevent my site's translated content (which is pretty much the same) from being penalized for duplicate content? Do I have to implement rel=canonical? Thank you in advance for any help you can provide.
On-Page Optimization | | kpi3600 -
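For reference, hreflang annotations for a setup like the one described (an English site plus an /es subdirectory) are conventionally a pair of alternate link elements in the head of each page, with each language version listing both itself and its counterpart. A sketch using travelexample.com as a placeholder domain:

```html
<!-- In the <head> of http://www.travelexample.com/tours/ (English version) -->
<link rel="alternate" hreflang="en" href="http://www.travelexample.com/tours/" />
<link rel="alternate" hreflang="es" href="http://www.travelexample.com/es/tours/" />
```

The Spanish page at /es/tours/ would carry the same two annotations. Note that rel=canonical on each page should be self-referencing here, not pointing across languages, or the hreflang signals can be ignored.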
Duplicate Page content | What to do?
Hello Guys, I have some duplicate pages detected by Moz. Most of the URLs are from a registration process for users, so the URLs all look like this: www.exemple.com/user/login?destination=node/125%23comment-form What should I do? Add this to robots.txt? If so, how? What's the directive to add in Google Webmaster Tools? Thanks in advance! Pedro Pereira
On-Page Optimization | | Kalitenko20140 -
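Keeping crawlers out of a login/registration URL pattern like the one in that question is typically done with a Disallow rule in robots.txt. A sketch assuming the /user/login path shown above; the same file can be submitted and tested in Google Webmaster Tools:

```
# robots.txt — block crawling of user login/registration URLs
# (matches /user/login and any query-string variants under it)
User-agent: *
Disallow: /user/login
```

Disallow prevents crawling but not necessarily indexing of already-known URLs; a noindex robots meta tag on the page itself is the stronger option if the URLs are already in the index.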
Www.colourbanners.co.uk/ & colourbanners.co.uk showing up as two separate URLs - is this going to be a duplicate content issue?
Hi Guys, I have just created a report in Moz and there appear to be 91 duplicate content issues with the site which I need to fix, as I think they could be the reason why we are suffering from a penalty. One of the main questions I have is about these 3 variations of the URL: http://www.colourbanners.co.uk/, http://colourbanners.co.uk and http://colourbanners.co.uk/. Each has links pointing to it. My question is, could this be causing a duplicate content issue? Regards, Gerry
On-Page Optimization | | gezzagregz0 -
Help With Duplicated Content
Hi Moz Community, I am having some issues with duplicated content. I recently removed the .html from all of our links, and Moz has reported it as duplicated. I have been reading up on canonicalization and would like to verify some details: when using the canonical tag, would it be placed in /mywebpage.html or /mywebpage? I am having a hard time sorting this out, so any help from you SEO experts would be great 🙂 I have also updated my htaccess file with the following. Thanks in advance
On-Page Optimization | | finelinewebsolutions0 -
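For the .html-removal situation described above, the usual pattern is that both URL variants carry a canonical tag pointing at whichever version is preferred. A sketch assuming the extensionless URL is the preferred one, with example.com as a placeholder domain:

```html
<!-- In the <head> of the page, served at both /mywebpage and /mywebpage.html -->
<link rel="canonical" href="http://www.example.com/mywebpage" />
```

Since the same template is typically served for both URLs, the tag only needs to be added once; the .html version then declares the extensionless version as canonical, consolidating the duplicate signals.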
Duplicate Page Title
Wordpress category pagination causes duplicate page title errors (i.e., when there are so many posts in a category, it paginates them). Is this a problem? Your tool is reporting it as a problem, but ProPhoto (my WordPress provider) says it is not. Here are the 2 URLs with the same page title: http://www.lisagillphotography.co.uk/category/child-photography/ http://www.lisagillphotography.co.uk/category/child-photography/page/2/
On-Page Optimization | | LisaGill0 -
Duplicate Page Title
Not sure how to fix this. I am getting a duplicate page title for my main URL and the index page. I have attached an image. Thanks. 0RnG6.jpg
On-Page Optimization | | pixel830 -
Duplicate Content Warning
Hi Mozers, I have a question about the duplicate content warnings I am receiving for some of my pages. I noticed that the below pattern of URLs is being flagged as duplicate content. I understand that these are seen as two different pages, but I would like to know if this has a negative impact on my SEO. Why is this happening? How do I stop it from happening? http://www.XXXX.com/product1234.html?sef_rewrite=1 http://www.XXXX.com/product1234.html Thanks in advance!
On-Page Optimization | | mozmonkey0 -
Can duplicate content issues be solved with a noindex robot metatag?
Hi all, I have a number of duplicate content issues arising from a recent crawl diagnostics report. Would using a robots meta tag (like the one below) on the pages I don't mind not being indexed be an effective way to solve the problem? Thanks for any/all replies
On-Page Optimization | | joeprice0
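The markup of the robots meta tag referenced in that question did not survive in the post, but the standard form of a tag that keeps a page out of the index while still letting crawlers follow its links is:

```html
<!-- In the <head> of each page that should be excluded from the index -->
<meta name="robots" content="noindex, follow" />
```

This does generally resolve duplicate content warnings for the tagged pages, though for true duplicates a rel=canonical pointing at the preferred version is usually preferable, since it consolidates link equity rather than discarding the page.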