Is the www and non-www issue really seen by Google as duplicate content?
-
I really don't understand how Google could possibly devalue a site because it serves the same content with www and without www. I mean, has anybody actually seen a domain devalued recently because of this issue? I somehow can't believe it, because it is standard when getting new webspace that the new website displays the same content with and without www.
Is a redirect really necessary?
-
Google may be able to work out which version you want to go with, but is it the same one that Bing and the other search engines will go with? And then you have the problem with www and non-www links: one will be redirected to the other somehow and will leak a bit of link juice. It's better that when someone copies your URL, it's always the same.
I prefer the non-www version, because www is unnecessary; I believe it's an old Unix thing, not needed today. If you have a long domain name, the www just makes it that much more confusing.
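For reference, the usual way to enforce one hostname is a permanent (301) redirect at the server. A minimal sketch for Apache, assuming mod_rewrite is enabled and you prefer the non-www version (invert the condition if you prefer www):

```apache
RewriteEngine On
# Redirect any www.* host to the bare domain, preserving the path and query string
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301]
```

Other servers have equivalents (e.g. a separate `server` block with a `return 301` in nginx); the point is that the redirect happens once, server-side, for every URL.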
-
Google is very good at figuring out that the www and non-www versions are the same site, so duplicate content will not be an issue (this happens too often for them not to handle it properly). One advantage you do get from redirecting is consolidation of your link juice toward the same canonical version, and therefore better results. Set your preference in Google Webmaster Tools and stick to it everywhere, even in your email signatures and printed material.
As far as www goes, we've purposely dropped it and gone with non-www. I personally think www is silly and meaningless, but it does mean we have to police from time to time how webmasters write down and link our URL, and ask for the www to be removed if we find it. Not too hard if you monitor your brand via Google Alerts.
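When policing links like that, it helps to normalize URLs to your canonical form before comparing them. A small sketch in Python; the helper name and the default preference for non-www are my own assumptions, not part of any Moz tool:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str, prefer_www: bool = False) -> str:
    """Normalize a URL to one canonical hostname (non-www by default)."""
    parts = urlsplit(url)
    host = parts.netloc.lower()  # hostnames are case-insensitive
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonicalize("http://www.Example.com/page?x=1"))  # http://example.com/page?x=1
```

Running every inbound link you find through a normalizer like this makes it obvious which ones point at the wrong hostname and need a removal request.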
-
Better to have www than without. Uniformity has always been the issue.
-
Hi Michael,
Nowadays Google is really good at figuring out which version of the website you want to go with, but with that said, it isn't really that hard a thing to fix. I'd say that as long as all your internal links are consistent in pointing to the same version, you shouldn't have anything to worry about. In the long run, you won't see a huge bump in rankings from adding the redirect, but it is a standard practice.
Casey
-
Better safe than sorry.
I looked around for some time to get an answer to this same question, and since no one could give a straight answer, and even Google Webmaster Tools has an option for www or non-www, I think it's better to set up the 301 redirect.
Anyway, it's just an opinion.
Related Questions
-
Google Rich Snippet
So I have been implementing rich snippets at work and all was good until now. As you can see below, the meta description has all of a sudden included the review date. The review date is the only date on the page. Any ideas what could be causing this? Thanks
On-Page Optimization | David-McGawn0 -
Is it better to create more pages of content or expand on current pages of content?
I am assuming that one way of improving the rankings of current pages is to create more content around the keywords used. Should this be an expansion of the content on the current pages I am optimising for a keyword, or is it better to keep creating new pages? And if we are creating new pages, is it best to use an extension of the keyword on the new page? For example, if we are optimising one page for 'does voltage optimisation work', would it then be worth creating a page optimised for 'does voltage optimisation work in hotels', and so on? I am guessing both might help; this is just a question I have had from one of my clients.
On-Page Optimization | TWSI1 -
How Can I Fix Adobe Bridge Photo Galleries and Duplicate Content?
I have used the Adobe Bridge program for a number of photo galleries on a remodeling site, and it is showing a large number of duplicate titles, etc. Is there an easy fix for this? Anyone?
On-Page Optimization | DaveBrown3330 -
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks and this is how I found SEOmoz. My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page title. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin. Thank you, Kris
On-Page Optimization | shapefit0 -
Duplicate content
Hello, I have two pages showing duplicate content. They are: http://www.cedaradirondackchairs.net/ http://www.cedaradirondackchairs.net/index Not sure how to resolve this issue. Any help would be greatly appreciated! Thanks.
On-Page Optimization | Ronb10230 -
Not using H1s with keywords to simulate natural, non-SEO'd content?
There has been a lot of talk lately about making a website seem like it is not SEO'd, to avoid over-optimization penalties with the recent Google algorithm updates. Has anyone come across the practice of not using headings (H1s, H2s, etc.) properly, to make it seem that the current webpage isn't over-optimized? I've come across a site that used to use multiple keywords within their headings, and now they are using none. In fact, they are marking their company name and logo as an H1 and using non-keyworded H2s such as Our Work or Contact. Is anyone holding back on their old SEO tactics so as not to seem over-optimized to Google? Thanks!
On-Page Optimization | DCochrane0 -
How do I avoid duplicate content and page title errors when using a single CMS for a website
I am currently hosting a client site on a CMS with both a Canadian and a USA version of the website. We have the .com as the primary domain, and the .ca is redirected at the registrar to the Canadian home page. The problem I am having is that my campaign produces errors for duplicate page content and duplicate page titles. Is there a way to set up the two versions on the CMS so that these errors are not produced? My concern is getting penalized by search engines. Appreciate any help. Mark Palmer
On-Page Optimization | kpreneur0 -
Where does Google say this?
Just came across this article: http://www.searchmarketingstandard.com/tips-for-avoiding-thin-content And it states, "Google says that it will ignore pages with less than 200 words of body text." I submitted a comment to the author, but was wondering in the meantime if anyone knows where Google says this?
On-Page Optimization | nicole.healthline0