Is the www and non-www issue really seen by Google as duplicate content?
-
I really don't understand how Google could possibly devalue a link just because the site displays the same content with and without www. I mean, has anybody actually seen a domain devalued recently because of this issue? I find it hard to believe, because when you get new web hosting, the standard setup is for the new website to display the same content both with and without www.
Is a redirect really necessary?
-
Google may be able to work out which version you want to go with, but is it the same one that Bing and other search engines will pick? And then you have the problem of www and non-www links: one will be redirected to the other somehow and will leak a bit of link juice. It's better that when someone copies your URL, it's always the same.
I prefer the non-www version because www is unnecessary. I believe it's an old Unix convention that isn't needed today, and if you have a long domain name, www just makes it that much more confusing.
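If you do go the redirect route, the usual fix on Apache is a single 301 rule in `.htaccess`. This is a minimal sketch, assuming the site runs Apache with mod_rewrite enabled and preferring the non-www version; `example.com` here is just a placeholder for your own domain:

```apache
# Permanently (301) redirect any www request to the non-www host,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
```

To prefer the www version instead, you would invert the condition to match hosts without a leading `www.` and redirect to `http://www.%{HTTP_HOST}`.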
-
Google is very good at figuring out that the www and non-www versions are the same site, so content duplication will not be an issue (this happens too often for them not to handle it properly). One advantage you do get is consolidating your link juice toward the same canonical version and therefore achieving better results. Set your preference in Google Webmaster Tools and stick to it everywhere, even in your email signatures and printed material.
As far as www goes, we've purposely dropped it and gone with non-www. I personally think www is silly and meaningless, but it does mean we occasionally have to police how webmasters write down and link our URL, and ask for the www to be removed when we find it. Not too hard if you monitor your brand via Google Alerts.
-
Better to have www than to go without it. Uniformity has always been an issue.
-
Hi Michael,
Nowadays Google is really good at figuring out which version of the website you want to go with, but with that said, this isn't a hard thing to fix. I'd say that as long as all your internal links consistently point to the same version, you shouldn't have anything to worry about. In the long run, adding the redirect won't give you a huge bump in rankings, but it is standard practice.
Casey
-
Better safe than sorry.
I looked around for some time to get an answer to this same question, and since no one could give a straight answer, and since even Google Webmaster Tools has an option for www or non-www, I think it's better to set up the 301 redirect.
Anyway, it's just an opinion.
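For what it's worth, the normalization a 301 redirect performs can be sketched in a few lines. This is purely an illustration, not anyone's production code; the function name `canonical_redirect` and the `example.com` URLs are made up for the demo:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_redirect(url, prefer_www=False):
    """Return the 301 target for `url` under the chosen host policy,
    or None if the URL is already canonical."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    has_www = host.startswith("www.")
    if prefer_www and not has_www:
        host = "www." + host
    elif not prefer_www and has_www:
        host = host[len("www."):]
    else:
        return None  # already on the canonical host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_redirect("http://www.example.com/page?id=1"))
# -> http://example.com/page?id=1
```

The point of doing this server-side with a 301 (rather than leaving it to search engines) is that every inbound link, whichever host it uses, resolves to one canonical URL.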