www vs no-www duplicate: which should I use?
-
My site is no-www. I caught the duplicate in the archives. Will this be my fix?
Mike Davis, Online Marketing Manager at McKesson
- May 22, 2013
Easy fix:
In your .htaccess file, use this:

RewriteEngine On
RewriteCond %{HTTP_HOST} !^domain\.com$ [NC]
RewriteRule (.*) http://domain.com/$1 [R=301,L]

Remember to replace domain.com with your domain name.
Enjoy! -
touristips
You can use either without any negative effects whatsoever. The .htaccess above is correct for non-www. As a backup, also set a preferred domain in Google Webmaster Tools (GWMT): since you are using non-www, go to GWMT, click the tools wheel in the top right corner, go to Site Settings, and select the non-www version.
This will ensure all is correct.
Best
-
I would go for www. There are some technical reasons why you might want www.
I don't know the full reason, but because of the way some hosts configure their caching and cPanel Cloudflare integration, you may be forced to use www if you want to take advantage of those features. Take a popular host like SiteGround, for instance: if you want to use their SuperCacher and Cloudflare, you must use www.
Sorry I cannot provide clear technical reasons, as I don't fully understand it, but I have had to deal with this in the past. Before running into these issues, I used to think it didn't matter and had all my sites on non-www. It's not a problem on the whole, but you should be aware that in a small number of cases it might be (of course, you can avoid any problems by avoiding that particular configuration; it's just annoying, that's all).
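For anyone who does end up needing the www version for these hosting/CDN reasons, the mirror image of the rule posted earlier would look something like this. This is a sketch, assuming Apache with mod_rewrite enabled; replace domain.com with your own domain and test the redirect before relying on it:

```apache
RewriteEngine On
# Redirect any host that is not exactly www.domain.com to the www version
RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
RewriteRule (.*) http://www.domain.com/$1 [R=301,L]
```

Either direction works; the point is to pick one canonical host and 301 everything else to it.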
-
It doesn't matter which one you go for, so long as you stick with it to be consistent.
Most people these days seem to be going for the non-www option.
Related Questions
-
How to explain to a client that duplicate content is bad...
Afternoon! An SEO client of ours has copied a load of landing/category page content from other sites. Lots of emails have been sent back and forth asking them to remove it, but they are adamant about keeping it up there until we have time to amend it. We have explained to them:
- The Google penalty risks
- The copyright risks
- The short- and long-term implications for their brand new business/website
- The money they are spending on our SEO package could be completely wasted if they're caught
I think the above is pretty black and white, but the director of this company will not budge. Does anyone have any different approaches? The director said he's happy for us to amend the content but, in the meantime, the plagiarised content will not be removed. Cheers, Lewis
On-Page Optimization | PeaSoupDigital
-
Which speed test to use?
Hi, so I have a very easy question, I think. I am too new to SEO stuff to know better, but which speed test site is the best? I have a construction website. My issue is I am getting mixed results. Pingdom is showing a load time of 1.18s, 22 requests, and a performance grade of 94/100. GTmetrix is showing 98/87 with a 2.13s load time, and it varies from when I do it. So which do I go off of? With Pingdom, I feel pretty good. With GTmetrix, I feel like maybe I could improve. I ask because I have been looking into a CDN, but because it's like learning a foreign language to me, I do not want to go there unless I have to. Thank you for any advice on which speed test to view as most accurate. Chris
On-Page Optimization | asbchris
-
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks and this is how I found SEOmoz. My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page title. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin. Thank you, Kris
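If you do decide to keep the forum out of search engines entirely, the simplest approach is a robots.txt rule. This is a sketch, assuming the forum lives at /forum/ as described in the question; note that robots.txt only blocks crawling, so pages Google already knows about can linger in the index, and adding a noindex meta tag to the phpBB page templates is the more thorough fix:

```
User-agent: *
Disallow: /forum/
```

The trade-off is that the forum pages then can't bring in any search traffic of their own.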
On-Page Optimization | shapefit
-
Crawl Diagnostics - Duplicates and canonical problem
The SEOmoz crawl diagnostic reports duplicate (title, content) issues at this address: http://www.meblobranie.pl/biurowe/fotele-biurowe/promocje. The page already has a canonical tag - is this a bug in the crawler, or something wrong on the page?
On-Page Optimization | SITS
-
The best way to redirect to www
What is the best way to redirect non-www to www? I have seen a lot of solutions. Is this one OK?

RewriteCond %{HTTP_HOST} !^www\.seomoz\.org$ [NC]
RewriteRule (.*) http://www.seomoz.org/$1 [L,R=301]
On-Page Optimization | bruki
-
Keywords vs Tags In Wordpress
What's the difference between keywords vs tags in WordPress? Also, since meta keywords don't matter on a website, do they matter on a blog?
On-Page Optimization | | splashmedia0 -
Percentage of duplicate content allowable
Can you have ANY duplicate content on a page, or will the page get penalized by Google? For example, if you used a paragraph of Wikipedia content for a definition/description of a medical term, but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse? If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique-to-duplicate content? Thanks!
On-Page Optimization | sportstvjobs
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
- Course (starter, main, salad, etc)
- Cooking Method (fry, bake, boil, steam, etc)
- Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
- Setting a canonical tag
- Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
- Changing the title tag in the head dynamically based on what URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie
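For what it's worth, one common pattern for faceted search pages like these is to canonicalize every filter/pagination variation to the base search page. This is a sketch: find-a-recipe.php comes from the question, while example.com is a placeholder for the real domain. Each filtered URL would output this in its head:

```html
<link rel="canonical" href="http://www.example.com/find-a-recipe.php" />
```

The trade-off is that only the base page can then rank; if some filter combinations have real search demand, they may deserve their own indexable pages instead.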