.htaccess and www - non www
-
Recently I took over a website and made a pretty colossal mistake. The site had been properly set up via .htaccess to redirect to the www domain. I typically roll without www, and because the site had hundreds of other fundamental mistakes, I wrongly assumed the .htaccess had not been set up correctly either.
After a couple of days I noticed the mistake, but some of our new (non-www) URLs have already picked up some solid links. So now I feel that I am in a nightmare of creating redirects. Should I switch back to www or not? Does it even matter at this point?
-
Thanks for the reply. There is no duplicate data; I am very confident about this, and I have properly constructed 301s to the correct canonical URL. I should have laid the question out a bit better: the site is now defined through and through as the non-www address, but almost all of the inbound links point to the www version and reach the site through a 301. It appears I am missing out on the link juice from those links.
-
Either way is OK, but not both.
At least when you have a 301 redirect in place, people who grab your URL will always grab the same version of it.
-
Thanks guys. I was pretty sure that is what I had to do but it is always so helpful in a predicament to get expert opinions.
Thanks again!!
-
Barry said it all.
I recommend you 301-redirect all pages without "www." to the fully qualified URLs containing "www.".
You'll keep roughly 90% of the link juice from existing backlinks, so just go for it.
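For reference, a minimal .htaccess sketch of that kind of host-level 301 (assuming Apache with mod_rewrite enabled, and example.com as a placeholder domain to swap for your own) might look like this:
RewriteEngine On
# Send every non-www request to the www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
Flipping the condition and the target would enforce the non-www version instead; the important part is picking one hostname and redirecting everything else to it.
-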
You almost certainly want to define it one way or the other at some point.
Yes, you'll lose a little bit of power through the redirects once you've done them, but at least afterwards you can be sure all the links are going to point to the right place.
The question is whether you are better off with 90+% of the value pointing at one version or 50% pointing at each.
Related Questions
-
.htaccess problem causing 605 Error?
I'm working on a site; it's just a few HTML pages and I've added a WP blog. I've just noticed that Moz is giving me the following error with reference to http://website.com (Webmaster Tools is set to show the www subdomain, so that appears OK): Error Code 605: Page Banned by robots.txt, X-Robots-Tag HTTP Header, or Meta Robots Tag. Here's the code from my .htaccess; is this causing the problem?
RewriteEngine on
Options +FollowSymLinks
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://www.website.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ http://www.website.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} ^website.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Thanks for any advice you can offer!
Technical SEO | Stevie-G
-
After I 301 redirect duplicate pages to my rel=canonical page, do I need to add any tags or code to the non-canonical pages?
I have many duplicate pages. Some pages have 2-3 duplicates. Most of these have uppercase and lowercase paths (generated by Microsoft IIS). Does this implementation of 301s and rel=canonical suffice? Or is there more I could do to optimize the passing of duplicate-page link juice to the canonical? THANK YOU!
Technical SEO | PFTools
-
Numerous 404 errors in crawl diagnostics (non-existent pages)
As new as they come to SEO, so please be gentle... I have a WordPress site set up for my photography business. Looking at my crawl diagnostics I see several 4xx (client error) alerts. These all point to non-existent pages on my site, e.g.: http://www.robertswanigan.com/happy-birthday-sara/109,97,105,108,116,111,58,104,116,116,112,58,47,47,109,97,105,108,116,111,58,105,110,102,111,64,114,111,98,101,114,116,115,119,97,110,105,103,97,110,46,99,111,109 Totally lost on what could be causing this. Thanks in advance for any help!
Technical SEO | Swanny811
-
.htaccess: Multiple URLs catch the same filename
Hi, I have the following line in my .htaccess:
RewriteRule privacy stdpage.php?slug=privacy [L]
So if you go to www.mysite.com/privacy, it serves stdpage.php with the argument above. But if you go to www.mysite.com/privacysssssssss, it catches the same file. How can I prevent this? It will give me multiple URLs with the exact same content. I have a 404 page which I would like to show instead when the match is not 100%. -Rasmus
Technical SEO | rasmusbang
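One minimal sketch of a fix, assuming the rule lives in that same .htaccess and that an optional trailing slash should still be accepted, is to anchor the pattern so only an exact match is rewritten and anything longer falls through to normal 404 handling:
# Only rewrite the exact /privacy path (with optional trailing slash)
RewriteRule ^privacy/?$ stdpage.php?slug=privacy [L]
-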
Mod Rewrite / .htaccess avoid duplicate content
I have been searching and testing for hours but cannot find a solution. I am able to get a URL to display without the file extension, i.e. domain.com/file instead of domain.com/file.php. The problem is that both versions of the URL above work, and therefore there is a duplicate content issue. How can I force the URL with the file extension not to resolve and give a 404 error? Or just redirect it to the extensionless URL? If it helps, here is my code:
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L,QSA]
Technical SEO | MiamiWebCompany
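A common pattern for the redirect half, sketched here as an assumption rather than a tested fix for this exact site, is to 301 any direct request for a .php URL back to its extensionless form before the internal rewrite above runs; matching against THE_REQUEST (the original request line from the browser) keeps it from looping on the internal rewrite:
# Place above the existing rules: redirect direct /file.php requests to /file
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/([^\s?]+)\.php[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
-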
Redirect non-www if using canonical URL?
I have set up my website to use canonical URLs on each page to point to the page I wish Google to refer to. At the moment, my non-www domain name is not redirected to the www domain. Is this required if I have set up the canonical URLs? This is the tag I have on my index.php page: <link rel="canonical" href="http://www.mydomain.com.au" /> If I browse to http://mydomain.com.au, should the link juice pass to http://www.armourbackups.com.au? Will this solve duplicate content problems? Thanks
Technical SEO | blakadz
-
Should I use www. or not in my main URL?
I have backlinks coming into my homepage, which has both a www. URL and one that's merely http://mysite.com. Which is the preferred URL for best optimization for search engines and how do I find this out?
Technical SEO | NetPicks
-
Follow up from http://www.seomoz.org/qa/discuss/52837/google-analytics
Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics. To summarize, to implement what we need, we need to do three things:
1) Add GA code to the Darden page:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
2) Change links on the Darden page, such as the ones on http://www.darden.virginia.edu/web/MBA-for-Executives/, from
<a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a>
into
<a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a>
3) Have Symplicity add this code:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
Our CMS does not allow the user to add onclick to the link, so we CANNOT implement part 2). What will be the result if we have only 1) and 3) implemented? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code? Nick
Technical SEO | Darden