WWW vs non-WWW and understanding Open Site Explorer
-
Hi Guys,
New guy here with some questions regarding the difference between www and non www.
I am helping with a site at the moment, gradually working my way through bits and learning all the time. I was watching one of the SEOmoz videos and it brought my attention back to www vs non-www.
I understand that Google will treat these as two separate sites, but I wanted to check what the stats are telling me.
I was under the impression that www.mydummysite.com was getting most of the links, as this is what I have always used. However, when I used Open Site Explorer it told me something different:
www.mydummysite.com: Page Authority 32/100, Domain Authority 29/100, Linking Root Domains 5, Total Links 16
mydummysite.com: Page Authority 32/100, Domain Authority 29/100, Linking Root Domains 2, Total Links 1,500
Am I correct in saying that I should be adding a redirect from www.mydummysite.com to mydummysite.com? I am thinking this is telling me that I am potentially missing out on 1,500 links to my site, but it could mean I am missing out on just 16. Either way, I guess it's something I should fix, right?
Do I just redirect that page, or would all pages beneath it, such as mydummysite.com/news, also need redirecting?
Can I use rel=canonical links for this now?
Thanks for taking the time to read and reply!
-
Excellent, thank you for taking the time to come back to me.
(*note for the admin... just tried to reply to this via my HTC Desire and clicking this text box does not bring up the keypad on the phone. Not sure if this is a phone issue or the site not being recognised as a text box.)
-
My recommendation is: ignore everything else, decide which URL version you prefer, and go with that. Why? Because you have fewer than 10 linking domains and, in the long term, your site should grow and hopefully end up with hundreds of linking domains.
Also consider that you lose only a small amount of link juice (estimated between 1-10%) when you redirect a link. Of course you don't want to lose any, but you only have a few links now and hopefully will grow to many thousands.
The moz rank and trust factors will adjust after your URLs are redirected. Consolidating your linking power is the primary reason why this issue needs to be addressed.
If I were in your situation I would choose the "www" domain and update the links on your company's website. If you wish to go with the "non-www" site, that is perfectly fine as well. The important factor is that you make a decision, not which one you make.
-
Great, that was what I thought as well, which means I am learning!
The 1,500 links are coming from another site owned by the company, and I was going to say that the linking root domains are higher on the www site, so I would probably go with that. However, I just checked the full list of link metrics and it shows:
www.mydummysite.com: MozRank 3.86, MozTrust 5.59
mydummysite.com: MozRank 4.49, MozTrust 5.68
Am I therefore correct in saying that mydummysite.com would be the better one to go with?
Sorry for the silly questions, your help is appreciated. It's nearly 12pm here, I should go to bed!
-
If those were 1,500 links from 150 domains, I would definitely choose the URL version that was receiving those links. Since those links come from only 2 domains, they don't really carry much weight.
Who are the 1500 links from? Based on the information you shared it seems most likely those links are either from another site you control, or someone who is very friendly with your site. In either case, it should be a simple matter to update those links.
Your site's ranking on any given keyword can change with any change you make to your site. There are too many unknown variables to even take a guess, other than to say I wouldn't expect this change to have any immediate impact on your rankings.
-
Thanks for the reply, Ryan.
That helps a lot.
Am I correct in saying that I should select the site with 1,500 links? It seems like an obvious question.
I guess I am really trying to work out whether my current site rank and position can be improved on with a simple change and understanding the stats.
-
Hi Wayne.
The first step is for you to decide which version of your site is preferred, the "www" or "non-www" version. You are correct in checking existing links to your site and using this information as a factor in making a decision.
The 2 / 1,500 figure for your mydummysite.com URL shows there are 2 domains which provide 1,500 links to your site. I am going to guess that these are your own sites, and that you have a footer link or something similar which shows up on every page of the site.
If these links are under your control, you can update them to the "www" version of your site if that is what you prefer. Ultimately you need to make a decision one way or the other, and then stick with it.
The next step would be placing a redirect on your web server so that all traffic is redirected to your chosen domain. If you choose www.mydummysite.com, then whenever anyone enters "mydummysite.com" the address should automatically change to www.mydummysite.com once the redirect has been correctly set up. You can use seomoz.org as an example of how that should work.
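As an illustration only (the exact method depends on your server setup), on an Apache server with mod_rewrite enabled, a 301 redirect from the non-www to the www version might look like this in a .htaccess file; "mydummysite.com" here stands in for your real domain:

```apache
# Permanently (301) redirect all non-www requests to the www version,
# preserving the requested path (e.g. /news).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydummysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydummysite.com/$1 [R=301,L]
```

Because the rule captures the full request path, it covers every page under the domain, not just the homepage.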
The method of redirecting your site will vary based on your server's operating system (such as Windows vs Linux) and web server software (such as Apache vs LiteSpeed). If you are not familiar with how to make this change, contact your web host and they should be able to assist you.
This is a very common change which can easily be made. A single redirect rule will cover your entire site once it is properly configured. You could use canonical tags as well, but the 301 redirect approach mentioned above is the better solution.
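If you did go the canonical route instead, each page would carry a tag in its head pointing at the preferred URL; a minimal sketch, again assuming the "www" version is the one you choose:

```html
<!-- In the <head> of both mydummysite.com/news and www.mydummysite.com/news -->
<link rel="canonical" href="http://www.mydummysite.com/news" />
```

Unlike the server-side redirect, this tag has to be emitted on every page of the site, which is one reason the single 301 rule is usually preferred.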