How do I fix twin home pages?
-
Search engine analysis indicates that my site has twin home pages (www.mysite.com and http://mysite.com).
The error message I'm getting is: "Your website resides at both www.mysite.com and mysite.com."
My uploaded index page is a .htm page (not .html). I don't know if that matters.
Can someone explain how this happened and what I can do to fix it?
Thanks!
-
Hi FinalFrontier,
I agree with setting up a 301 redirect to a single version. I also recommend doing the following:
- Set up canonical URLs to your desired version
- Ensure that your XML sitemaps use your desired version
- Add both www and non-www to Google Webmaster Tools and select one as the URL you'd like displayed in search results
Best of luck!
Chris
-
If you look at the redirect code the web host provided in their instructions, I noticed there is no [NC] at the end of the RewriteCond line. I'm not sure whether that [NC] is necessary or not.
Other than that, and the possible time lag you mention, I'm at a loss.
-
It could just be a time lag in our data (and that wouldn't shock me), but run a header checker and make sure the 301 is working properly.
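For example, one quick way to check the response headers yourself (just a sketch, assuming you can run curl from a command line) is:
curl -I http://mysite.com/
If the 301 is in place, the response should start with something like "HTTP/1.1 301 Moved Permanently" and include a "Location: http://www.mysite.com/" header pointing at your preferred version.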
-
Well, this isn't making any sense.
I made the following change to my .htaccess file, following the instructions given by my web host:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com
RewriteRule (.*) http://www.mysite.com/$1 [R=301,L]
Then I ran another SEOmoz crawl of the root domain a couple of hours later, and it still reported the same errors on my home page (duplicate home page content and titles).
I just checked my .htaccess file again, and it did save those 301 redirect changes. So why am I still getting duplicate page errors? thx.
-
Yeah, it sounds like you're not currently having major issues. It's still good to prevent these issues (and duplicates are a real concern), but I strongly suspect you can ease into this one.
-
Thanks for your post.
Google is indexing all my www pages (including www.mysite.com), but (I guess this is good news?) no documents show up for the:
site:mysite.com -inurl:www
in Google.
-
Since this issue can occur site-wide, I do tend to agree with Anton that 301-redirects are a better solution for this particular problem (although canonical tags will work, if that's your only feasible option). It is important, as implied in the comments, to make sure that your internal links are consistent and that you aren't using both versions within your site (although, with "www" vs. non-www, that's pretty rare).
Practically, it depends a lot on the size of your site, whether you have links to both versions, and whether Google has indexed both versions. This is a problem in theory, but it may not currently be a problem on your site. You can check the indexed pages of the root domain and the www subdomain separately in Google with these commands:
site:mysite.com inurl:www
site:mysite.com -inurl:www
(the first pulls up anything with "www", and the second only pages without it).
If you're seeing both in play, then sorting out how to do the 301-redirects is a good bet. If you're not, then it's still a solid preventive measure, but you don't need to panic.
-
It can have a pretty major impact on search rankings. Basically, what's happening is that you have two identical URLs for every intended page on your site, which creates duplicate content issues.
So for example...
Someone finds something on your site that they like at www.yoursite.com/example/ and links to it from their site or shares it on Twitter, which increases the ranking power for that page.
Another person finds the same content at yoursite.com/example/ and links to it as well.
Instead of consolidating all the benefit of those links onto a single page, you're splitting it across two URLs and roughly halving each one's ranking potential.
-
How big of an issue is this for search engines? I'm indexed in Bing, Google, Yahoo.
I'm curious as to how big (or small) an impact this really has on a website.
thx.
-
Hi Final Frontier,
Most hosting providers will likely add this to your .htaccess file for you if you contact technical support. I know HostGator will happily provide that kind of help. If not, I'd be glad to add the lines if you'll download the file and email it to me.
-
Thanks, but I'm more confused now than ever, and I don't know how to change a .htaccess file, so I don't want to turn this into a DIY project and screw things up even more. I get the gist of what the problem is.
All my internal pages link back to www.mysite.com and to www.mysite.com/pages.htm throughout the site.
However, I noticed that the img src for a Facebook link on my site (an external link) mistakenly points to http://mysite.com/facebook (no www). So I'll at least fix that to include the www so there's consistency. I'm not sure if that's related to the problem - I haven't seen any other pages that link to http://mysite.com instead of www.mysite.com.
I've learned a lot here, but this is one technical thing I don't want to do myself and make things worse.
-
From: http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
There is usually a better solution
The canonical tag is not a replacement for a solid site architecture that doesn’t create duplicate content in the first place. There is almost always a superior solution to the canonical tag from a pure SEO best practice perspective.
Let's go through some of the URL examples I provided above; this time we'll talk about how to fix them without the canonical tag.
Example 1: http://www.example.com/quality-wrenches.htm
This is a duplicate version because our example website resolves with both the www version and the non-www version. If the canonical tag was used to pull the www version out of the index (keeping the non-www version as the canonical one) both versions would still resolve in the browser. With both versions still resolving, both versions can still continue to generate links.
A canonical tag, as with a 301 redirect, does not pass all of the link value from one page to another. It passes most of it, but not all. We estimate that the link value loss with either of these solutions is 1-10%. In this way, a 301 redirect and a canonical tag are the same.
I'd recommend a 301 redirect instead of a canonical tag.
Why, you ask? A 301 redirect takes the link value loss hit once. Once a 301 is in place, a user never lands on the duplicate URL version. They are redirected to the canonical version. If they decide to link to the page, they are going to provide that link to the canonical version. No link love lost. Compare that to the canonical tag solution which keeps both URLs resolving and perpetuates the link value loss.
From Rand's Article: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
- Whereas a 301 redirect re-points all traffic (bots and human visitors), the Canonical URL tag is just for engines, meaning you can still separately track visitors to the unique URL versions.
- A 301 is a much stronger signal that multiple pages have a single, canonical source. While the engines are certainly planning to support this new tag and trust the intent of site owners, there will be limitations. Content analysis and other algorithmic metrics will be applied to ensure that a site owner hasn't mistakenly or manipulatively applied the tag, and we certainly expect to see mistaken use of the tag, resulting in the engines maintaining those separate URLs in their indices (meaning site owners would experience the same problems noted below).
- 301s carry cross-domain functionality, meaning you can redirect a page at domain1.com to domain2.com and carry over those search engine metrics. This is NOT THE CASE with the Canonical URL tag, which operates exclusively on a single root domain (it will carry over across subfolders and subdomains).
Rel Canonical is a great tool, but I have to disagree here. www.mysite.com is a sub-domain of mysite.com. Adding rel canonical tags to every page on the site would only send a signal to search engines specifying the preferred content, but adding a 301 redirect to the root domain one time will send all traffic, robots, and link juice to the preferred domain on a permanent basis.
-
Hi!
An easier way to fix the problem is with canonical tags (if you're not familiar with .htaccess or server-side scripts).
You can find Rand Fishkin's excellent article about it here:
http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Good luck!
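For reference, the tag itself is just one line in the <head> of each page, pointing at the version of that page you want indexed - roughly like this (a sketch only; you'd use the real preferred URL of each individual page):
<link rel="canonical" href="http://www.mysite.com/" />
You'd add it to both the www and non-www versions (they're usually served from the same files anyway), so the engines consolidate everything onto the one URL you specify.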
-
Hi FinalFrontier,
To fix this, you'll just need to choose which version of the domain you'd like to use and then implement a 301 redirect from the domain you don't want displayed to the preferred domain.
My personal choice is the "naked domain" (no "www"). Technically speaking, www.mysite.com is a subdomain of mysite.com and you'll notice that almost every major brand advertises their site without the "www".
When's the last time you saw an Apple commercial trying to convince you to go to www.apple.com? Seen www.eharmony.com anywhere lately?
The choice, however, is up to you... the key thing is to make the decision and then, whenever you link to your site from anywhere else, stick with that one version.
To implement the 301 redirect, the most common method is to edit the .htaccess file in the root directory of your site. Many hosting control panels (like cPanel) also have this functionality built in, so it can be activated simply by choosing the appropriate option in your server's configuration.
For www to non-www, simply add this to your .htaccess file (replace mysite.com with your own domain):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mysite\.com [NC]
RewriteRule ^(.*)$ http://mysite.com/$1 [L,R=301]
For the opposite (non-www to www) add this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
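If you'd prefer not to hard-code the domain name, a domain-agnostic variant of the non-www to www rule looks roughly like this (just a sketch, assuming Apache with mod_rewrite enabled - test it on your own setup before relying on it):
RewriteEngine On
# Match any host that doesn't already start with "www."
RewriteCond %{HTTP_HOST} !^www\. [NC]
# Redirect to the same host with "www." prepended, keeping the requested path
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [L,R=301]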
Hope this helps!
Anthony
Related Questions
-
My Website's Home Page is Missing on Google SERP
Hi All, I have a WordPress website which has about 10-12 pages in total. When I search for the brand name on Google, the home page URL isn't appearing on the result pages, while the rest of the pages are. There are no issues with the canonicalization or meta titles/descriptions as such. What could possibly be the reason behind this aberration? Looking forward to your advice! Cheers
Technical SEO | ugorayan
-
I need help with a redirect chain to another page and a 301; I don't understand how to fix it
Redirect Chain
What it is: Your page is redirecting to a page that is redirecting to a page that is redirecting to a page... and so on. Learn more about redirection best practices.
Why it's an issue: Every redirect hop loses link equity and offers a poor user experience, which will negatively impact your rankings.
How to fix it: Chiaryn says: “Redirect chains are often caused when multiple redirect rules pile up, such as redirecting a 'www' URL to a non-www URL or a non-secure page to a secure/https page. Look for any recurring chains that could be rewritten as a single rule. Be particularly careful with 301/302 chains in any combination, as the 302 in the mix could disrupt the ability of the 301 to pass link equity.”
This is not helping me. I don't understand the 301 - do I use the www.jasperartisanjewelry.com or the /jasperartisanjewelry.com? I'm confused.
Technical SEO | geanmitch
-
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning,
So I checked our site's index status on WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries:
- Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 pages of the 3,511 indexed?
- As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted?
- Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help?
- I think I know the answer to this, but is there any way to ascertain which pages are being blocked?
Thanks in advance! Lewis
Technical SEO | PeaSoupDigital
-
Home page URL
Hi, I work on this site: http://www.towerhousetraining.co.uk/about-us. This is the home page URL. Should this be 301'd to http://www.towerhousetraining.co.uk? I have created a sitemap, which I submitted to Google Webmaster Tools, and which includes these URLs: /about-us, /training-we-offer and /contact-us. There are a total of 3 pages on the website. Webmaster Tools has only indexed 2 of the 3 pages. I think this is something to do with the /about-us URL, as when I do a site: search, these pages appear: www.towerhousetraining.co.uk/, /training-we-offer and /contact-us. I am not sure why Google has indexed the home page as www.towerhousetraining.co.uk/ and not /about-us? Is it generally a bad idea not to have your home page at your root domain? I added a to the homepage, but am wondering if this was the right thing to do? Any help would be appreciated.
Technical SEO | CWseo
-
Paging pages and SEO meta tag questions
Hi, I'm using paging (pagination) on my website. There are lots of products, so the paging runs to about 1,000 pages in total. What title tag should I add to each paginated page, or is there a good way to tell search engines that all of these pages belong to the same series?
Technical SEO | constructionhelpline
-
Why is my office page not being indexed?
Good morning from a partly cloudy, 24°C Wetherby, UK 🙂
This page is not being indexed by Google:
http://www.sandersonweatherall.co.uk/office-to-let-leeds/
1st question: I've checked the robots.txt file - no problems - and I'm in the midst of updating the XML sitemap (it had the old one in place). The page only has one link pointing to it, from http://www.sandersonweatherall.co.uk/Site-Map/. So is the reason it's not being indexed just a simple case of a lack of SEO juice from inbound links, meaning the remedy lies in routing more inbound links to the offending page?
2nd question: Is the quickest way to diagnose whether a web address is not being indexed to cut and paste the URL into the Google search box, and if it doesn't return the page there's a problem?
Thanks in advance, David
Technical SEO | Nightwing
-
Getting home page content at top of what robots see
When I click on the text-only cache of nlpca(dot)com on the home page http://webcache.googleusercontent.com/search?q=cache:UIJER7OJFzYJ:www.nlpca.com/&hl=en&gl=us&strip=1 our H1 and body content are at the very bottom. How do we get the h1 and content at the top of what the robots see? Thanks!
Technical SEO | BobGW
-
How do I fix these duplicate URLs?
Hi guys, I ran a report on my site and it shows some duplicate titles (example below). Do I need to add something to the .htaccess file or another file to fix this? I understand that the search engines should only see one URL for each page. Two pages have the "Bikes for sale | used bikes | second hand bicycles" title:
paulswebsite.com/bikes/
paulswebsite.com/bikes/index.asp
Thanks
Technical SEO | paulmund