How do I fix twin home pages?
-
A search engine analysis indicates that my site has twin home pages (www.mysite.com and http://mysite.com).
The error message I'm getting is: "Your website resides at both www.mysite.com and mysite.com."
My uploaded index page is a .htm page (not .html). I don't know if that matters.
Can someone explain how this happened and what I can do to fix it?
Thanks!
-
Hi FinalFrontier,
I agree with setting up a 301 redirect to a single version. I also recommend doing the following:
- Set up canonical URLs to your desired version
- Ensure that your XML sitemaps use your desired version
- Add both www and non-www to Google Webmaster Tools and select one as the URL you'd like displayed in search results
Best of luck!
Chris
-
If you look at the redirect code the webhost provided in their instructions, I noticed there is not a [NC] at the end of the RewriteCond line. I'm not sure if that [NC] is necessary or not.
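If it is necessary, I assume the condition line would just pick up the flag at the end. A rough sketch, using mysite.com as the same placeholder domain:
# [NC] makes the hostname match case-insensitive
RewriteCond %{HTTP_HOST} ^mysite.com [NC]
RewriteRule (.*) http://www.mysite.com/$1 [R=301,L]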
Other than that and the possible time-lag you speak of, I'm at a loss.
-
It could just be a time-lag in our data (and that wouldn't shock me), but run a header checker and make sure the 301 is working properly. For example:
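Fetching http://mysite.com/ should come back with something roughly like this (the exact output varies by tool, but the status code and Location header are what matter):
HTTP/1.1 301 Moved Permanently
Location: http://www.mysite.com/
If both the www and non-www versions return a 200 instead, the redirect isn't actually taking effect yet.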
-
Well, this isn't making any sense.
I made the following change to my .htaccess file, following the instructions given by my web host:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com
RewriteRule (.*) http://www.mysite.com/$1 [R=301,L]
Then I ran another SEOmoz root domain crawl a couple of hours later, and it still reported the same errors on my home page (duplicate home page content and titles).
I just checked my .htaccess file again and it did save those 301 redirect changes. So why am I still getting duplicate page errors? thx.
-
Yeah, it sounds like you're not currently having major issues. I think it's good to prevent these issues (and duplicates are a real concern), but I strongly suspect you can ease into this one.
-
Thanks for your post.
Google is indexing all my www pages (including www.mysite.com), but (I guess this is good news?) no documents show up in Google for:
site:mysite.com -inurl:www
-
Since this issue can occur site-wide, I do tend to agree with Anthony that 301-redirects are a better solution for this particular problem (although canonical tags will work, if that's your only feasible option). It is important, as implied in the comments, to make sure that your internal links are consistent and you aren't using both versions within your site (although, with "www" vs. non-www, that's pretty rare).
Practically, it depends a lot on the size of your site, whether you have links to both versions, and whether Google has indexed both versions. This is a problem in theory, but it may not currently be a problem on your site. You can check the indexed pages of the root domain and the www subdomain separately in Google with these commands:
site:mysite.com inurl:www
site:mysite.com -inurl:www
(the first pulls up anything with "www", and the second only pages without it).
If you're seeing both in play, then sorting out how to do the 301-redirects is a good bet. If you're not, then it's still a solid preventive measure, but you don't need to panic.
-
It can have a pretty major impact on search rankings. Basically what's happening is you have two identical pages for every intended page on your site. So it creates duplicate content issues.
So for example...
Someone finds something on your site that they like at www.yoursite.com/example/ and links to it from their site or shares it on Twitter, which increases the ranking power for that page.
Another person finds the same content at yoursite.com/example/ and links to it as well.
Instead of consolidating all the benefits of those links onto a single page, you're splitting them across two URLs, which basically cuts your ranking potential in half.
-
How big of an issue is this for search engines? I'm indexed in Bing, Google, Yahoo.
I'm curious as to how big (or small) an impact this really has on a website.
thx.
-
Hi Final Frontier,
Most hosting providers will likely add this to your .htaccess file for you if you contact technical support. I know HostGator will happily provide that kind of help. If not, I'd be glad to add the lines if you'll download the file and email it to me.
-
Thanks, but I'm more confused now than ever, and I don't know how to change a .htaccess file, so I don't want to turn this into a DIY project and screw things up even more. I get the gist of what the problem is.
All my internal pages link back to www.mysite.com and to www.mysite.com/pages.htm throughout the site.
However, I noticed that an img src for a Facebook page (an external link on my site) mistakenly points to http://mysite.com/facebook (no www). So I'll at least fix that to include www so there's consistency. Not sure if that's related to the problem; there are no other pages I've seen that link to http://mysite.com instead of www.mysite.com.
I've learned a lot here, but this is one technical thing I don't want to do myself and make things worse.
-
From: http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
There is usually a better solution
The canonical tag is not a replacement for a solid site architecture that doesn’t create duplicate content in the first place. There is almost always a superior solution to the canonical tag from a pure SEO best practice perspective.
Let's go through some of the URL examples I provided above; this time we'll talk about how to fix them without the canonical tag.
Example 1: http://www.example.com/quality-wrenches.htm
This is a duplicate version because our example website resolves with both the www version and the non-www version. If the canonical tag was used to pull the www version out of the index (keeping the non-www version as the canonical one) both versions would still resolve in the browser. With both versions still resolving, both versions can still continue to generate links.
A canonical tag, as with a 301 redirect, does not pass all of the link value from one page to another. It passes most of it, but not all. We estimate that the link value loss with either of these solutions is 1-10%. In this way, a 301 redirect and a canonical tag are the same.
I'd recommend a 301 redirect instead of a canonical tag.
Why, you ask? A 301 redirect takes the link value loss hit once. Once a 301 is in place, a user never lands on the duplicate URL version. They are redirected to the canonical version. If they decide to link to the page, they are going to provide that link to the canonical version. No link love lost. Compare that to the canonical tag solution which keeps both URLs resolving and perpetuates the link value loss.
From Rand's Article: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
- Whereas a 301 redirect re-points all traffic (bots and human visitors), the Canonical URL tag is just for engines, meaning you can still separately track visitors to the unique URL versions.
- A 301 is a much stronger signal that multiple pages have a single, canonical source. While the engines are certainly planning to support this new tag and trust the intent of site owners, there will be limitations. Content analysis and other algorithmic metrics will be applied to ensure that a site owner hasn't mistakenly or manipulatively applied the tag, and we certainly expect to see mistaken use of the tag, resulting in the engines maintaining those separate URLs in their indices (meaning site owners would experience the same problems noted below).
- 301s carry cross-domain functionality, meaning you can redirect a page at domain1.com to domain2.com and carry over those search engine metrics. This is NOT THE CASE with the Canonical URL tag, which operates exclusively on a single root domain (it will carry over across subfolders and subdomains).
Rel canonical is a great tool, but I have to disagree here. www.mysite.com is a subdomain of mysite.com. Adding rel=canonical tags to every page on the site would only send a signal to search engines about the preferred version, whereas a single 301 redirect rule at the domain level will send all traffic, robots, and link juice to the preferred domain on a permanent basis.
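To make that concrete, the rule is just a few lines in .htaccess, essentially what Anthony shows in his answer below. A minimal sketch, assuming the www version is the one being kept (mysite.com is a placeholder):
RewriteEngine On
# Permanently (301) redirect any request for the bare domain to the www version
RewriteCond %{HTTP_HOST} ^mysite.com [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
Once that single rule is in place, every request for mysite.com (and every link pointing at it) gets consolidated onto www.mysite.com automatically.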
-
Hi!
An easier way to fix the problem is with canonical tags (if you're not familiar with .htaccess or server-side scripts).
You can find Rand Fishkin's amazing article about it here:
http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Good luck!
-
Hi FinalFrontier,
To fix this, you'll just need to choose which version of the domain you'd like to use and then implement a 301 redirect from the domain you don't want displayed to the preferred domain.
My personal choice is the "naked domain" (no "www"). Technically speaking, www.mysite.com is a subdomain of mysite.com and you'll notice that almost every major brand advertises their site without the "www".
When's the last time you saw an Apple commercial trying to convince you to go to www.apple.com? Seen www.eharmony.com anywhere lately?
The choice, however, is up to you... the key thing is to make a decision and, when you link to your site from another location, stick with one version or the other.
To implement the 301 redirect, the most common method is to edit the .htaccess file in the root directory of your site. Many hosting control panels (like cPanel) also have this functionality built in, so it can often be activated simply by choosing the appropriate option in your server's configuration.
For www to non-www, simply add this to your .htaccess file (replace mysite.com with your own domain):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.mysite.com [NC]
RewriteRule ^(.*)$ http://mysite.com/$1 [L,R=301]
For the opposite (non-www to www) add this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
Hope this helps!
Anthony