Rel=canonical - Identical .com and .us Version of Site
-
We have a .us and a .com version of our site, and we direct customers to one or the other based on their location relative to our servers. This is not changing for the foreseeable future.
We had restricted Google from crawling the .us version of the site and all was fine until I started to see the https version of the .us appearing in the SERPs for certain keywords we keep an eye on.
The .com still exists and is sometimes directly above or under the .us. It is occasionally a different page on the site with similar content to the query, or sometimes it just returns the exact same page for both the .com and the .us results. This has me worried about duplicate content issues.
The question(s): Should I simply get the https version of the .us blocked from crawling/indexing and leave it at that, or should I work to set up rel=canonical across the entire .us pointing to the .com (making the .com the canonical version)? Are there any major pitfalls I should be aware of with a sitewide rel=canonical (both the .us and .com are identical, and these newly crawled/indexed .us pages sometimes rank pretty nicely)?
Side question: Have any ecommerce guys noticed that Googlebot has started to crawl/index and serve up https versions of your URLs in the SERPs, even if the only way to reach those versions is to append https:// to the URL yourself or to go through a sign-in or checkout page? Is Google, in the wake of its https-everywhere push and potentially making https a ranking signal, forcing a check for the https version of any given URL and choosing to index that?
I just can't figure out how it is even finding those URLs to index if it isn't seeing http://www.example.com and then adding the https:// itself and checking...
Help/insight on either point would be appreciated.
-
Rel=canonical is great for helping search engines serve the correct language or regional URL to searchers, but I'm not sure how it would work for two sites both purposed for the US (.us and .com).
What's the thought behind having two sites - is the .us site intended for Google US searches and .com the default for anything outside of the US? Are there language variations? What are the different "locations" you're referring to?
-
I would set sitewide canonicals from both versions to the .com site. I wouldn't block any pages since people might still stumble and link back to the .us version.
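A sitewide canonical only helps if each .us page points at its own .com twin rather than the homepage. A minimal sketch of that mapping, assuming hypothetical hostnames (www.example.us / www.example.com stand in for the real domains):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, canonical_host: str = "www.example.com") -> str:
    """Map any URL on the mirror (.us) domain to its canonical .com equivalent.

    Path and query are preserved so each page canonicalizes to its own
    .com counterpart, not the homepage.
    """
    parts = urlsplit(url)
    return urlunsplit(("https", canonical_host, parts.path, parts.query, ""))

def canonical_tag(url: str) -> str:
    """Render the <link> element to place in each page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}" />'
```

For example, `canonical_tag("http://www.example.us/products/widget?id=7")` yields a link element pointing at `https://www.example.com/products/widget?id=7`, which is the per-page behavior a sitewide canonical needs.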
I'm not positive that Google auto-checks https versions of websites without any direction, but it's plausible. A common way Google finds https URLs is by crawling a page like "My Account" or "My Cart," which is served over https; every relative URL on that page then resolves to https, so Google re-crawls all of those. Maybe that's what is happening on your end?
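The mechanism described above can be sketched with Python's standard URL resolution (the cart URL is a hypothetical example): a relative link resolved against an https page comes out as an https URL, which is how an entire http site can suddenly look reachable over https to a crawler.

```python
from urllib.parse import urljoin

# A crawler that lands on the https "My Cart" page resolves every
# relative link on that page against the page's own URL, so each
# resolved link inherits the https scheme.
cart_page = "https://www.example.com/cart"
relative_link = "/products/widget"

resolved = urljoin(cart_page, relative_link)
print(resolved)  # https://www.example.com/products/widget
```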
Related Questions
-
Ranking for combined version of keyword but not separated version
Hi All, My site is currently ranking on page 1 for the term "golfholidays" but is ranking at the bottom of page 3 for the term I am targeting and have optimised for, which is "golf holidays". Does anyone have any experience with the combined keyword ranking above the separated version? Nowhere on my page does it mention the term "golfholidays", and backlinks to my site mostly use the anchor "golf holidays". Thanks!
Technical SEO | | Andy94120 -
New SEO manager needs help! Currently only about 15% of our live sitemap (a ~4 million URL e-commerce site) is actually indexed in Google. What are sitemap best practices for big sites with a lot of changing content?
In Google Search Console: 4,218,017 URLs submitted, 402,035 URLs indexed. What is the best way to troubleshoot? What is the best guidance for sitemap indexation of large sites with a lot of changing content?
Technical SEO | | Hamish_TM1 -
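For a multi-million-URL site like the one above, the usual approach is to split the URL list into files of at most 50,000 URLs (the sitemaps.org protocol limit) and reference them from a sitemap index. A rough sketch, with the base URL and file naming as assumptions:

```python
MAX_URLS_PER_SITEMAP = 50_000  # per-file limit in the sitemaps.org protocol

def chunk_urls(urls, size=MAX_URLS_PER_SITEMAP):
    """Split a full URL list into sitemap-file-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(base_url, n_chunks):
    """Render a sitemap index that references each chunk file."""
    entries = "\n".join(
        f"  <sitemap><loc>{base_url}/sitemap-{i}.xml</loc></sitemap>"
        for i in range(n_chunks)
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</sitemapindex>")
```

Grouping chunks by section (products, categories, etc.) rather than arbitrarily also lets you compare submitted vs. indexed counts per section in Search Console, which narrows down where the indexation gap is.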
Is it problematic for Google when the site of a subdomain is on a different host than the site of the primary domain?
The Website on the subdomain runs on a different server (host) than the site on the main domain.
Technical SEO | | Christian_Campusjaeger0 -
How do we keep Google from treating us as if we are a recipe site rather than a product website?
We sell food products that, of course, can be used in recipes. As a convenience to our customers we have made a large database of recipes available. We have far more recipes than products. My concern is that Google may start viewing us as a recipe website rather than a food product website. My initial thought was to subdomain the recipes (recipe.domain.com) but that seems silly given that you aren't really leaving our website and the layout of the website doesn't change with the subdomain. Currently our URL structure is... domain.com/products/product-name.html domain.com/recipes/recipe-name.html We do rank well for our products in general searches but I want to be sure that our recipe setup isn't detrimental.
Technical SEO | | bearpaw0 -
Is this an ideal rel=canonical situation?
Hey Moz community, Thanks for taking time to answer my question. I'm working directly with a hospital that has several locations across the country. They've copied the same content over to each of their websites. Could I point the search engines back to a single location (URL) using the rel=canonical tag? In addition, does the rel=canonical tag affect the search engine rankings of the URLs (about 13 of them) that carry it? If I'm on track, is there an ideal URL (location) to designate as having the original content? This is actually the first time I've ever needed to use rel=canonical (if applicable). Thanks so much. Cole
Technical SEO | | ColeLusby0 -
Will an identical site impact SERP results
I came across two identical sites for two different business owners in the same industry. I'm sure you've seen these. A web company offers individuals in the same profession a template site with the exact same content for each site. All that is different is the domain. i.e. mycompany.com/news/topicsname will have the exact same content, images, tags, etc. as mycompany2.com/news/topicsname. I would assume having the duplicate content, especially if two site owners are in the same town, will ultimately hurt the rankings of at least one site. Is this correct? Thank you for your help.
Technical SEO | | STF0 -
Www. version of my site shows nothing in Open Site Explorer
When I first set up my site the domain was learnbonds.com. I moved hosts a couple of months ago and, as part of the process, I asked them to make the site resolve as www.learnbonds.com, which they did. Now, however, when I go to www.learnbonds.com in Open Site Explorer it says there is no data. When I enter learnbonds.com into Open Site Explorer it gives me data but says that the site has been redirected to the www version, which shows no data. Also, in Google Webmaster Tools, when I try to set the preferred domain as the www version it gives me the following message: "Part of the process of setting a preferred domain is to verify that you own http://www.learnbonds.com/. Please verify http://www.learnbonds.com/." I am concerned that this is hurting my SEO and would appreciate any advice you can give. Thanks Dave
Technical SEO | | fxtrader19790
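The usual fix for a split like the one above is a sitewide 301 redirect onto the preferred www host, so every non-www URL resolves to its www twin. A minimal sketch of the redirect-target logic (the hostname here mirrors the question; the function name is illustrative):

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.learnbonds.com"  # the preferred (www) domain

def redirect_target(url: str):
    """Return the www URL to 301-redirect to, or None if already on it."""
    parts = urlsplit(url)
    if parts.netloc == PREFERRED_HOST:
        return None  # already canonical; serve the page normally
    return urlunsplit((parts.scheme, PREFERRED_HOST,
                       parts.path, parts.query, parts.fragment))
```

With that in place, both link-metrics tools and Google see one consolidated host, and Search Console's preferred-domain setting becomes possible once the www property is verified.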