Rel=canonical - Identical .com and .us Versions of Site
-
We have a .us and a .com version of our site, and we direct customers to one or the other based on their location relative to our servers. This is not changing for the foreseeable future.
We had restricted Google from crawling the .us version of the site and all was fine until I started to see the https version of the .us appearing in the SERPs for certain keywords we keep an eye on.
The .com result still appears as well, sometimes directly above or below the .us. Occasionally it's a different page of the site with content similar to the query, and sometimes Google returns the exact same page for both the .com and the .us. This has me worried about duplicate content issues.
The question(s): Should I just stop the https version of the .us from being crawled/indexed and leave it at that, or should I set up rel=canonical across the entire .us pointing to the .com (making the .com the canonical version)? Are there any major pitfalls I should be aware of with a rel=canonical across the entire domain (both the .us and .com are identical, and these newly crawled/indexed .us pages sometimes rank pretty nicely)? Or am I better off just making sure the .us is no longer crawled and indexed and leaving it at that?
Side question: Have any ecommerce guys noticed that Googlebot has started to crawl/index and serve up the https versions of your URLs in the SERPs, even if the only way to reach those versions of the pages is to append https:// to the URL yourself or to go through a sign-in or checkout page? Is Google, in the wake of its https-everywhere push and potentially making https a ranking signal, proactively checking for the https version of any given URL and choosing to index that?
I just can't figure out how it is even finding those URLs to index if it isn't seeing http://www.example.com and then adding the https:// itself and checking...
Help/insight on either point would be appreciated.
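For reference, this is the kind of quick check I'd run to see which protocol/host combinations respond and where they end up after redirects (a rough Python sketch; the example.com / example.us hostnames are placeholders for our real domains):

```python
# Rough diagnostic: which protocol/host combinations of the site respond,
# and what final URL do they land on after redirects are followed?
# The hostnames below are placeholders, not our real domains.
import urllib.request

VARIANTS = [
    "http://www.example.com/",
    "https://www.example.com/",
    "http://www.example.us/",
    "https://www.example.us/",
]

for url in VARIANTS:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            # geturl() reports the final URL after any redirects.
            print(f"{url} -> HTTP {resp.status} at {resp.geturl()}")
    except OSError as exc:
        print(f"{url} -> failed: {exc}")
```

If all four variants respond with 200 and no redirects, that's four crawlable copies of every page, which is exactly the situation I'm worried about.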
-
Hreflang annotations are what help search engines serve the correct language or regional URL to searchers, but I'm not sure how that (or rel=canonical) would apply to two sites that are both aimed at the US (.us and .com).
What's the thought behind having two sites - is the .us site intended for Google US searches and .com the default for anything outside of the US? Are there language variations? What are the different "locations" you're referring to?
-
I would set sitewide canonical tags pointing to the .com: each .us page pointing to its .com equivalent, and the .com pages self-referencing. I wouldn't block any pages, since people may still stumble onto and link to the .us version, and Google can only see those canonicals (and consolidate the link value) if it's allowed to crawl the pages.
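If it helps, here's a rough sketch of how you could spot-check the result once the tags are in place (placeholder hostnames and paths; assumes the requests and beautifulsoup4 packages). Each sampled .us page should declare a canonical pointing at its .com equivalent:

```python
# Spot-check that pages on the .us host declare a canonical pointing at the
# matching .com URL. Hostnames and sample paths are placeholders.
import requests
from bs4 import BeautifulSoup

US_HOST = "https://www.example.us"
COM_HOST = "https://www.example.com"
SAMPLE_PATHS = ["/", "/category/widgets", "/product/example-widget"]

for path in SAMPLE_PATHS:
    resp = requests.get(US_HOST + path, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    expected = COM_HOST + path
    flag = "OK" if canonical == expected else "CHECK"
    print(f"{flag} {US_HOST + path}: canonical={canonical!r}, expected={expected!r}")
```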
I'm not positive that Google auto-checks https versions of websites without any direction, but it's plausible. A common way Google finds https URLs is by hitting the "My Account" or "My Cart" page, which is served over https; any relative URLs on that page then resolve to https, so Google re-crawls all of those pages over https. Maybe that's what is happening on your end?
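To illustrate that relative-link mechanism, here's a tiny standard-library sketch (example.com stands in for the real domain): once the crawler is on an https page such as the cart, every relative link on it resolves against that https base.

```python
# Once a crawler lands on an https page (e.g. the cart), relative links on
# that page resolve against the https base URL, so the crawler discovers
# https versions of ordinary catalogue pages. example.com is a placeholder.
from urllib.parse import urljoin

https_base = "https://www.example.com/checkout/cart"
relative_links = ["/", "/category/widgets", "../product/example-widget"]

for link in relative_links:
    print(urljoin(https_base, link))
# https://www.example.com/
# https://www.example.com/category/widgets
# https://www.example.com/product/example-widget
```

Protocol-relative links (href="//www.example.com/...") behave the same way on an https page.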