Rel=canonical - Identical .com and .us Versions of Site
-
We have a .us and a .com version of our site, and we direct customers to one or the other based on their location relative to our servers. This is not changing for the foreseeable future.
We had blocked Google from crawling the .us version of the site, and all was fine until I started seeing the https version of the .us appearing in the SERPs for certain keywords we keep an eye on.
The .com result still appears, sometimes directly above or below the .us result. Occasionally it is a different page on the site with content similar to the query, and sometimes Google returns the exact same page for both the .com and the .us. This has me worried about duplicate content issues.
The question(s): Should I simply stop the https version of the .us from being crawled and indexed and leave it at that, or should I set up rel=canonical across the entire .us pointing to the .com (making the .com the canonical version)? Are there any major pitfalls I should be aware of with a sitewide rel=canonical? Both the .us and the .com are identical, and these newly crawled/indexed .us pages sometimes rank quite nicely.
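For reference, the two options being weighed would look something like this in the `<head>` of a .us page (example.us / example.com are placeholder domains, not our real URLs):

```html
<!-- Option 1: keep the .us page out of the index entirely.
     Note: Google has to be able to CRAWL the page to see this tag,
     so the .us must not also be blocked in robots.txt. -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: consolidate ranking signals to the .com equivalent. -->
<link rel="canonical" href="http://www.example.com/some-page/">
```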
Side question: Have any ecommerce folks noticed that Googlebot has started to crawl, index, and serve the https versions of your URLs in the SERPs, even when the only way to reach those versions is to prepend https:// to the URL yourself or to go through a sign-in or checkout page? In the wake of its "HTTPS everywhere" push, and with HTTPS potentially becoming a ranking signal, is Google now checking for an https version of any given URL and choosing to index it?
I just can't figure out how it is even finding those URLs to index unless it is seeing http://www.example.com and then swapping in https:// itself to check...
Help/insight on either point would be appreciated.
-
Rel=canonical (paired with hreflang annotations) is great for helping search engines serve the correct language or regional URL to searchers, but I'm not sure how it would apply to two sites that are both aimed at the US (.us and .com).
What's the thinking behind having two sites? Is the .us site intended for Google US searches, with the .com as the default for anything outside the US? Are there language variations? What are the different "locations" you're referring to?
-
I would set sitewide canonicals from both versions to the .com site. I wouldn't block any pages, since people might still stumble onto the .us version and link back to it.
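Concretely, "sitewide canonicals from both versions" means each .us page points at its .com twin, while each .com page points at itself. A sketch with placeholder URLs:

```html
<!-- On http://www.example.us/widgets/ -->
<link rel="canonical" href="http://www.example.com/widgets/">

<!-- On http://www.example.com/widgets/ (self-referential canonical) -->
<link rel="canonical" href="http://www.example.com/widgets/">
```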
I'm not positive that Google auto-checks the https versions of websites without any prompting, but it's plausible. I do know a common way Google finds https URLs: it crawls into an https page such as "My Account" or "My Cart", and from there every relative URL on the page resolves to https, so Googlebot re-crawls all of those under https. Maybe that's what is happening on your end?
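That discovery path is plain relative-URL resolution: once Googlebot lands on any https page, every relative link on it resolves to an https URL. A quick illustration with made-up URLs:

```html
<!-- Crawled page: https://www.example.com/cart (note the scheme) -->
<a href="/products/widget">Widget</a>
<!-- This relative href resolves to https://www.example.com/products/widget,
     handing Googlebot an https URL it might never have seen otherwise. -->
```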