HTTP & HTTPS canonicalization issues
-
Howdyho
I'm doing SEO for a daily deals site that mostly runs on HTTPS (only the home page is on HTTP), and I'm wondering what to do about canonicalization. IMO it would be easiest to run all pages on HTTPS.
But the scarce resources I can find aren't very clear. For instance, this YouMoz blog post claims that HTTPS is only for humans, not for bots! That doesn't really apply anymore, right?
-
Cool, thanks for backing me up on this. I figured the bot redirect described there would be a bit over the top, and it's great to know that having an HTTP home page on an otherwise HTTPS domain is a non-issue. Much appreciated!
-
I wouldn't redirect just bots to HTTP instead of HTTPS. That is a lot of work and honestly not worth the effort. I have worked on plenty of sites that require HTTPS across the entire site, and Google crawls them with no issue.
As for your website, I wouldn't worry too much about HTTP vs. HTTPS for the canonical. However, if you can, I would keep it consistent at the page level. It would be fine for the canonical of the about page to be https://www.domain.com/about while the canonical of the home page is http://www.domain.com. I've had to do this on a number of ecommerce sites where the product pages are HTTPS but the rest of the site is HTTP.
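To illustrate what "consistent at the page level" means in markup (using domain.com as a stand-in for your own site): each page's rel=canonical simply points at the protocol that page is actually served on.

```html
<!-- On the HTTP home page -->
<link rel="canonical" href="http://www.domain.com/" />

<!-- On an HTTPS page, e.g. the about page -->
<link rel="canonical" href="https://www.domain.com/about" />
```

The key thing is to avoid mixing them, e.g. an HTTPS page whose canonical points at its HTTP twin for no reason.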
I hope that helps.
Related Questions
-
Discovered - currently not indexed issue
Hello all, We have a sitemap with URLs that have mostly user-generated content: a Profile Overview section where users write about their services and some other things. Out of 46K URLs, only 14K are valid according to Search Console and 32K are excluded. Of those 32K, 28K are "Discovered - currently not indexed". We can't really update these pages, as they have user-generated content, but we do want to leverage all of them to help our SEO. So the question is: how do we make all of these pages indexable? If anyone can help in this regard, please let me know. Thanks!
Technical SEO | akashkandari
-
Hello, I have a Magento Store View issue!
Hi guys, hope you can help. We have recently added a couple of "store views" to our Magento website so we can break down orders from channels that feed through M2E Pro. What is great is that it gives us a little breakdown of Amazon, eBay, and store sales separately, depending on which we select. However, we have run into an issue with duplicate content, mainly on pages such as the checkout:
/checkout/?___store=default&___from_store=m2e_ebay
/checkout/?___store=default&___from_store=m2e_amazon
I am hoping we can keep how we have broken down the store views for quick day-by-day reporting, but how do I get around this, if I can? Surely redirecting on the checkout itself would cause issues. Any advice greatly appreciated. Kelly
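One possible approach (a sketch, not Magento-specific advice): since the duplicates are parameterised variants of the checkout, and checkout pages have no search value anyway, you could keep the store views for reporting and simply block the checkout path from crawling in robots.txt.

```
User-agent: *
# Checkout pages (including the ?___store=... variants) have no SEO value,
# so keeping crawlers out avoids the duplicate-content issue entirely.
Disallow: /checkout/
```

This avoids any redirect on the checkout flow; the store-view parameters keep working for your M2E Pro reporting.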
Technical SEO | KellyLloyd666
-
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that has many different variations based on a vehicle's make, model, and year. Currently, we sell this product on one page, "www.cargoliner.com/products.php?did=10001", and we show a modal to sort through each make, model, and year. This is important because we have different prices/configurations for each combination. For example, for the Jeep Wrangler and Jeep Cherokee, we might have different products:

Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350
Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350
Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400

Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year; we have the same content and images for each. When the customer selects their make, model, and year, we just search and replace the text. For example, when a customer selects 2015 Jeep Wrangler from the modal, the page keeps the same URL (www.cargoliner.com/products.php?did=10001) but the product title will say "2015 Jeep Wrangler".

Here's my problem: we want all of these individual products to have their own unique URLs (cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers and, ideally, start creating unique content for them. Our only problem is that there will be hundreds of them, and they don't have unique content other than the swapped-in product title and variants. Also, we don't want our URL www.cargoliner.com/products.php?did=10001 to lose its link juice.

Here are my questions: My assumption is that I should keep my URL www.cargoliner.com/products.php?did=10001 and let visitors sort through the products on that page. Then I should go ahead and make individual URLs for each of these products (i.e. cargoliner.com/products/2015-jeep-wrangler) but add a "noindex, nofollow" to each page. Is this what I should do? How reliable is a "noindex, nofollow" on a webpage? Does Google still index it? Am I at risk of duplicate content penalties? Thanks!
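For reference, "noindex" and "nofollow" are separate robots directives, and there are two common ways to handle pages like these; a sketch of both, using the URLs from the question:

```html
<!-- Option A: keep the vehicle-specific page out of the index,
     but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option B: leave the page indexable but consolidate ranking signals
     to the main product URL via rel=canonical -->
<link rel="canonical" href="http://www.cargoliner.com/products.php?did=10001" />
```

Option B is generally the better fit when the pages are near-duplicates today but may get unique content later, since a canonical can simply be removed page by page as unique content is added.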
Technical SEO | kirbyfike
-
Duplicate content and canonicalization confusion
Hello, http://bit.ly/1b48Lmp and http://bit.ly/1BuJkUR have the same content, and each page's canonical refers to the page itself. Yet they both rank in search engines. Is it because they have been targeted to different geographical locations? Even so, the content is the same. Please help me clear up this confusion. Regards
Technical SEO | IM_Learner
-
Should we dump the https from a client site?
We inherited a site that has both HTTP and HTTPS versions. There's no e-commerce or data transfer, just HTML. Should we dump the HTTPS certificate? I think it might be causing issues with indexing and possibly duplicate content. The HTTPS site shows a certificate warning message, which is not good. The URL is www.charlottemechanical.com
Technical SEO | theideapeople
-
WordPress - How to stop both http:// and https:// pages being indexed?
I published a static page two days ago on a WordPress site, but noticed that Google has indexed both the http:// and https:// URLs. Usually only the http:// version gets indexed. Could anyone please explain why this may have happened and how I can fix it? Thanks!
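The usual fix is to pick one protocol and 301-redirect the other to it. On an Apache host this can go in the site's .htaccess; a sketch, assuming mod_rewrite is enabled and HTTPS is the version you want to keep:

```apache
RewriteEngine On
# Send any HTTP request to its HTTPS equivalent with a permanent redirect,
# so search engines consolidate both versions onto one URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Setting the WordPress "Site Address" to the same protocol keeps the generated canonical tags consistent with the redirect.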
Technical SEO | Clicksjim
-
Outgoing link issues
There is no link to the home page of my website, profitiok.com. For example, see the About Us page of Profitok.com. Two days ago I changed the title to "Commodity tips | Mcx TIPS | Mcx Daily Tips | Mcx Good tips Provider", but when I check, this title has not been updated. Can you give me the correct answer?
Technical SEO | Luckykhullar
-
MSNbot Issues
We found that msnbot is making lots of simultaneous requests to the same URL. Even though we have caching, the requests arrive at the same time, so caching does not help at the moment:

123.253.27.53 [11/Dec/2012:14:15:10 -0600] "GET //Fun-Stuff HTTP/1.1" 200 0 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
1.253.27.53 [11/Dec/2012:14:15:10 -0600] "GET //Type-of-Resource/Fun-Stuff HTTP/1.1" 200 0 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
1.253.27.53 [11/Dec/2012:14:15:10 -0600] "GET /Browse//Fun-Stuff HTTP/1.1" 200 6708 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"

We could certainly use a mutex so that requests for a URL wait for the cache to generate, but we are looking for a solution on the MSNbot side. We found the following suggestion (http://www.bing.com/community/site_blogs/b/webmaster/archive/2009/08/10/crawl-delay-and-the-bing-crawler-msnbot.aspx): Bing offers webmasters the ability to slow down the crawl rate to accommodate web server load issues, via robots.txt:

User-Agent: *
Crawl-Delay: 10

We need to know whether it's safe to apply that, or any other advice. PS: MSNbot gets so bad at times that it could trigger a DOS attack alone! (http://www.semwisdom.com/blog/msnbot-stupid-plain-evil#axzz2EqmJM3er)
Technical SEO | tpt.com