How to Resolve Duplication of HTTPS & HTTP URLs?
-
Right now I am working on an eCommerce website (Lamps Lighting and More). The site is reachable at both of the following URLs:
HTTP Version:
http://www.lampslightingandmore.com/
HTTPS Version:
https://www.lampslightingandmore.com/
I have checked one of my competitors, who has implemented a canonical on both pages; please view the source code of both URLs to see it.
Then I checked how the SEOmoz website handles the same thing. Why SEOmoz? They provide the best SEO information, so they are probably following best practice for dealing with HTTPS & HTTP. LOL
I tried to load the following URL and it redirects to the home page:
https://www.seomoz.org redirects to http://www.seomoz.org
But the following URL does not redirect anywhere, and no canonical is set on it either:
https://www.seomoz.org/users/settings
I can see the following directives in http://www.seomoz.org/robots.txt:
User-agent: *
Disallow: /api/user?*
So I am quite confused about how to solve this issue. Which is best: a 301 redirect or a canonical tag? If there is a live example I can look at, that would help and make me more confident.
-
I have set up robots.txt files for both the HTTP and HTTPS versions. You can find both files above your response. Thanks for your answer.
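For anyone else setting this up: on Apache, a separate robots.txt can be served to the HTTPS host with a small rewrite along the lines of the sketch below (the robots_ssl.txt filename is only an example, not necessarily what I used).
```
# .htaccess sketch (assumes Apache with mod_rewrite).
# When a request arrives over HTTPS, answer /robots.txt with a different
# file, so the secure copy of the site can carry its own crawl rules.
RewriteEngine On
RewriteCond %{HTTPS} =on
RewriteRule ^robots\.txt$ /robots_ssl.txt [L]
```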
-
Our solution to this was to make sure every single page had a canonical pointing to the http:// version.
Secondly, https:// was only made available after logging in.
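For illustration, the canonical on each page (served on both the http:// and https:// copy) simply points at the http:// URL, something like this for the home page of the site in this thread:
```
<!-- Served on both the http:// and https:// copy of the page;
     the canonical always points at the http:// URL. -->
<link rel="canonical" href="http://www.lampslightingandmore.com/" />
```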
-
Yep
-
Now it looks fine... right?
-
You are right. I got quite confused after reading an article about duplication; I checked my website, found both HTTPS and HTTP pages, and started raising questions in that direction.
-
So, what about the canonical tag? I am still confused by it. What is the ultimate conclusion? I have already made it live on one website after getting the suggestion.
Is there any eCommerce experience that would help me understand this better? What is the best solution in my case? My goal is to remove duplication from the website and improve the crawl rate.
-
I believe you're mixing things up, honestly. 1st > choose a canonical version for your site (www or non-www). Sometimes absolute URLs can cause problems for the HTTPS version of a site. 2nd > decide whether you really want the HTTPS version indexed... If not, add noindex or block it via robots.txt. If yes, use the HTTP URL as the canonical on the HTTPS page.
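To make those two options concrete, the tags on the HTTPS copy of a page would look roughly like this (a sketch, reusing the home page URL from this thread):
```
<!-- Option A: keep the HTTPS copy out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option B: point the canonical at the HTTP URL so that version
     is treated as the primary one -->
<link rel="canonical" href="http://www.lampslightingandmore.com/" />
```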
-
I would use noindex for the HTTPS version of the site, or block it via robots.txt, if I didn't want it indexed.
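If you take the robots.txt route, the file served on the HTTPS host (and only there) would be roughly:
```
# robots.txt served only on the https:// host, not the http:// one.
# Blocks all crawling of the HTTPS copy of the site.
User-agent: *
Disallow: /
```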
-
I want to add a similar thought to this question.
http://www.lampslightingandmore.com/
https://www.lampslightingandmore.com/
I have made the canonical tag live after the discussion here. But I am confused about relative & absolute URLs.
I am using absolute URLs in the canonical tag, but the website's internal links are relative.
So, does that create any issue or reduce the benefit of the canonical tag?
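To illustrate what I mean, a page looks roughly like this (the table-lamps.html path is just an example):
```
<!-- The canonical tag uses an absolute URL... -->
<link rel="canonical" href="http://www.lampslightingandmore.com/table-lamps.html" />

<!-- ...while the internal links on the page are relative. -->
<a href="/table-lamps.html">Table Lamps</a>
```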
-
Yes, I don't want my HTTPS pages crawled, and I don't want the HTTPS and HTTP pages to create duplication.
-
My question is along the same lines. So why has WayFair set a canonical on its website?
-
But you don't want your HTTPS pages crawled if the same version is available over HTTP. This is mostly a technical issue, but crawling an HTTPS site is considerably more expensive for both the bot and the server.
-
How to Resolve Duplication of HTTPS & HTTP URLs?
Neither a redirect nor a canonical tag is necessary.
HTTP, HTTPS, FTP, etc. are different protocols used to access the information on your web server. The data itself exists only once, but it can be reached through any of these protocols. That is not a duplication of data and will not cause any SEO issues.
-
A 301 redirect doesn't exclude a canonical. If you just want to use one solution, use the 301. There was a YouMoz post about exactly this topic a while ago; have a look at it.
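If you go the 301 route on Apache, the rule looks roughly like the sketch below (the excluded /checkout/ and /account/ paths are assumptions and would need to match the store's real secure areas):
```
# .htaccess sketch (assumes Apache with mod_rewrite): 301 the https://
# duplicates back to their http:// versions, leaving genuinely secure
# areas on https://. The /checkout/ and /account/ paths are examples only.
RewriteEngine On
RewriteCond %{HTTPS} =on
RewriteCond %{REQUEST_URI} !^/(checkout|account)/ [NC]
RewriteRule ^(.*)$ http://www.lampslightingandmore.com/$1 [R=301,L]
```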