How to Resolve Duplication of HTTPS & HTTP URLs?
-
Right now, I am working on an eCommerce website [Lamps Lighting and More].
I can find both URLs on the website, as follows.
HTTP Version:
http://www.lampslightingandmore.com/
HTTPS Version:
https://www.lampslightingandmore.com/
I checked one of my competitors, who has implemented a canonical tag on both pages. Please view the source code of both URLs.
Then I checked the same thing on the SEOmoz website. Why not check SEOmoz? They provide the best SEO information, so they are probably using best practices to deal with HTTPS & HTTP. LOL
I tried to load the following URL, and it redirects to the home page:
https://www.seomoz.org is redirecting to http://www.seomoz.org
But the following URL is not redirecting anywhere, and no canonical tag is set there:
https://www.seomoz.org/users/settings
I found the following code at http://www.seomoz.org/robots.txt:
User-agent: *
Disallow: /api/user?
So, I am quite confused about how to solve this issue. Which one is best: a 301 redirect or a canonical tag? A live example to look at would be great and would make me more confident.
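For reference, a 301 of this kind is usually handled at the server level rather than in the page markup. A minimal sketch, assuming an Apache server with mod_rewrite enabled (the domain is taken from the question; adapt the rule to your own setup):

```apache
# Send every HTTPS request to its HTTP equivalent with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.lampslightingandmore.com/$1 [R=301,L]
```

On other servers (IIS, nginx) the equivalent rule looks different, but the idea is the same: one permanent redirect per URL, with only the protocol swapped.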
-
I have set up robots.txt files for both the HTTP and HTTPS versions. You can find both files above your response. Thanks for your answer.
-
Our solution to this was to make sure we had a canonical tag on each and every page pointing to the http:// version.
Secondly, https:// was only made available after logging in.
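As a sketch, that means the HTTP and HTTPS copies of a page both carry the same tag in their head, pointing at the HTTP version (the path below is hypothetical):

```html
<!-- Identical on the http://... and https://... copies of the page -->
<link rel="canonical" href="http://www.lampslightingandmore.com/table-lamps/" />
```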
-
Yep
-
Now it looks fine... right?
-
You are right. I got quite confused after reading an article about duplication; I checked my website, found both HTTPS and HTTP pages, and started asking questions in that direction.
-
So, what about the canonical tag? I am still confused by it. What is the ultimate conclusion? I have already made it live on one website after getting the suggestion.
Any eCommerce experience would help me understand more. What is the best solution in my case? My goal is to remove duplication from the website and improve the crawl rate.
-
Honestly, I believe you're mixing things up. 1st > choose a canonical version for your site (www. or not). Sometimes absolute URLs can cause problems for the HTTPS version of a site. 2nd > consider whether you really want to index the HTTPS version... If not, add noindex or block it via robots.txt. If yes, use the HTTP URL of the page as the canonical tag on the HTTPS page.
-
I would use noindex for the HTTPS version of the site, or block it via robots.txt, if I didn't want it to be indexed.
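One way to sketch the robots.txt approach: serve a different robots.txt on the HTTPS host that blocks everything (this assumes your server can return a protocol-specific robots.txt, for example via a rewrite rule):

```
# robots.txt served only at https://www.example.com/robots.txt
User-agent: *
Disallow: /
```

The robots.txt served over HTTP stays open as normal. The meta noindex alternative would instead go in the head of each page when served over HTTPS.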
-
I want to add a similar thought to this question.
http://www.lampslightingandmore.com/
https://www.lampslightingandmore.com/
I made the canonical tag live after the discussion here, but I am confused about relative vs. absolute URLs.
I am using absolute URLs in the canonical tag, but the website uses relative URLs elsewhere.
So, does that create any issue or reduce the benefit of the canonical tag?
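For what it's worth, an absolute URL in the canonical tag and relative URLs in the rest of the markup can live side by side; a quick hypothetical sketch:

```html
<link rel="canonical" href="http://www.lampslightingandmore.com/" />
<!-- Relative links elsewhere on the same page are fine -->
<a href="/table-lamps/">Table Lamps</a>
```

An absolute canonical is generally the safer choice here, since a relative one would resolve against whatever protocol the page was loaded over, defeating the point of pointing HTTPS pages at their HTTP versions.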
-
Yes, I don't want my HTTPS pages crawled, and I don't want the HTTPS and HTTP pages to create duplication.
-
My question is along the same lines. So why has Wayfair set a canonical tag on its website?
-
But you don't want your HTTPS pages crawled if the same version is available over HTTP. This is mostly a technical issue, but crawling an HTTPS site is considerably more expensive for both the bot and the server.
-
How to Resolve Duplication of HTTPS & HTTP URLs?
Neither a redirect nor a canonical tag is necessary.
HTTP, HTTPS, FTP, etc. are different protocols used to access information on your web server. The data itself exists only once, but it can be accessed via any of these protocols. It is not a duplication of data and will not cause any SEO issues.
-
A 301 redirect doesn't exclude a canonical tag. If you just want to use one solution, use the 301. There was a YouMoz post about exactly this topic a while ago; have a look at it.