If a site has HTTPS versions of every page, will the search engines view them as duplicate pages?
-
A client's site serves HTTPS versions of every page, so both the HTTP and HTTPS version of each page can be viewed.
Do the search engines treat this as duplicate content?
-
You can force a single protocol on the server or programmatically.
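Forcing a single protocol server-side usually comes down to issuing a 301 redirect whenever a request arrives on the non-preferred scheme. A minimal sketch of that decision logic in Python (the helper name, and HTTP as the preferred scheme, are illustrative assumptions, not something stated in the thread):

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_SCHEME = "http"  # assumed canonical protocol for this hypothetical site

def redirect_target(url):
    """Return the 301 redirect target if the URL is on the wrong scheme, else None."""
    parts = urlsplit(url)
    if parts.scheme == PREFERRED_SCHEME:
        return None  # already canonical; serve the page normally
    # Swap only the scheme, keeping host, path, query, and fragment intact.
    return urlunsplit((PREFERRED_SCHEME,) + tuple(parts)[1:])

# An HTTPS request for a normal page gets 301'd back to HTTP:
print(redirect_target("https://www.example.com/widgets?page=2"))
# -> http://www.example.com/widgets?page=2
# A request already on the preferred scheme is served as-is:
print(redirect_target("http://www.example.com/widgets?page=2"))
# -> None
```

The same rule is more commonly written as server configuration (e.g. a rewrite rule), but the logic is identical: one scheme is canonical, the other always redirects.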
-
301-redirecting the HTTPS pages is not the solution. Once you fix the page that is leaking HTTPS links to the rest of the site, Google will automatically weed the HTTPS pages out of its index. Doing 301s just puts a band-aid on the problem instead of fixing it at the source.
-
Yes, it can create duplicate content issues. You can end up with both non-www HTTPS and www HTTPS versions of a page, so just setting a preferred domain in GWMT will not fix this issue.
You need to force URLs linked from secure pages (such as a login page) back to HTTP, so that when the spider follows a link on an HTTPS page it lands on the HTTP version rather than an HTTPS copy.
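The point above is that one page can exist at four addresses (HTTP/HTTPS crossed with www/non-www), and all four need to collapse onto one canonical form. A small sketch of that normalization, assuming a hypothetical site whose canonical form is `http://www.example.com`:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, scheme="http", host="www.example.com"):
    """Collapse scheme and www/non-www variants onto one canonical form.

    The preferred scheme and host here are illustrative assumptions; this
    only makes sense for a site served from a single hostname.
    """
    parts = urlsplit(url)
    return urlunsplit((scheme, host, parts.path, parts.query, parts.fragment))

# All four variants of the same page normalize to one URL:
variants = [
    "http://example.com/page",
    "http://www.example.com/page",
    "https://example.com/page",
    "https://www.example.com/page",
]
assert len({canonical_url(u) for u in variants}) == 1
```

Whether this mapping is enforced with 301 redirects or declared with rel=canonical, the end state is the same: only one of the four variants should be indexable.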
-
My personal opinion is yes, you will run into duplicate page issues. Then again, one of my clients' sites has over 500 pages just like that and it ranks fine. If it were fixed, who knows what could happen, but they don't want to invest the time and money into it.
-
By the way, I appreciate the best practice, i.e. rel=canonical or 301s, which is what we are doing. What I'm debating is what happens if you don't do it, hence my question. My suggestion is that you get duplicate page issues.
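For reference, the rel=canonical approach mentioned above just means every protocol/host variant of a page declares the same preferred URL in its `<head>`. A small illustrative helper (the domain and defaults are made up for the example):

```python
def canonical_tag(path, scheme="http", host="www.example.com"):
    """Build the <link rel="canonical"> tag that every variant of a page
    (HTTP, HTTPS, www, non-www) should carry in its <head>.

    Scheme and host are hypothetical defaults for illustration only.
    """
    return f'<link rel="canonical" href="{scheme}://{host}{path}" />'

print(canonical_tag("/products/widget"))
# <link rel="canonical" href="http://www.example.com/products/widget" />
```

Because the tag is identical on every variant, the engines are told which single URL should receive the indexing signals, even if the duplicates remain reachable.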
-
In short... YES.
But it's an easy fix. Just use the Google Webmaster Tools preferred domain setting (http://www.google.com/support/webmasters/bin/answer.py?answer=44231). That should fix the issue for you.