HTTPS pages still in the SERPs
-
Hi all,
My problem is the following: our self-developed CMS produces HTTPS versions of our "normal" HTTP pages, which means duplicate content.
Our IT department added a noindex,nofollow robots meta tag to the HTTPS pages about six weeks ago.
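For reference, what they added is presumably a tag like this in the <head> of each HTTPS page (a sketch; I'm assuming standard robots meta markup):

```html
<!-- Robots meta tag asking search engines not to index the page
     and not to follow any of its links -->
<meta name="robots" content="noindex, nofollow">
```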
I check the number of indexed pages once a week and still see a lot of these HTTPS pages in the Google index. I know I may be hitting different data centers and that these numbers aren't 100% reliable, but still... sometimes the number of indexed HTTPS pages even goes up.
Any ideas or suggestions? Should I wait longer? Or take the time to remove them through Webmaster Tools?
Another question: for a nice query, one HTTPS page ranks No. 1 (and sends some nice traffic :-)). If I kick that page out of the index, do you think the HTTP version will take over the No. 1 position, or will the ranking be lost?
Thanks in advance
-
Hi Irving,
Yes, you are right. The HTTPS login page is the "problem": every page I visit afterwards stays on HTTPS, because all the links on those pages are HTTPS links. So you could browse every page on the domain over HTTPS if you visited the login page first.
I spoke to our IT department about this and they told me it would take time to reprogram our CMS. My boss then told me to find another, cheaper solution, so I came up with the noindex,nofollow.
So, do you see another solution that doesn't require asking our IT department again? They are always very busy and rarely have time for anything.
-
Hi Malcolm,
Thanks for the help. Before we put the noindex,nofollow on these pages, I did think about using rel=canonical.
To be honest, I didn't choose rel=canonical because I think noindex,nofollow is a stronger signal for Google, whereas rel=canonical is more of a hint that Google doesn't always follow... but sure, I could be wrong!
You are saying the noindex could end up making things worse. The HTTPS pages only contain links to other HTTPS pages; think of them as "normal" pages with the same content and link structure, except that every URL, internal and external, is HTTPS.
So I figured the noindex,nofollow would not hurt the HTTP pages, because no HTTP URLs can be found on the HTTPS ones. What do you think?
-
Is there a reason you're supporting both HTTP and HTTPS versions of every page? If not, 301 redirect each page to either its HTTP or HTTPS version. I'd only leave pages that actually need to be secure, e.g. purchase pages, on HTTPS. Non-secure pages are generally a better user experience in terms of load time, since the browser can reuse cached files from previous pages and non-encrypted pages are more lightweight.
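If you go the redirect route and you're on Apache, the rough idea in .htaccess would be something like this (a sketch only; it assumes mod_rewrite is enabled, and /login is a hypothetical path for the pages that must stay secure):

```apacheconf
# Assumes Apache with mod_rewrite; adjust paths to your site.
RewriteEngine On
# Only act on requests that arrive over HTTPS
RewriteCond %{HTTPS} on
# Leave the secure area alone (hypothetical login path)
RewriteCond %{REQUEST_URI} !^/login [NC]
# 301-redirect everything else to the same URL over HTTP
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```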
If you're set on supporting both for those security-minded users who like HTTPS everywhere, I'd go with Malcolm's solution and rel=canonical to the version you'd like indexed rather than using noindex,nofollow.
-
Do you have absolute links on your site that are keeping visitors on HTTPS?
For example, if you go to the secure login page and then click a homepage navigation link on that secure HTTPS page, does the link take you back to HTTP or keep you on HTTPS?
That is usually the cause of this problem, so I'd look into it. I would not manually request removal of the pages in WMT; I'd just fix the problem and let Google update the index itself.
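If they are staying on HTTPS, the fix is usually to make the site-wide navigation on the secure pages link back to absolute HTTP URLs, along these lines (hypothetical domain and path):

```html
<!-- Relative link on a secure page: inherits HTTPS, so the
     visitor keeps browsing the whole site encrypted -->
<a href="/products">Products</a>

<!-- Absolute HTTP link: sends the visitor back to the
     non-secure version once they leave the secure area -->
<a href="http://www.example.com/products">Products</a>
```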
-
Have you tried canonicalising to the HTTP version?
Using a noindex,nofollow rule could end up being worse, as you're telling Google not to index the pages or follow their links, and this will affect both the HTTP and HTTPS versions.
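The idea would be a tag like this in the <head> of each HTTPS page, pointing at its HTTP twin (a minimal sketch with a placeholder domain and path):

```html
<!-- On the HTTPS copy: tells search engines the HTTP URL is the
     preferred version to index and rank (placeholder URL) -->
<link rel="canonical" href="http://www.example.com/some-page.html">
```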