How can I get Google to forget an https version of one page on my site?
-
Google mysteriously decided to index the broken https version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over https and the CSS doesn't load).
The page already has many incoming links to the http version, and it has a canonical URL pointing to http. I resubmitted it over http in Webmaster Tools. Is there anything else I can do?
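For reference, the canonical tag on the affected page should point explicitly at the http URL. A minimal sketch (example.com and /page.html are placeholders, not the actual URLs from the question):

```html
<!-- In the <head> of the affected page -->
<link rel="canonical" href="http://www.example.com/page.html" />
```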
-
Are both pages being indexed? If so, you won't lose the http:// version, because Googlebot is treating them as two separate pages.
-
Thanks Hilary - I'm concerned that if I do that, I would lose the http version as well.
-
I would also add the site to Google Webmaster Tools and head to the URL removal tool, where you can ask Google to remove a listing from the SERPs.
-
Hi Joel,
Sounds like you have everything set up correctly so far with the canonical and GWT.
The next step would be to 301 redirect the https version to the http version. After that, combined with your changes above, the https URL should get removed from Google's index.
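If the site runs on Apache, the https-to-http redirect for just this one page can be sketched as below (page.html and example.com are placeholders, and mod_rewrite is assumed to be enabled):

```apache
RewriteEngine On
# Only act on requests that arrived over HTTPS
RewriteCond %{HTTPS} on
# Permanently redirect this one page back to its http URL
RewriteRule ^page\.html$ http://www.example.com/page.html [R=301,L]
```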
Note: You may also want to run a crawl with a tool like Screaming Frog to ensure that you are not accidentally linking to the https version somewhere on your site or in your sitemap.
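If you'd rather script that check than run a full crawler, a minimal sketch using only Python's standard library can flag https:// links in a page's HTML (the sample markup below is hypothetical):

```python
from html.parser import HTMLParser

class HttpsLinkFinder(HTMLParser):
    """Collect href/src attribute values that point at an https:// URL."""
    def __init__(self):
        super().__init__()
        self.https_links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and value.startswith("https://"):
                self.https_links.append(value)

def find_https_links(html):
    """Return all https:// links found in an HTML string."""
    parser = HttpsLinkFinder()
    parser.feed(html)
    return parser.https_links

# Example: page source fetched from your site (hypothetical markup)
page = ('<a href="https://www.example.com/page.html">bad</a> '
        '<a href="http://www.example.com/ok.html">ok</a>')
print(find_https_links(page))  # ['https://www.example.com/page.html']
```

Feeding each page of the site through `find_https_links` would surface any stray https references without a third-party tool.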
Hope this helps.
Mike
Related Questions
-
Google Deindexing Site, but Reindexing 301 Redirected Version
A bit of a strange one: a client's .com site has recently been losing rankings on a daily basis, but traffic has barely budged. After some investigation, I found that the .co.uk domain (which has been 301 redirected to the .com for some years) has recently been indexed by Google. According to Ahrefs, the .co.uk domain started gaining rankings in early September, and these have increased daily. All of these rankings are effectively being taken from the .com site (though, due to the 301 redirects, the site loses no traffic): as one keyword disappears from the .com's ranking report, it reappears on the .co.uk's. Even searching for the brand name now brings up the .co.uk version of the domain, whereas less than a week ago it brought up the .com. The redirects are all working fine. No URLs on the site or in the sitemaps point to the .co.uk domain. The .co.uk domain has no backlinks except for a single results page on ask.com. The site hasn't had any recent design or development work; the last changes were made in June. Has anyone encountered this before? I'm not entirely sure how or why Google would start indexing 301'd URLs after several years of ignoring them.
Technical SEO | lyuda55 -
Some of my website URLs are not getting indexed when checking (site:domain) in Google
Technical SEO | nlogix -
Mobile site not getting indexed
My site is www.findyogi.com, a shopping comparison site. The mobile site is hosted at m.findyogi.com. I fixed my sitemap and the attribution to the mobile site in the last week of May. My mobile site pages have been getting de-indexed since then. Website - www.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price - indexed. Mobile - m.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price - not indexed. Google is crawling my website and mobile site normally. What am I doing wrong?
Technical SEO | namansr -
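For a separate m-dot setup like the one above, Google's documented bidirectional annotations are usually the missing piece: the desktop page declares the mobile alternate, and the mobile page canonicals back to desktop. A sketch using the URLs from the question (verify the exact media query against current guidance):

```html
<!-- On the desktop page (www.findyogi.com/...): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price" />

<!-- On the corresponding mobile page (m.findyogi.com/...): -->
<link rel="canonical"
      href="http://www.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price" />
```

Without the rel="canonical" on the m-dot page, Google may treat the two URL sets as duplicates and drop one of them.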
Is Google suppressing a page from results - if so why?
UPDATE: It seems the issue was that pages were accessible via multiple URLs (i.e. with and without a trailing slash, with and without the .aspx extension). Once this was resolved, the pages started ranking again. Our website used to rank well for a keyword (top 5), though this was over a year ago now. Since then the page no longer ranks at all, but subpages of that page rank around 40th-60th. I searched Google for our site and the term (i.e. 'Keyword site:MySite.com') and increased the number of results to 100; again, the page isn't in the results. However, when I just search for our site (site:MySite.com), the page is there, appearing higher up the results than the subpages. I thought this might be down to keyword stuffing; there were around 20-30 instances of the keyword on the page, but roughly the same number of keywords were on each subpage as well. I've now removed some of the excess keywords from all sections, as it was getting in the way of usability too, but I just wanted some thoughts on whether this is a likely cause or if there is something else I should be worried about.
Technical SEO | Datel -
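Since the URLs in the question above are .aspx, the multiple-URL fix described in the update can be sketched as IIS URL Rewrite rules in web.config (a sketch only, assuming the URL Rewrite module is installed; rule names are arbitrary):

```xml
<rewrite>
  <rules>
    <!-- 301 the .aspx variant to the extensionless URL -->
    <rule name="StripAspx" stopProcessing="true">
      <match url="^(.*)\.aspx$" />
      <action type="Redirect" url="{R:1}" redirectType="Permanent" />
    </rule>
    <!-- 301 URLs missing a trailing slash to the slashed version -->
    <rule name="AddTrailingSlash" stopProcessing="true">
      <match url="(.*[^/])$" />
      <conditions>
        <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
        <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
      </conditions>
      <action type="Redirect" url="{R:1}/" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

Collapsing every variant to a single 301 target leaves Google one URL to consolidate signals onto, which matches the outcome the update describes.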
Can't get Google to Index .pdf in wp-content folder
We created an in-depth case study/survey for a legal client and can't get Google to crawl the PDF, which is hosted on WordPress in the wp-content folder. It is linked to heavily from nearly all pages of the site by a global sidebar. Am I missing something obvious as to why Google won't crawl this PDF? We can't get much value from it unless it gets indexed. Any help is greatly appreciated. Thanks! Here is the PDF itself:
http://www.billbonebikelaw.com/wp-content/uploads/2013/11/Whitepaper-Drivers-vs-cyclists-Floridas-Struggle-to-share-the-road.pdf
Here is the page it is linked from:
http://www.billbonebikelaw.com/resources/drivers-vs-cyclists-study/
Technical SEO | inboundauthority -
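One common cause worth checking here (an assumption, not confirmed from the thread) is a robots.txt rule blocking the uploads path; some WordPress setups disallow wp-content wholesale, and an explicit Allow for uploads is one way to keep the PDF crawlable:

```
# If http://www.billbonebikelaw.com/robots.txt contains this, the PDF cannot be crawled:
User-agent: *
Disallow: /wp-content/

# One fix: allow the uploads folder explicitly
Allow: /wp-content/uploads/
```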
My beta site (beta.website.com) has been inadvertently indexed - how do I NOINDEX the entire beta site?
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site, and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
Technical SEO | BVREID -
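A noindex across a whole beta subdomain like the one above can be sketched with an X-Robots-Tag response header (this assumes Apache with mod_headers; the directive goes in the beta vhost or .htaccess only, never the live site's). Note that blocking the beta site in robots.txt at the same time would prevent Google from ever seeing the noindex, so leave it crawlable until the pages drop out:

```apache
# Serve a noindex header on every response from beta.website.com
Header set X-Robots-Tag "noindex, nofollow"
```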
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
Technical SEO | askotzko -
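On the question above: you shouldn't need to resubmit after every content change. Once the sitemap lives at a stable URL, crawlers re-fetch it on their own, and a Sitemap line in robots.txt advertises its location (the URLs below are placeholders):

```
# http://www.example.com/robots.txt
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```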
"Site Suspended" in Google Adwords + Lost all rankings in Google => is this related?
Can anyone share thoughts on this? Recently (mid-April) we revamped our website (same content, new layout, stronger brand), but a few days later our Google rep contacted us to say she got a "red flag" for one of our SEA campaigns (we broke the bridge-page policy, not on purpose, to be clear); they were completely correct on this matter. We even got some extra time to correct it; the normal policy is only 10 days. But we were a little slow, so all our AdWords campaigns are suspended and we get the message "Site suspended". We are working to have this fixed, and our Google rep has even granted some more time. Now, almost simultaneously, in the same time frame, all our new pages, which were already ranking well thanks to proper 301 rules, suddenly fell out of the Google SERPs; nothing can be found anymore. Our website has been live since 1996 with no issues until now. There seems to be a strong correlation between what happened in our SEA and what happened in our SEO. Can anyone share some info?
Technical SEO | TruvoDirectories