HTTP URLs Still in Index
-
One of the sites I manage was migrated to HTTPS two months ago. The XML sitemaps have been updated, the canonical tags all point to https:, and a redirect rule was put in place. Despite all this, I'm still seeing non-secure URLs in Google's index. The weird thing is that when I click those links, they go to the secure version.
Has anyone else seen weird things with Google not properly indexing secure versions of URLs?
-
Just make certain your http > https redirect is a 301, not a 302 (a lot of the examples on the web for these redirects are actually wrong in this regard). A 302 will make it difficult to get the old URLs to drop out of the index.
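For anyone checking their own setup, a minimal sketch of a site-wide 301 in .htaccess (assuming Apache with mod_rewrite enabled; adjust for your server and any existing rewrite rules):

```apache
# Force HTTPS with a permanent (301) redirect for every request
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

You can verify the status code with `curl -I http://yourdomain.com/` and confirm the response says `301 Moved Permanently`, not `302 Found`.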
-
Thanks for your input, Casey. I'm dealing with about 4,000 URLs, so I think I'll go with the patience method!
-
Hey Logan, I wouldn't say this is "strange behavior." I've seen it take several months for all of the HTTP URLs to drop out of the index after an SSL switch.
That being said, you could go ahead and do a fetch request on the individual URLs in the old HTTP-verified Google Search Console account and get Google to re-crawl the URLs through to their HTTPS counterparts. That certainly won't hurt.
I had a similar issue with a client site that moved to HTTPS but forgot to noindex the dev site, so all of that got indexed. They had 700+ URLs in their account in April, and 4 of those URLs are still in search as of today. It just takes a while for all that stuff to drop out of the index.
Personally, I'd just be patient. Having the redirects in place is what matters. Google will sort it out...eventually.
Related Questions
-
Accidental No Index
Hi everyone, We manage several client sites at my company. The developers accidentally implemented a noindex robots tag in the site code when we did the HTTPS upgrade, without knowing it (yes, it's true). Ten days later we noticed traffic was falling. After a couple of days we found the noindex tags, removed them, and resubmitted the sitemaps. The sites started ranking for their own keywords again within a day or two, but organic traffic is still down considerably, and for other keywords they are not ranking in the same spots as before, or at all. If I look in Google Search Console, it says we submitted, for example, 4,000 URLs and only 160 have been indexed. I feel like maybe Google is taking a long time to re-index the remainder of the sites?? Has anyone had this issue?? We're starting to get very concerned, so any input would be appreciated. I read an article on here from 2011 about a company that did the same thing and was ranking for its keywords again within a week. It's been 8 days since our fix.
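For reference, the accidental tag in a case like this is typically a robots meta tag in the page head (a generic sketch, not the poster's actual markup):

```html
<!-- This tag tells search engines not to index the page or follow its links;
     shipping it to production by mistake deindexes the whole site -->
<meta name="robots" content="noindex, nofollow">
```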
Technical SEO | | AliMac260 -
Https & http
I have my website (HTTP://thespacecollective.com) marked in Google Webmaster Tools as having http as the primary domain, as opposed to https. But should all of my on-page links be http? For instance, if I click the Home button on my home page it takes the user to http, but if you type the domain name into the address bar it takes you to https. Could this be causing me SEO problems?
Technical SEO | | moon-boots0 -
URL Parameters
On our webshop we've added some URL parameters. We've set URLs like min_price, filter_cat, filter_color, etc. to "Don't crawl" in Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+. Is it better to add these parameters to the robots.txt file? And if that's better, how should we write it so the URLs will not be crawled? Our robots.txt file currently shows: # Added by SEO Ultimate's Link Mask Generator module User-agent: * Disallow: /go/ # End Link Mask Generator output User-agent: * Disallow: /wp-admin/
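One common pattern for blocking parameterized URLs in robots.txt is wildcard Disallow rules (a sketch assuming the parameters appear as query-string keys; verify against your actual URL structure before deploying, since a wrong rule can block real pages):

```txt
User-agent: *
# Block faceted-navigation parameters wherever they appear in the query string
Disallow: /*?*min_price=
Disallow: /*?*filter_cat=
Disallow: /*?*filter_color=
```

Note that the `*` wildcard is honored by Googlebot and most major crawlers but is not part of the original robots.txt standard, and robots.txt blocking prevents crawling, not necessarily removal from the index.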
Technical SEO | | Happy-SEO1 -
Google is indexing my directories
I'm sure this has been asked before, but I was looking at all of Google's results for my site and I found dozens of results for directories such as: Index of /scouting/blog/wp-includes/js/swfupload/plugins Obviously I don't want those indexed. How do I prevent Google from indexing them? Also, it only seems to be happening with WordPress, not any of the directories on my main site. (We have a WordPress blog, which is only a portion of the site)
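Those "Index of /" results are auto-generated directory listings. A common fix on Apache (a sketch assuming .htaccess overrides and mod_headers are available on the host) is to switch off autoindexing, optionally sending a noindex header for that directory as well:

```apache
# Stop Apache from generating "Index of /" listings for folders
# that have no index file
Options -Indexes

# Belt-and-braces: tell search engines not to index responses
# served from this directory
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
</IfModule>
```

Already-indexed listing URLs will then return 403s (or noindex headers) and drop out over time.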
Technical SEO | | UnderRugSwept0 -
Multiple URLs
I'm trying to check the URLs of this site, http://www.ofo.com.au, and I see that their old site has 301-redirected to it... but http://ofo.com.au and http://outdoorfurnitureoutlet.com.au are both still up, and I can't see any 301 redirects from them. Is it a problem even if, when I do a site: search for them, I get no results?
Technical SEO | | UnaRealidad0 -
Getting More Pages Indexed
We have a large e-commerce site (Magento-based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages seems to fluctuate, but currently fewer than 300,000 pages are indexed out of 4 million submitted. How can we get the number of indexed pages higher? Changing the crawl-rate settings and resubmitting sitemaps doesn't seem to have an effect on the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice to be considered important enough by Google to be indexed yet? Let me know if there are any suggestions or tips for getting more pages indexed.
Technical SEO | | Mattchstick0 -
Google Indexed URLs for Terms Have Changed Causing Huge SERP Drop
We haven't made any significant changes to our website, yet the pages Google has indexed for our critical keywords have changed, causing our SERP positions for those terms to drop dramatically. In some cases, the changes make no sense at all. For example, one of our terms that used to be indexed to our homepage is now indexed to a dead category page that has nothing on it. For one of our biggest terms, where we ranked 9th, the indexed page changed to our FAQ, and as a result we now rank 44th. This is having a MAJOR impact on our business, so any help on why this sudden change happened and what we can do to combat it is greatly appreciated.
Technical SEO | | EvergladesDirect0 -
Root vs. Index.html
Should I redirect index.html to "/" or vice versa? Which is better for duplicate content issues?
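A common choice is to 301-redirect index.html to the bare directory URL, so only one version of each page resolves. A minimal .htaccess sketch (assuming Apache with mod_rewrite; test on a staging copy first):

```apache
RewriteEngine On
# 301-redirect /index.html in any directory to the bare directory URL,
# e.g. /foo/index.html -> /foo/
RewriteCond %{THE_REQUEST} ^[A-Z]+\ (.*/)index\.html[?\ ]
RewriteRule ^ %1 [R=301,L]
```

Matching on THE_REQUEST (the raw request line) rather than the rewritten URL path avoids a redirect loop when Apache internally serves index.html for directory requests.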
Technical SEO | | DavetheExterminator0