Our client's web property recently switched over to secure pages (HTTPS); however, their non-secure pages (HTTP) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
-
Our client recently switched over to HTTPS with a new SSL certificate. They have also implemented rel canonical tags on most of their internal pages (pointing to the HTTPS versions). However, many of their non-secure pages are still being indexed by Google. We have access to their GWMT profiles for both the secure and non-secure versions of the site.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure whether that will happen. We also thought about requesting in GWMT that Google remove the non-secure pages, but that felt pretty drastic. Any recommendations would be much appreciated.
-
You could use a tool like Xenu to check that all of the URLs on your client's website do indeed now redirect to HTTPS. To be extra sure, you should add a rule to the website's .htaccess (or web.config on Windows hosting) to force a 301 redirect from every HTTP page to its HTTPS equivalent.
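To illustrate, here is a minimal sketch of such a rule, assuming Apache with mod_rewrite enabled (on IIS the equivalent would live in web.config instead, and if the site sits behind a load balancer or proxy the HTTPS check may need adjusting):

# Force every HTTP request to its HTTPS equivalent with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Once a rule like this is in place, spot-check a few of the old HTTP URLs and confirm each returns a single 301 hop straight to the matching HTTPS URL, rather than a redirect chain.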
As for the HTTP pages still being indexed: if they redirect to HTTPS correctly when you view them, then you'll just have to wait for Google to catch up. No harm done in the meantime.
-
301 redirect the old URLs to the new HTTPS ones. Then check GWT for crawl errors for weeks, if not months (depending on how many pages are indexed and how many are crawled per day), and fix any errors you find.
You should then see traffic move from HTTP to HTTPS in GWT.
Again, depending on the total number of indexed pages and the crawl rate, it may take months, even many months, before all pages are re-indexed and start showing in the SERPs as HTTPS.
But that's normal, and it's not bad at all.
Related Questions
-
Can I know which keywords lost their top rankings on Google a year ago if the client didn't check the keyword rankings on his website?
Hi, can I know which keywords lost their top rankings on Google a year ago if the client didn't check the keyword rankings on his website? Thanks, Roy
Intermediate & Advanced SEO | kadut
-
Is Chamber of Commerce membership a "paid" link, breaking Google's rules?
Hi guys, this drives me nuts. I hear all the time that any time value is exchanged for a link, it technically violates Google's guidelines. What about real organizations, chambers of commerce, trade groups, etc. that you are a part of, whose online directories have do-follow links? On one hand, people say these are great links with real value outside of search and great for local SEO; on the other hand, some hardliners say these should technically be no-follow. Thoughts?
Intermediate & Advanced SEO | RickyShockley
-
Google index
Hello, I removed my site from Google's index using the GWT "Temporarily remove URLs that you own from search results" feature; the status is now Removed. The site has not been ranking well in Google for the last 2 months. My question is: what will happen if I re-include the site URL after 1 or 2 weeks? Is there any chance it will rank well when Google re-indexes the site?
Intermediate & Advanced SEO | Getmp3songspk
-
Old pages STILL indexed...
Our new website has been live for around 3 months and the URL structure has completely changed. We weren't able to dynamically create 301 redirects for over 5,000 of our products because of how different the URLs were, so we've been redirecting them as and when. Three months on, we're still getting hundreds of 404 errors daily in our Webmaster Tools account. I've checked the server logs and it looks like Bingbot still wants to crawl our old /product/ URLs. Also, if I perform a "site:example.co.uk/product" search on Google or Bing, lots of results are still returned, indicating that both still haven't dropped them from their index. Should I ignore the 404 errors and continue to wait for them to drop off, or should I just block /product/ in my robots.txt? After 3 months I'd have thought they'd have naturally dropped off by now! I'm half-debating this:

User-agent: *
Disallow: /some-directory-for-all/*

User-agent: Bingbot
User-agent: MSNBot
Disallow: /product/

Sitemap: http://www.example.co.uk/sitemap.xml

Intermediate & Advanced SEO | LiamMcArthur
-
HTTP to HTTPS Issue
Hey there Mozzers, I have a site that a few months ago went from HTTP to HTTPS. All the links redirect fine, but after scanning my site with Screaming Frog I get a bunch of 503 errors. Looking into the website, I see that a lot of links in my content and menu still point to the HTTP URLs. For example, my homepage has content that interlinks to the HTTP version of the site. And even though it redirects correctly when I test it, after scanning with Screaming Frog it reports back as a 503. Any ideas what's going on? Thanks in advance
Intermediate & Advanced SEO | Angelos_Savvaidis
-
How can I get a list of every url of a site in Google's index?
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 indexed, and a search on Google shows 33,000. I'd like to find out what the difference is. Is there a way to get an Excel sheet with every URL Google has indexed for a site? Thanks... Mike
Intermediate & Advanced SEO | 94501
-
Does Google index more than three levels down if the XML sitemap is submitted via Google Webmaster Tools?
We are building a very big e-commerce site. The site has 1,000 products and many categories/levels. The site is still under construction, so you cannot see it online. My objective is to get Google to rank the products (level 5). Here is an example:
Level 1 - Homepage - http://vulcano.moldear.com.ar/
Level 2 - http://vulcano.moldear.com.ar/piscinas/
Level 3 - http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/
Level 4 - http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/autocebantes.html/
Level 5 - Product is on this level - http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/autocebantes/autocebante-recomendada-para-filtros-vc-10.html
Thanks
Intermediate & Advanced SEO | Carla_Dawson
-
What NAP format do I use if the USPS can't even find my client's address?
My client has a site already listed on Google+Local under "5208 N 1st St". He has some other NAPs, e.g., YellowPages, under "5208 N First Street". The USPS finds neither of these, nor any variation that I can possibly think of! Which is better? Do I just take the one that Google has accepted and make all the others like it as best I can? And doesn't it matter that the USPS doesn't even recognize the thing? Or no? Local SEO wizards, thanks in advance for your guidance!
Intermediate & Advanced SEO | rayvensoft