Switching from HTTP to HTTPS and Google Webmaster Tools
-
Hi,
I've recently moved one of my sites, www.thegoldregister.co.uk, to HTTPS. I'm using WordPress and have put a permanent 301 redirect in the .htaccess file to force HTTPS for all pages. I've updated the settings in Google Analytics to HTTPS for the original site. All seems to be working well.
Regarding Google Webmaster Tools, what needs to be done? I'm very confused by the Google documentation on this subject around HTTPS. Does all my crawl and indexing data from the HTTP site still stand and get inherited by the HTTPS version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at the "change of address" option in the Webmaster Tools settings, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all.
I've also tried adding the HTTPS version to the console, but it is showing a severe warning: "is robots.txt blocking some important pages". I don't understand this error, as it's the same robots.txt file as on the HTTP site, generated by All in One SEO Pack for WordPress (see the bottom of this post). The warning is against line 5, saying it will be ignored. What I don't understand is why I don't get this error in the Webmaster console for the HTTP version, which uses the same file.
Any help and advice would be much appreciated.
Kind regards
Steve
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
-
Hi Steve! If Peter and Kristen answered your question, make sure to mark their responses as "Good Answers."
-
You have a few more things to do:
-
Change the redirect between the HTTP and HTTPS sites from a 302 (temporary) to a 301 (permanent), as sketched below.
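A minimal .htaccess sketch of a forced 301 to HTTPS (assuming Apache with mod_rewrite; the host name below is this site's, so adjust as needed):
RewriteEngine On
# Any request arriving over plain HTTP...
RewriteCond %{HTTPS} !on
# ...gets a permanent (301) redirect to the same path on HTTPS.
RewriteRule ^(.*)$ https://www.thegoldregister.co.uk/$1 [R=301,L]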
-
You need to verify the HTTPS site in Search Console too, and then do a "change of address". Change of address can also be used when you switch protocols.
-
You need to change your pages so that canonicals, assets, and images all point to HTTPS pages/elements. Internal linking should also go only to HTTPS pages. I checked 2-3 pages of your site and they're still pointing to HTTP. This gives bots the wrong signal. (One way to fix this in bulk is sketched below.)
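Since the site runs WordPress, one way to update embedded HTTP URLs in bulk is WP-CLI's search-replace command. This is a hedged sketch, assuming WP-CLI is installed on the server and reusing this site's host name; take a database backup and preview with --dry-run first:
# Preview what would change without writing anything:
wp search-replace 'http://www.thegoldregister.co.uk' 'https://www.thegoldregister.co.uk' --dry-run
# Run it for real, leaving the GUID column untouched as WordPress recommends:
wp search-replace 'http://www.thegoldregister.co.uk' 'https://www.thegoldregister.co.uk' --skip-columns=guid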
-
Set up an HSTS header. This will prevent browsers/bots from visiting the HTTP site anymore (note that the max-age of 63072000 seconds is two years):
Header always set Strict-Transport-Security "max-age=63072000; includeSubdomains; preload"
Put this in .htaccess.
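A slightly safer variant for .htaccess, assuming the mod_headers module may not be enabled on every host (the IfModule guard keeps Apache from erroring out if it isn't):
<IfModule mod_headers.c>
# Tell browsers to use HTTPS only for this host, for two years.
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload"
</IfModule>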
-
About robots.txt: I think it's much better if you allow crawling of everything. Here is an example from my own site:
User-agent: *
Disallow:
Sitemap: http://peter.nikolow.me/sitemap_index.xml
As you can see, I allow bots to crawl everything within the WordPress folders.
Currently you have only half moved to HTTPS, and this sends bots the wrong signals because the site isn't properly migrated. Fix everything to avoid wasting crawl budget.
-
Related Questions
-
Resolving 301 Redirect Chains from Different URL Versions (http, https, www, non-www)
Hi all, Our website has undergone both a redesign (with new URLs) and a migration to HTTPS in recent years. I'm having difficulty ensuring all URLs redirect to the correct version while preventing redirect chains. Right now everything is redirecting to the correct version, but it usually takes up to two redirects to make this happen. See below for an example. How do I go about addressing this, or is this not even something I should concern myself with?
Redirects (2):
| Redirect Type | URL |
| | http://www.theyoungfirm.com/blog/2009/index.html |
| 301 | https://theyoungfirm.com/blog/2009/index.html |
| 301 | https://theyoungfirm.com/blog/ |
The code below is what we added to our .htaccess file. Prior to adding this, the various subdomain versions (www, non-www, http, etc.) were not redirecting properly. But ever since we added it, it has created these additional URLs (see the middle URL above) as an intermediate step before resolving to the correct URL.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www.(.*)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
Your feedback is much appreciated. Thanks in advance for your help. Sincerely, Bethany
Technical SEO | theyoungfirm
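One common way to collapse such a chain is to canonicalize protocol and host in a single rule. A hedged .htaccess sketch (not from this thread; it assumes Apache with mod_rewrite and that https://theyoungfirm.com is the canonical host, so test on a staging copy first):
RewriteEngine On
# Match any request that is either non-HTTPS or on the www host...
RewriteCond %{HTTPS} !on [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
# ...capture the host name without any leading www...
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
# ...and issue one 301 straight to the canonical HTTPS non-www URL.
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
-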
Google not using redirect
We have a GEO-IP redirect in place for our domain, so that users are pointed to the subfolder relevant to their region, e.g. visit example.com from the UK and you will be redirected to example.com/uk. This works fine when you manually type the domain into your browser; however, if you search for the site and come to example.com, you just end up at example.com. I didn't think this was too much of an issue, but our subfolders /uk and /au are not getting ranked at all in Google, even for branded keywords. I'm wondering if the fact that Google isn't picking up the redirect means that the pages aren't being indexed properly? Conversely, our US region (example.com/us) is being ranked well. Has anyone encountered a similar issue?
Technical SEO | ahyde
-
Google indexing tags help
Hey everyone, So yesterday someone pointed out to me that Google is indexing tag pages, and that this will likely hurt search engine results. I just did a "site:thetechblock.com" search and I can see that tag pages are still being pulled: http://d.pr/i/WmE6. Today, I went into my Yoast settings and checked "noindex,follow" for tags in the Taxonomies settings. I just want to make sure what I'm doing is right: http://d.pr/i/zmbd. Thanks guys
Technical SEO | ttb
-
Google.com
Hi. We are managing a .com site for a client and working on getting the site ranking. The site is hosted in the US. The content is rich, deep, and unique. The site is in a competitive market but had begun ranking in the top 50 for a selection of keywords, and we could see many more in the top 100. The site is now going backwards: only a few keywords still rank in the top 50, and all the others have disappeared from the rankings altogether. Any thoughts as to what could cause this? The site is managed from the UK but, as mentioned, is hosted in the US. No Penguin issues, as all content is unique, rich, relevant, and fresh. SEO is also managed from the UK. Thoughts?
Technical SEO | SEOwins
-
Why is Google ignoring my sitelinks demotions?
I'm referring to the sitelinks that appear in the SERPs when searching for my brand name. Six subpages come up, some found in my main navigation, some not. Three of the six sitelinks have been demoted in Webmaster Tools under "Site Configuration > Sitelinks." I realize that Google says in its instructions: "Google doesn't guarantee that demoted URLs will never appear as a sitelink, but we do consider a demotion a strong hint that we'll try to honor when generating sitelinks." I am just surprised they would ignore three of the demotions, and for so long. The demoted pages do not have very many internal links pointing to them (unlike other pages of my site that are targeted specifically). Also, the site has tens of thousands of pages to choose from. Why are they ignoring my request? What else can I do to fix this?
Technical SEO | Hakkasan
-
Google plus
"With a single Google search, you can see regular search results, along with all sorts of results that are tailored to you -- pages shared with you by your friends, Google+ posts from people you know." Would I be able to see my own post, which I shared with someone in my Google+ circle, when I do a search?
Technical SEO | seoug_2005
-
Disallowing https URLs
Is there a problem with disallowing all HTTPS URLs from being indexed in order to avoid duplication? This is the article recommending this practice: http://blog.leonardchallis.com/seo/serve-a-different-robots-txt-for-https/ Thanks!
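Going by its title, the linked article's technique is to serve a separate robots.txt to HTTPS requests. A hedged .htaccess sketch of that idea (assuming Apache with mod_rewrite; the robots_https.txt filename is hypothetical):
RewriteEngine On
# When the request arrives over HTTPS, serve an alternate robots file
# (one that disallows everything) instead of the normal robots.txt.
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_https.txt [L]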
Technical SEO | theLotter
-
About Google Spider
Hello, people! I have some questions regarding the Google spider. Many people are saying that "Google spiders only have US IP addresses." Is this really true? But I also saw a video on Google's official blog that said "Google spiders come from all around the world." At this point I am really confused. Q1) I researched and it seems like Google spiders have only US IP addresses. Then what exactly is meant by "Google spiders come from all around the world"? Q2) If Google spiders have only US IP addresses, what happens to sites which use IP delivery? Does this mean that Google spiders are always redirected to the US site, since they only have US IPs? Can anyone help me to understand? One more question! When Google analyzes for cloaking issues, do you think Google analyzes while the spider crawls the site, or after it has crawled the site?
Technical SEO | Artience