Moving from http to https - what do I need to do in Google Search Console?
-
Hi all, I have moved my site from http to https.
I currently have two profiles in Google Search Console:
http://mysite.com
http://www.mysite.com
Do I need to set up the same profiles but with https, and if so, what do I then do with the http profiles? Do I delete them? Or just remove the sitemaps? Confused.
-
I'd leave the old profiles too. My guess is that if Google does try to crawl the old sitemaps, it will hit the redirects and index the new URLs quicker.
Keeping them also helps if you have any URLs that have somehow missed the move.
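For what it's worth, you can spot-check those redirects yourself with a few lines of Python. This is a rough sketch, not a finished tool: mysite.com is just the placeholder domain from the question, the function names are mine, and `fetch` is injected so you can plug in whatever HTTP client you use. For each old http URL, you confirm the server answers with a single 301 whose Location header is the exact https twin.

```python
from urllib.parse import urlsplit, urlunsplit

def https_twin(url: str) -> str:
    """Swap the scheme of `url` to https, leaving everything else alone."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + tuple(parts[1:]))

def redirects_cleanly(url: str, fetch) -> bool:
    """True if `url` answers with a single 301 pointing at its https twin.

    `fetch(url)` must return (status_code, location_header) WITHOUT
    following redirects, so we see the first hop itself.
    """
    status, location = fetch(url)
    return status == 301 and location == https_twin(url)
```

In practice `fetch` could wrap something like `requests.head(url, allow_redirects=False)` and return the status code plus the `Location` header; anything that chains through a 302, or lands somewhere other than the https twin, is worth fixing before you worry about the old profiles.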
-
Personally, I leave the old profiles for a few months, or until the new sitemaps are fully indexed.
I haven't heard of any problems with either deleting or keeping the old profiles.
Best of luck.
GR.
-
Thanks all - but what do you do with the 'old' http profile in GSC? Do you remove the sitemap? Delete the profile? Alter any settings?
Thanks.
-
Hi there,
Yes, you have to create new profiles with the HTTPS URLs. And, of course, load the sitemaps in the new profiles (those sitemaps must also use HTTPS URLs).
Be sure you have set up the redirects correctly. Here is an excellent article and checklist covering everything you should do in an HTTPS migration:
The HTTP to HTTPS Migration Checklist in Google Docs to Share, Copy & Download, from Aleyda Solis.
Hope this helps.
Best of luck.
GR.
-
Hello,
I've been researching this, and at this link you can find all the related information.
-
Related Questions
-
Many errors in Search Console (strange parameters)
Hello, I have many strange parameters in my Search Console that produce many 404 pages, for example:
mywebsite.com/article-name/&ct=ga&cd=CAIyGjk4YjY4ZDExNTYxOTgzZTk6Y29tOmVuOlVT&usg=AFQjCNFvpYpYpYf9DoyRBBu8jbiQB8JcIQ
mywebsite.com/article-name/&sa=U&ved=0ahUKEwj1zMLR0JbLAhUGM5oKHejjBJAQqQIILSgAMAk&usg=AFQjCNEBNFx3dG5B0-16X6eXTS7k-Srm6Q
Can someone tell me how to solve it?
Technical SEO | JohnPalmer
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday-morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS:
https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, the warning "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" appears for:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. How do you do that, or even check that that's what's needed? Or should Google just sort this out eventually?
I also see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many thanks,
Dan
Technical SEO | Dan-Lawrence
-
Robots.txt on http vs. https
We recently changed our domain from http to https. When a user enters any URL on http, there is a global 301 redirect to the same page on https. I cannot find instructions about what to do with robots.txt. Now that https is the canonical version, should I block the http version with robots.txt? Strangely, I cannot find a single resource about this...
Technical SEO | zeepartner
-
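One way to reason about the robots.txt question above: Googlebot can only discover the 301s if it is still allowed to request the http URLs, so the http side should not serve a blanket block. The sketch below is illustrative only (example.com is a placeholder and the function names are mine, not from any library): it classifies the response a crawler gets when it requests the http robots.txt after such a migration.

```python
def blanket_block(robots_body: str) -> bool:
    """True if the robots.txt body contains a literal `Disallow: /` line."""
    return any(line.split("#", 1)[0].strip().lower() == "disallow: /"
               for line in robots_body.splitlines())

def http_side_ok(status: int, location: str = "", body: str = "") -> bool:
    """Classify the response to GET http://example.com/robots.txt.

    Returns True when crawlers can still reach the http content URLs
    and therefore see the 301s pointing at https.
    """
    if status in (301, 302, 307, 308):
        # robots.txt itself follows the global redirect, so crawlers
        # read the https rules for both hosts: the usual, safe setup.
        return location.startswith("https://")
    if status == 404:
        return True  # no rules on http: everything crawlable, redirects seen
    if status == 200:
        return not blanket_block(body)  # a blanket block would hide the 301s
    return False
```

In short: letting the http robots.txt ride the same global 301 (or not exist at all) keeps the migration signals flowing, while serving `Disallow: /` on the http side would stop Googlebot from ever seeing the redirects.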
How to rank in Google Places
Normally I don't have a problem with local SEO (I'm more of a multi-channel online marketing guy), but this one has got me scratching my head. Look at:
https://www.google.co.uk/search?q=wedding+venues+in+essex
There are two websites there (Fennes and Quendon Park) that both have a much more powerful DA but don't appear in Google Places (Google+ Business, or whatever it's labelled as now). Why are websites such as Boreham House ranking top in the map listings? Quendon Park has a Google Places listing, it's full of content, and the NAP all matches up. It's a stronger website, and Boreham House isn't any closer to the centroid than Quendon Park. This one has just got me stumped.
Technical SEO | jasonwdexter
-
Google not showing profile photo in search
I saw a Google snippet preview showing my profile photo in search, but when I search directly on google.com it is not showing there. What's the problem?
Technical SEO | xplodeguru
-
Google indexing thousands of crazy search results with %25253
In GWT I started seeing very strange pages indexed a few weeks ago, and Google is now reporting over 21,000 pages (blocked by robots.txt) with weird URLs like this:
http://www.francesphotography.com/?s=no-results:no-results%25252525252525253Ano-results%2525252525252525253Ano-results%252525252525252525253Ano-results%252525252525252525253Ano-results%252525252525252525253Ano-results%252525252525252525253Ano-results%25252525252525252525253Ano-results%25252525252525252525253Ano-results%2525252525252525252525253Adanna&cat=no-results
http://www.francesphotography.com/?s=no-results:no-results%2525253Ano-results%25252525253Ano-results%25252525253Ano-results%25252525253Ano-results%2525252525253Ano-results%25252525252525253Ano-results%25252525252525253Ano-results%25252525252525253Adanna&cat=no-results
The current robots.txt looks like this:
User-agent: *
Disallow: /wp-content
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /data
Disallow: /slideshows
Disallow: /page/*/?s=
Disallow: /?s=
Disallow: /search
This website is running an up-to-date WP install with the Yoast Google Analytics and SEO plug-ins. I can't point to anything specific that happened with the site when these URLs started appearing, even after I modified the robots.txt. What can be done to try and stop Google from creating and indexing these goofy URLs? I see lots of sites having this issue when I search in Google, but no one seems to have a solution.
Technical SEO | BoulderJoe
-
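The Disallow rules quoted in the question above can be sanity-checked locally with Python's built-in robots.txt parser. One caveat with this sketch: the stdlib parser implements the original robots.txt specification and treats the `*` in `Disallow: /page/*/?s=` literally, whereas Googlebot expands it as a wildcard, so treat the results as an approximation of Google's behaviour.

```python
import urllib.robotparser

# The robots.txt from the question above, verbatim.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-content
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /data
Disallow: /slideshows
Disallow: /page/*/?s=
Disallow: /?s=
Disallow: /search
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())
parser.modified()  # mark the rules as loaded; after parse() alone the
                   # parser still reports every URL as disallowed

def crawl_allowed(url: str) -> bool:
    """True if the rules above let a generic crawler ("*") fetch `url`."""
    return parser.can_fetch("*", url)
```

The `/?s=` rule does cover those search URLs, which matches GSC's "blocked by robots.txt" report. The catch is that robots.txt only stops crawling, not indexing: Google can still index a blocked URL it discovers through links, which is why the count of these goofy URLs can keep growing even with the block in place.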
UK website ranking higher in Google.com than Google.co.uk
Hi, I have a UK website which was formerly ranked 1st in Google.co.uk and .com for my keyword phrase. It has recently slipped to 6th in .co.uk but is higher, at position 4, in Google.com. I have done a little research and can't say for certain, but I wonder if it is possible that too many of my backlinks are US-based and therefore Google thinks my website is also US-based. I checked Google WMT and we are geo-targeted to the UK. Our server is also UK-based. Does anyone have an opinion on this? Thanks
Technical SEO | tdsnet
-
Duplicate exact match domains flagged by Google - need help with reinclusion
Okay, I admit, I've been naughty... I have 270+ domains that are all exact match for city+keyword, and I have built tons of backlinks to all of them. I reaped the benefits... and now Google has found my duplicate templates and flagged them all. The question is, how do I get them reincluded quickly? Do you guys think converting the sites to a basic WordPress template, using 275 different templates, and then applying for reinclusion for each site manually would do it? Or do you recommend: 1. creating a unique site template for each site, and 2. creating unique content? Any other advice for getting reincluded? Aside from owning up and saying, "Hey, I used the same template for all the sites, and I have created new templates and unique content, so please let me back in."
Technical SEO | ilyaelbert