Issues with Google Search Console and rekeyed SSL certificate
-
Hi,
Another newbie question, please. I've recently changed the name of my business, so I bought a new domain and rekeyed the SSL certificate to the new domain.
Let's say the old domain was called https://123.com and the new one is https://abc.com.
I've set up a 301 redirect on 123.com to forward to abc.com, and I've added the new domain to Google Search Console and verified it. However, I can't seem to use the Change of Address tool to move from the old domain to the new domain.
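For reference, a minimal sketch of what a redirect like this typically looks like, assuming an Apache server (the domain names and certificate paths below are placeholders, not the poster's actual configuration):

```apache
<VirtualHost *:443>
    ServerName 123.com
    ServerAlias www.123.com

    # The old domain still needs its own valid certificate:
    # without one, HTTPS visitors (and Googlebot) never get
    # far enough to see the redirect at all.
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/123.com.crt
    SSLCertificateKeyFile /etc/ssl/private/123.com.key

    # Permanent redirect; the requested path is preserved.
    Redirect 301 / https://abc.com/
</VirtualHost>
```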
I think it's because my preferred property (https://123.com) technically no longer exists: since I rekeyed the SSL certificate from the old site to the new one, the old site no longer has an SSL certificate. When I go to the old https domain it doesn't load, nor does it seem to forward to the new site. It just times out.
Am I correct in assuming that since I rekeyed the SSL certificate, that my original preferred property on Google (https://123.com) no longer exists?
And if so, is there a way to use the Change of Address tool, or do I simply need to remove the old site from Google and go through a period where my new site builds its ranking from scratch?
Thanks in advance folks!
-
Good luck! Happy to help
-
Thanks Robin,
That's as I suspected. I'll see if I can grab a cheap SSL for the old site somewhere to see if I can rectify the situation.
Thanks for your help!
Mike.
-
Hi resourcerep, it's great that you're asking questions on the community.
Have you been using this help article? I'm just going to check some assumptions here:
- Have you set up a search console property for the new site so you can select it during the site move operation?
- What errors (if any) do you get from search console?
- What happens when you try to access the old pages, do you just get a warning?
It sounds like the SSL cert is the issue. Unfortunately, due to the way HTTPS works, the server needs an SSL certificate before it can send or receive any data, including the 301 redirect itself.
However, you could find the cheapest SSL cert you can buy and put that on the old domain to allow the redirect to work. To be honest, that'll be valuable for more than just this, as having working 301s will help both search crawlers and users trying to access the old site.
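Once a cert is in place on the old domain, the redirect can be sanity-checked from the command line. A sketch, using the placeholder domains from the question (swap in the real ones):

```shell
# Follow the redirect chain and show only the status lines and
# Location headers. You want a 301 from 123.com pointing at
# abc.com, then a 200 from abc.com.
curl -sIL https://123.com/some-page | grep -Ei '^(HTTP|location)'

# Inspect the certificate actually being served on the old domain
# (subject and validity dates).
openssl s_client -connect 123.com:443 -servername 123.com </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -dates
```

If the first command times out rather than returning a 301, the problem is still at the TLS/server layer, not with the redirect rule.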
Hope that helps!
Related Questions
-
How to make my site appear in Google search like the attached image?
Whenever I search for my website, all of the pages appear as different search results. However, I want my website to appear the way most websites out there do. I have attached an image for your understanding (that's exactly what I want to get). I shall be very thankful!
Technical SEO | primecoursesfree
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi,
I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that, for one client, the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday.
However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS, and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
AND https://www.domain.com/page-sitemap.xml
AND https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually?
I see the graph in 'Crawl > Sitemaps > Web Pages' is showing a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many thanks,
Dan
Technical SEO | Dan-Lawrence
-
How does Google deal with licensed content placed on both the vendor's and the client's websites? Will Google penalize the client's site for this?
One of my clients bought licensed content from a top vendor in the health industry. This same content is on the vendor's website and on my client's site too, but on my client's site there is a link back to the vendor, which clearly tells anyone that this is licensed content bought from that vendor. My client paid for top-quality content from the best source in the industry, but at the same time it is also on the vendor's website. Will Google penalize my client's website for this? The niche is health.
Technical SEO | sourabhrana
-
I've had a sudden increase in crawl issues as of yesterday (around 300, up from a steady 10). Does anyone else have this issue?
The main issue is that it's now indexing both the www and http:// versions. Has anyone else got this issue, or had any sudden changes in their crawl results?
Technical SEO | beckyhy
-
Quality Issues: My blog is blocked on Google Search Engine
Hi Webmasters,
I got an email from the Google team. The email is included below:
"Google Webmaster Tools: Quality Issues on http://abcdblogger.com/
August 8, 2012
Dear site owner or webmaster of http://abcdblogger.com/,
We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team"
My blog is completely blocked on the Google search engine. I removed all existing posts, reinstalled a fresh version of WordPress, and wrote a good article. I redirected all broken links to my homepage with a 301. After making those changes I submitted a reconsideration request to Google, but they declined it.
I suspect the reason for the block could be the backlinks pointing to my domain. I think Google's Disavow Tool could help me remove low-quality backlinks, but how can I sort low-quality backlinks using Open Site Explorer? If possible, can you create a text file with all possible low-quality links, so that I can submit it using the Google Disavow Tool?
Thanks.
Technical SEO | hafiskani
-
Duplicate Content Issue
SEOmoz is giving me a number of duplicate content warnings related to pages that have an 'email a friend' and/or 'email me when back in stock' version of a page. I thought I had those blocked via my robots.txt file, which contains the following:
Disallow: /EmailaFriend.asp
Disallow: /Email_Me_When_Back_In_Stock.asp
I had thought that the robots.txt file would solve this issue. Anyone have any ideas?
Technical SEO | WaterSkis.com
-
Google Plus 1
How can Google +1 votes affect your position in the Google search engine? On my website I have a Google +1 button, and I have just 3 votes. If I buy, let's say, 100 votes, but slowly, over a month, will this affect my ranking positively?
Technical SEO | prunarevic
-
Google showing former meta tags in search results despite new tags having been crawled
I changed the meta tags for the site www.aztexsodablast.com.au about a month back, and Google has crawled the new tags. But in search results, when I search for the term 'Aztex Sodablast', it continues to show the old tags, while on the site itself the new tags are displayed. What may be the issue, and how can I correct the problem?
Technical SEO | pulseseo