Moving to TLS and disavow file
-
I'm considering the move to TLS/SSL and will obviously be setting up the HTTPS version in Search Console. Do I need to re-upload the disavow file that was generated before the move?
Look forward to your response.
-
I appreciate your comprehensive article. However, may I kindly point out that my question was about the disavow file in Google Search Console, not the implementation of HTTPS itself.
-
1. Get and Install Certificates
Buy a 2048-bit, SHA-2 TLS/SSL certificate from a Certificate Authority (CA)
Generate a private key and a certificate signing request (CSR) so that the CA can issue a signed certificate
Send the CA what it needs (your CSR, which includes your public key)
Install certificates on your servers
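The key-and-CSR step above can be sketched with OpenSSL; the domain and filenames here are placeholders, so substitute your own details:

```shell
# Generate a 2048-bit private key and a SHA-256 certificate signing request
# (CSR) in one step; replace example.com and the -subj fields with your own.
openssl req -new -newkey rsa:2048 -sha256 -nodes \
  -keyout example.com.key \
  -out example.com.csr \
  -subj "/C=US/ST=State/L=City/O=Example Inc/CN=example.com"

# Inspect the CSR before sending it to the CA
openssl req -in example.com.csr -noout -subject
```

The `.csr` file is what you send to the CA; the `.key` file stays on your server and should never leave it.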
2. Enable HTTPS on Your Servers
Configure your server for HTTPS. Check out these configuration tips for popular servers.
Test that HTTPS is functioning properly using an external testing tool, such as Qualys SSL Labs' SSL Server Test.
Set a reminder to update your secure certificate before it expires.
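As a rough sketch, a minimal Apache virtual host for HTTPS looks like the following (domain and paths are placeholders, and the way the intermediate chain is supplied varies by Apache version):

```apache
<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    SSLCertificateFile      /etc/ssl/certs/example.com.crt
    SSLCertificateKeyFile   /etc/ssl/private/example.com.key
    # On Apache 2.4.8+ the chain can be appended to SSLCertificateFile instead
    SSLCertificateChainFile /etc/ssl/certs/ca-chain.crt
</VirtualHost>
```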
3. Code & Configuration Changes
Update site content to request HTTPS resources
Update internal links to point to HTTPS pages, or consider making internal links relative
Use protocol-relative URIs, e.g. src="//example.com/app.js" (see note below)
Add a self-referencing rel=canonical tag to every page, pointing to your HTTPS URIs
Change all Ad calls to work with HTTPS
Update any internal tools, such as Optimizely or CrazyEgg, to work with HTTPS
Update legacy redirects to eliminate chained redirects (see note below)
Update Open Graph, Schema.org, and other semantic markup to point to HTTPS
Update social sharing buttons to preserve share counts
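For illustration, the resource, canonical, and markup updates above look like this (all URLs are hypothetical):

```html
<!-- Protocol-relative resource: loads over whichever scheme the page used -->
<script src="//cdn.example.com/js/app.js"></script>

<!-- Self-referencing canonical pointing at the HTTPS URL of this page -->
<link rel="canonical" href="https://www.example.com/widgets/" />

<!-- Open Graph markup updated to HTTPS -->
<meta property="og:url" content="https://www.example.com/widgets/" />
```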
4. Robots.txt, XML Sitemaps, Search Console and Analytics
Create and verify a new property for the HTTPS site in Google Search Console
Create a new XML sitemap file that points to your HTTPS URLs and upload it to the new property in Search Console
Create a new robots.txt file for the HTTPS site and copy over all existing rules. Include a Sitemap link to the new HTTPS XML sitemap.
Remove all rules from the HTTP robots.txt file, except for the Sitemap link, and leave it in place. This is to encourage bots to crawl and follow all redirects.
Copy any existing disavow file and upload it to the new HTTPS property in Search Console
Note: Don’t use the “Change of Address” feature in Google Search Console. That’s used for migrations to new domains.
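A sketch of the two robots.txt files described above (the rules and sitemap filenames are examples only):

```
# robots.txt served on the HTTPS site: copy over all existing rules
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap-https.xml

# robots.txt left in place on the HTTP site: no rules, so bots are
# free to crawl everything and follow all of the redirects
Sitemap: http://www.example.com/sitemap.xml
```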
5. Redirect HTTP to HTTPS
Deploy the redirect code
Redirect HTTP to HTTPS on IIS (7.x and higher)
Redirect HTTP to HTTPS on Apache
Redirect HTTP to HTTPS on Nginx
Include exceptions to any global redirect directives for your existing robots.txt and XML sitemap files
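Hedged sketches of the redirect rules for Apache and Nginx, including the robots.txt/sitemap exceptions mentioned above (domains, paths, and filenames are examples, not a drop-in configuration):

```apache
# Apache (.htaccess): 301 everything to HTTPS except robots.txt and the sitemap
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

```nginx
# Nginx: same idea inside the port-80 server block
server {
    listen 80;
    server_name example.com www.example.com;

    # Keep serving these two files over HTTP so bots can find them
    location ~ ^/(robots\.txt|sitemap\.xml)$ {
        root /var/www/legacy;
    }

    location / {
        return 301 https://$host$request_uri;
    }
}
```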
6. Follow-Up (after the release)
Use a tool, such as SSL Check, to scan your site for non-secure content
Check HTTPS redirects and legacy redirects to ensure they work correctly. Check for long redirect chains using a tool that captures the header responses (I like Redirect Path by Ayima). Check for proper redirect functionality from both www and non-www, with and without trailing slashes, etc.
Use the “Fetch as Google” tool and submit your home page and other key pages to speed up the indexing process. I use the “Crawl this URL and its direct links” option.
Monitor the Index Status report in Search Console. The HTTP property should eventually go to zero, and the HTTPS should increase. Take this a step further by calculating the indexation rates of each XML sitemap and monitor them over time.
Monitor the Crawl Errors report in Search Console and address errors, as appropriate
When most new (HTTPS) URLs are already indexed, remove the legacy sitemap link from Robots.txt
Update incoming links that are within your control to point to HTTPS (eg. links to your site from social media profiles)
7. Turn on Strict Transport Security (HSTS)
Once you’re absolutely sure the entire site is working over HTTPS, use HSTS to improve performance by ensuring the browser “remembers” to send all requests to your site over HTTPS, based on a policy you set. Keep in mind that this means your site will only use HTTPS, so make sure it works! (source).
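For example, the HSTS policy is a single response header. This sketch uses a one-year max-age, a common production value; while testing, start with a much shorter max-age, since browsers cache the policy and it is hard to undo (on Apache this directive requires mod_headers):

```apache
# Raw header:  Strict-Transport-Security: max-age=31536000; includeSubDomains
# Apache: send HSTS on HTTPS responses
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```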
Here is another well-written guide for you.
-
Hey there,
You absolutely do need to. One of the biggest mistakes people make is failing to migrate their disavow file when they switch domains or move to HTTPS.
Sean
Related Questions
-
Huge increase in links to your site when moving to SSL
Hi, my client has two websites. After moving them to SSL, the number of "links to your site" reported in Search Console increased by tens of thousands. What could the reasons be?
Technical SEO | digital1974
Moving site from html to Wordpress site: Should I port all old pages and redirect?
Any help would be appreciated. I am porting an old legacy .html site, which has about 500,000 visitors/month and over 10,000 pages, to a new custom WordPress site with a responsive design (long overdue, of course). The new site has already been written and only needs a few finishing touches, and it includes many database features to generate new pages that did not previously exist. My questions are: 1. Should I bother to port over older pages that are "thin" and have no incoming links, if reworking them would take time away from the need to port quickly? 2. I will be restructuring the legacy URLs to be lean and clean, so 301 redirects will be necessary. I know that there will be some link equity loss, but how long does it usually take for the redirects to "take hold"? 3. I will be moving to HTTPS at the same time to avoid yet another migration later. Many thanks for any advice and opinions as I embark on this massive project.
Technical SEO | gheh2013
Is there a limit to how many URLs you can put in a robots.txt file?
We have a site with far too many URLs caused by our crawlable faceted navigation, and we are trying to purge 90% of them from the indexes. We put noindex tags on the URL combinations that we no longer want indexed, but it is taking Google far too long to find those tags. Meanwhile, we are getting hit with excessive-URL warnings and have been hit by Panda. Would adding the URLs to the robots.txt file help speed up the purge? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs without purging them from the index? The list could be in excess of 100MM URLs.
Technical SEO | kcb8178
Disavow questions
Pretty sure I know the answers to these, but someone asked me to make absolutely sure, so here goes; any opinions welcome: 1. If I disavow a whole domain, does it include all sub-domains on that domain as well? My answer is clearly yes. 2. If I have a network of really bad links pointing to my website that are already nofollow, but are awful websites to be linked from, is it worth putting them in the disavow list anyway, essentially to tell Google there is no association at all? I know the whole point of disavow is essentially to nofollow the link. Opinions much appreciated, thank you guys.
Technical SEO | tdigital
Google disavow tool ( how long does it take ? )
Hello, I disavowed some of my links about three months ago, but I still see them in my link profile using OSE. How long does it take for Google to treat them as nofollow? Thanks
Technical SEO | mezozcorp
Where Is This Being Addended to Our Page File Names?
I have worked over the last several months to eliminate duplicate page titles at our site. Below is one situation that I need your advice on. Google Webmaster Tools is reporting several of our pages with duplicate titles, such as this one: This is a valid page at our Web store: http://www.audiobooksonline.com/159179126X.html This is an invalid page that Google says is a duplicate of the one above: http://www.audiobooksonline.com/159179126X.html?gdftrk=gdfV2138_a_7c177_a_7c432_a_7c9781591791263 Where might the code ?gdftrk=... be coming from? How can we get rid of it?
Technical SEO | lbohen
Converting files from .html to .php or editing .htaccess file
Good day all, I have a bunch of files that are .html and I want to add some PHP to them. It seems my two options are: 1. Convert .html to .php and 301 redirect, or 2. add this line of code to my .htaccess file and keep all files as .html: AddType application/x-httpd-php .html My gut is that the second way is better so as not to alter any SEO rankings, but I wanted to see if anybody had experience with this line in their .htaccess file, as I definitely don't want to mess up my entire site 🙂 Thanks for any help! John
Technical SEO | JohnHerrigel
What are the pros and cons of moving one site onto a subdomain of another site?
Two sites. One has weaker sales. What would be the SEO benefits and problems of moving the weak site from its own domain to a subdomain of the stronger site?
Technical SEO | GriffinHansen