Moving to TLS and disavow file
-
I'm considering the move to TLS/SSL, and will obviously be setting up the HTTPS version in Search Console. Do I need to re-upload the disavow file that was generated before the move?
Look forward to your response.
-
I appreciate your comprehensive article. However, may I kindly point out that my question was about the disavow file in Google Search Console, not the implementation of HTTPS.
-
1. Get and Install Certificates
Buy a 2048-bit TLS/SSL SHA-2 secure certificate from a Certificate Authority (CA)
Generate a private key and a certificate signing request (CSR) so that the CA can issue a signed certificate
Send the CA what they need (your public key and certificate signing request)
Install certificates on your servers
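The key-and-CSR step above can be sketched with OpenSSL. This is a hedged example: the domain, subject fields, and filenames are placeholders, and your CA may have its own requirements.

```shell
# Generate a new 2048-bit RSA private key and a SHA-256 signed CSR in one step.
# Replace the filenames and the -subj fields with your own details.
openssl req -new -newkey rsa:2048 -sha256 -nodes \
  -keyout www.example.com.key \
  -out www.example.com.csr \
  -subj "/C=US/ST=State/L=City/O=Example Inc/CN=www.example.com"

# Sanity-check the CSR before sending it to the CA.
openssl req -in www.example.com.csr -noout -verify
```

You send the CSR (not the private key) to the CA; the key stays on your server.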
2. Enable HTTPS on Your Servers
Configure your server for HTTPS. Check out these configuration tips for popular servers.
Test that HTTPS is functioning properly using an external testing tool. Here’s a good one.
Set a reminder to update your secure certificate before it expires.
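As a sketch of what "configure your server for HTTPS" looks like, here is a minimal Nginx example. The domain, certificate paths, and document root are assumptions; your CA's bundle names will differ.

```nginx
server {
    listen 443 ssl;
    server_name www.example.com;

    # Certificate chain and private key issued by your CA.
    ssl_certificate     /etc/ssl/certs/www.example.com.fullchain.pem;
    ssl_certificate_key /etc/ssl/private/www.example.com.key;

    # Modern protocol versions only.
    ssl_protocols TLSv1.2 TLSv1.3;

    root /var/www/example;
}
```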
3. Code & Configuration Changes
Update site content to request HTTPS resources
Update internal links to point to HTTPS pages, or consider making internal links relative
Use protocol-relative URIs. Example: (see note below)
Add a self-referencing rel canonical tag to every page, pointing to your HTTPS URIs
Change all ad calls to work with HTTPS
Update any internal tools, such as Optimizely or CrazyEgg, to work with HTTPS
Update legacy redirects to eliminate chained redirects (see note below)
Update OpenGraph, Schema, Semantic markup etc. to point to HTTPS
Update social sharing buttons to preserve share counts
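To illustrate two of the items above, here is an HTML sketch of a self-referencing canonical tag and a protocol-relative resource reference. The URLs are placeholders.

```html
<!-- Self-referencing canonical pointing at the HTTPS URL of this page -->
<link rel="canonical" href="https://www.example.com/some-page/">

<!-- Protocol-relative URI: the browser requests it with whatever scheme
     the page itself was loaded over -->
<script src="//cdn.example.com/js/app.js"></script>
```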
4. Robots.txt, XML Sitemaps, Search Console and Analytics
Create and verify a new property for the HTTPS site in Google Search Console
Create a new XML sitemap file that points to your HTTPS URLs and upload it to the new property in Search Console
Create a new robots.txt file for the HTTPS site and copy over all existing rules. Include a Sitemap link to the new HTTPS XML sitemap.
Remove all rules from the HTTP robots.txt file, except for the Sitemap link, and leave it in place. This is to encourage bots to crawl and follow all redirects.
Copy any existing disavow file and upload it to the new HTTPS property in Search Console
Note: Don’t use the “Change of Address” feature in Google Search Console. That’s used for migrations to new domains.
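A sketch of the two robots.txt files described above; the rules and sitemap URLs are placeholders for your own.

```
# robots.txt served at https://www.example.com/robots.txt (new HTTPS property)
User-agent: *
Disallow: /checkout/
Sitemap: https://www.example.com/sitemap.xml

# robots.txt left at http://www.example.com/robots.txt (legacy property)
# All rules removed so bots crawl the old URLs and follow the redirects;
# only the existing Sitemap link remains.
Sitemap: http://www.example.com/sitemap.xml
```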
5. Redirect HTTP to HTTPS
Deploy the redirect code
Redirect HTTP to HTTPS on IIS (7.x and higher)
Redirect HTTP to HTTPS on Apache
Redirect HTTP to HTTPS on Nginx
Include exceptions to any global redirect directives for your existing robots.txt and XML sitemap files
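As one hedged example of the redirect code, here is an Nginx sketch that includes the robots.txt and XML sitemap exceptions mentioned above (the Apache and IIS equivalents differ; the domain and paths are assumptions).

```nginx
server {
    listen 80;
    server_name www.example.com;

    # Exceptions: keep serving the legacy robots.txt and XML sitemap over
    # HTTP so crawlers can still read them during the transition.
    location = /robots.txt  { root /var/www/legacy; }
    location = /sitemap.xml { root /var/www/legacy; }

    # Everything else gets a single permanent redirect to HTTPS.
    location / {
        return 301 https://$host$request_uri;
    }
}
```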
6. Follow-Up (after the release)
Use a tool, like SSL Check, to scan your site for non-secure content
Check HTTPS redirects and legacy redirects to ensure they work correctly. Check for long redirect chains using a tool that captures the header responses (I like Redirect Path by Ayima). Check for proper redirect functionality from both www and non-www, with and without trailing slashes, etc.
Use the “Fetch as Google” tool to submit your home page and other key pages and speed up the indexing process. I use the “Crawl this URL and its direct links” option.
Monitor the Index Status report in Search Console. The HTTP property should eventually go to zero, and the HTTPS should increase. Take this a step further by calculating the indexation rates of each XML sitemap and monitor them over time.
Monitor the Crawl Errors report in Search Console and address errors, as appropriate
When most new (HTTPS) URLs are already indexed, remove the legacy sitemap link from Robots.txt
Update incoming links that are within your control to point to HTTPS (eg. links to your site from social media profiles)
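To spot long redirect chains programmatically, here is a small Python sketch that follows redirects hop by hop using only the standard library. The function name and hop limit are my own choices, not part of any tool mentioned above.

```python
import http.client
from urllib.parse import urlsplit

def redirect_chain(url, max_hops=10):
    """Follow HTTP redirects manually and return every hop, starting with url.

    More than two entries means a chained redirect that could be
    collapsed into a single 301.
    """
    hops = [url]
    for _ in range(max_hops):
        parts = urlsplit(hops[-1])
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        try:
            conn.request("HEAD", parts.path or "/")
            resp = conn.getresponse()
            status, location = resp.status, resp.getheader("Location")
        finally:
            conn.close()
        if status in (301, 302, 307, 308) and location:
            # Resolve relative Location headers against the current hop.
            if location.startswith("/"):
                location = f"{parts.scheme}://{parts.netloc}{location}"
            hops.append(location)
        else:
            break
    return hops
```

Ideally `redirect_chain("http://www.example.com/old-page")` returns just two entries, the HTTP URL and its HTTPS destination; anything longer is a chain worth fixing.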
7. Turn on Strict Transport Security (HSTS)
Once you’re absolutely sure the entire site is working with HTTPS, use HSTS to improve performance by ensuring the browser “remembers” to send all requests to your site over HTTPS, based on a policy you set. Keep in mind that this means your site will only use HTTPS, so make sure it works! (source).
Here is another well-written guide for you.
-
Hey there,
You absolutely do need to. One of the biggest mistakes people make is not migrating their disavow file when they switch domains or protocols.
Sean