Moving to TLS and disavow file
-
I'm considering the move to TLS/SSL. Obviously I will be setting up the HTTPS version in Search Console, but do I need to re-upload the disavow file that was generated before the move?
Look forward to your response.
-
I appreciate your comprehensive article. However, may I kindly point out that my question was to do with the disavow file in Google Search Console, not the implementation of HTTPS.
-
1. Get and Install Certificates
Buy a 2048-bit TLS/SSL SHA-2 secure certificate from a Certificate Authority (CA)
Generate a private key and a certificate signing request (CSR) so that the CA can issue a signed certificate
Send the CA what they need (your public key and certificate signing request)
Install certificates on your servers
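As a sketch of the key and CSR generation above, the OpenSSL commands look something like this (the filenames and subject fields are placeholders, not from the original guide):

```shell
# Generate a 2048-bit RSA private key (filename is a placeholder)
openssl genrsa -out example.com.key 2048

# Create a certificate signing request (CSR) to send to the CA;
# the subject fields here are illustrative only
openssl req -new -key example.com.key -out example.com.csr \
  -subj "/C=US/ST=State/L=City/O=Example Inc/CN=example.com"
```

The CA takes the CSR (which contains your public key) and returns the signed certificate you then install on your servers.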
2. Enable HTTPS on Your Servers
Configure your server for HTTPS. Check out these configuration tips for popular servers.
Test proper functioning using an external testing tool. Here’s a good one.
Set a reminder to update your secure certificate before it expires.
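As a minimal sketch, an HTTPS server block on nginx might look like this (the domain and certificate paths are placeholders; consult your server's own documentation for the full set of TLS settings):

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    # Certificate chain and private key installed in step 1
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    # Restrict to modern protocol versions
    ssl_protocols TLSv1.2;
}
```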
3. Code & Configuration Changes
Update site content to request https resources
Update internal links to point to https pages or consider making internal links relative
Use protocol-relative URIs. Example: (see note below)
Add a self-referencing rel=canonical tag to every page, pointing to your HTTPS URIs
Change all Ad calls to work with HTTPS
Update any internal tools, such as Optimizely or CrazyEgg, to work with HTTPS
Update legacy redirects to eliminate chained redirects (see note below)
Update OpenGraph, Schema, Semantic markup etc. to point to HTTPS
Update social sharing buttons to preserve share counts
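For illustration, the canonical tag and a protocol-relative resource reference from the list above might look like this (example.com is a placeholder):

```html
<!-- Self-referencing canonical pointing to the HTTPS URL of this page -->
<link rel="canonical" href="https://example.com/page/">

<!-- Protocol-relative URI: the browser requests it over the page's own scheme -->
<script src="//example.com/js/app.js"></script>
```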
4. Robots.txt, XML Sitemaps, Search Console and Analytics
Create and verify a new property for the HTTPS site in Google Search Console
Create a new XML sitemap file that points to your HTTPS URLs and upload it to the new property in Search Console
Create a new robots.txt file for the HTTPS site and copy over all existing rules. Include a Sitemap link to the new HTTPS XML sitemap.
Remove all rules from the HTTP robots.txt file, except for the Sitemap link, and leave it in place. This is to encourage bots to crawl and follow all redirects.
Copy any existing disavow file and upload it to the new HTTPS property in Search Console
Note: Don’t use the “Change of Address” feature in Google Search Console. That’s used for migrations to new domains.
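As a sketch, the two robots.txt files could look roughly like this during the transition (the domain and Disallow rules are placeholders). The new HTTPS site carries over the existing rules plus the new sitemap link:

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

While the legacy HTTP site has its rules removed so bots can crawl and follow the redirects, with the legacy sitemap link left in place for now:

```
User-agent: *
Disallow:

Sitemap: http://example.com/sitemap.xml
```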
5. Redirect HTTP to HTTPS
Deploy the redirect code
Redirect HTTP to HTTPS on IIS (7.x and higher)
Redirect HTTP to HTTPS on Apache
Redirect HTTP to HTTPS on Nginx
Include exceptions to any global redirect directives for your existing robots.txt and XML sitemap files
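The redirect plus the robots.txt/sitemap exceptions can be sketched like this (placeholders throughout; adapt to your own filenames). For Apache, in .htaccess:

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
# Exceptions: keep robots.txt and the XML sitemap reachable over HTTP
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

And the same idea on nginx:

```nginx
server {
    listen 80;
    server_name example.com;

    # Exceptions: serve these directly instead of redirecting
    location = /robots.txt  { try_files $uri =404; }
    location = /sitemap.xml { try_files $uri =404; }

    location / {
        return 301 https://$host$request_uri;
    }
}
```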
6. Follow-Up (after the release)
Use a tool, like SSL Check, to scan your site for non-secure content
Check HTTPS redirects and legacy redirects to ensure they work correctly. Check for long redirect chains using a tool that captures the header responses (I like Redirect Path by Ayima). Check for proper redirect functionality from both www and non-www, with and without trailing slashes, etc.
Use the “Fetch as Google” tool and submit your home page and other key pages to speed up the indexing process. I use the “Crawl this URL and its direct links” option.
Monitor the Index Status report in Search Console. The HTTP property should eventually go to zero, and the HTTPS should increase. Take this a step further by calculating the indexation rates of each XML sitemap and monitor them over time.
Monitor the Crawl Errors report in Search Console and address errors, as appropriate
When most new (HTTPS) URLs are already indexed, remove the legacy sitemap link from Robots.txt
Update incoming links that are within your control to point to HTTPS (eg. links to your site from social media profiles)
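One quick way to spot chained redirects in a captured header trace is to count the status lines. This sketch assumes you have saved the output of something like `curl -sIL http://example.com/` (a hypothetical URL):

```shell
# Count HTTP status lines in a captured redirect trace:
# 1 = no redirect, 2 = a single 301 hop, 3+ = a chain worth fixing.
count_hops() {
  grep -c '^HTTP/'
}

# Sample trace of a clean single-hop redirect from HTTP to HTTPS:
sample_trace='HTTP/1.1 301 Moved Permanently
Location: https://example.com/
HTTP/1.1 200 OK'

printf '%s\n' "$sample_trace" | count_hops
```

A clean migration should show exactly one hop from any legacy URL to its HTTPS counterpart.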
7. Turn on Strict Transport Security (HSTS)
Once you’re absolutely sure the entire site is working with HTTPS, use HSTS to improve performance by ensuring the browser “remembers” to send all requests to your site over HTTPS, based on a policy you set. Keep in mind that this means your site will only use HTTPS, so make sure it works!
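The HSTS policy is just a response header; a minimal sketch looks like this (the max-age of one year and the subdomain scope are deliberate choices you must make for your own site). On Apache:

```apache
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```

And the nginx equivalent:

```nginx
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
```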
Here is another well-written guide for you.
-
Hey there,
You absolutely do need to. One of the biggest mistakes people make is not migrating their disavow file when they switch domains or move to HTTPS.
Sean