Transitioning to HTTPS: do I have to submit another disavow file?
-
Hi Mozzers!
So we're finally ready to move from http to https. Penguin is finally showing us some love thanks to the recent algorithm updates. I just added the https version to Google Search Console, and before 301 redirecting the whole site to a secure environment... do I upload the same disavow file to the https version?
Moreover, is it best to have both the http and https versions in Google Search Console? Or should I add the disavow file to the https version and delete the http version in a month?
And what about Bing?
Help.
-
If Bing will allow you to do it, then yes. I can't add an "other" version for my websites, so Bing probably covers "all".
-
What about Bing? Do I create 4 variations?
-
So, OK: just for disavows.
-
yes.
-
Just for adding the disavow file, because you have your redirects set up properly, right?
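For reference, a blanket http-to-https 301 redirect in Apache usually looks something like the sketch below (assuming mod_rewrite is available; the rule goes in the site's .htaccess or vhost config):

```apache
RewriteEngine On
# Only rewrite requests that arrived over plain http
RewriteCond %{HTTPS} off
# Redirect to the same host and path on https, preserving the URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```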
-
Krzysztof,
Does that mean I have to add sitemaps to all 4 versions as well? Including the Data Highlighter schema? Or is the purpose of these 4 just to have redundant disavow files?
Thank you
-
Hi Shawn
Yes. Make sure you have all four properties in Google Search Console: http://www.domain.com, http://domain.com, https://domain.com, and https://www.domain.com, even if you have redirects in place. If you don't, create them, verify them, and upload the disavow file to all 4 "versions". If you have subdomains, do the same for them too.
Don't delete anything. Add the disavow file to all 4.
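For anyone unfamiliar with the format: a disavow file is a plain-text file with one entry per line, where lines starting with # are comments and a domain: prefix disavows every link from that host. A minimal example (hostnames are placeholders):

```text
# Links disavowed after the unnatural-links notice
# Disavow a single URL
http://spammy-site.example/bad-page.html
# Disavow an entire domain
domain:link-farm.example
```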
Krzysztof
Related Questions
-
Can I redo my submitted sitemap to Google?
We are an electronic hardware manufacturer with a fairly large catalog of products. I dynamically built our site, and we have over 705,000 unique products that we can offer. With our PHP framework I was able to create sitemaps that hold every product's unique URL. After doing all of that, I submitted our data to Google. Then I waited with a cocktail, encouraged that we'd climb up the ranks of Google organically. Well, that didn't happen. Besides several other problems (lack of overall unique content, appearance of duplicate content, no meta descriptions, no unique page titles, poor use of heading tags, and no rel canonical tags), how can I get a "do-over" with Google and my submitted sitemaps? Can they be re-submitted? Can they even be deleted?
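Worth noting: the sitemap protocol caps each sitemap file at 50,000 URLs (and 50 MB uncompressed), so a 705,000-product catalog has to be split across multiple sitemaps referenced from a sitemap index. A rough sketch of the splitting arithmetic in Python (the filenames are made up for illustration):

```python
# Split a large URL list into sitemap-sized chunks (50,000 URLs max per file)
# and derive the filenames a sitemap index would point at.

MAX_URLS_PER_SITEMAP = 50_000

def chunk_urls(urls, size=MAX_URLS_PER_SITEMAP):
    """Yield successive sitemap-sized chunks of a URL list."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def sitemap_filenames(total_urls, size=MAX_URLS_PER_SITEMAP):
    """Names of the sitemap files a sitemap index would reference."""
    count = -(-total_urls // size)  # ceiling division
    return [f"sitemap-products-{n}.xml" for n in range(1, count + 1)]
```

At 705,000 URLs that works out to 15 sitemap files, all listed in one sitemap index submitted to Search Console.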
-
Submitted URL marked 'noindex'
Search Console is giving this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings. We haven't used any meta robots tag for these pages, nor have these pages been disallowed in robots.txt. Previously this issue affected some 20+ pages. I tried to reindex them by submitting the URLs again. Now the count has risen to 100+. There is also a "Submitted URL blocked by robots.txt" issue for pages which are NOT disallowed in robots.txt. Can anyone please suggest a solution here?
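One quick way to rule out a stray tag is to inspect the rendered HTML directly rather than the plugin settings, since a theme or another plugin can inject its own meta robots tag. A small sketch using only the standard library (the HTML in the test is illustrative):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def is_noindexed(html):
    """Return True if any meta robots tag in the HTML contains 'noindex'."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)
```

Also check the response headers: an X-Robots-Tag HTTP header can carry noindex without any tag appearing in the HTML at all.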
-
Longevity of robots.txt files on Google rankings
This may be a difficult question to answer without a ton more information, but I'm curious if there's any general thought that could shed some light on the following scenario I've recently heard about, so I can offer some sound advice: An extremely reputable non-profit site with excellent rankings had gone through a redesign and changeover to WordPress. A blocking robots.txt file was used during development on the dev server. Two months later it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed and the new site (same content, same nav) had gone live with it in place. It was removed and a reindex was forced. How long might it take for the site to reappear and regain its past standing in the SERPs if rankings have been damaged? What would the expected recovery time be?
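A pre-launch check with Python's standard-library robots parser would catch a dev robots.txt that went live; a sketch, using a typical blanket dev-server file as the example:

```python
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt, url, agent="Googlebot"):
    """Return True if the given robots.txt rules block `agent` from `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)

# Typical dev-server file that should never reach production:
dev_rules = "User-agent: *\nDisallow: /\n"
```

Running this against the production robots.txt as part of every deploy makes the "we launched with the dev file" failure mode hard to miss.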
-
Lost rankings after disavowing links
About two months ago, I received an unnatural inbound links message from Google. I then disavowed 58 links (the worst ones), and I can see that right after the date I submitted my disavow file I started losing rankings. What would you suggest? I don't really want to revoke my disavow file, because it contains genuinely bad links. I have an idea to build 58 links from high-quality sites (instead of the 58 I disavowed). Do you think that will work faster (if at all), or do I just need to remove my disavow file?
-
Disavowed links
If we disavow spammy links to our site in Google, will the overall number of incoming links reported in Webmaster Tools reflect the drop (assuming Google indeed disavowed them)?
-
Google Analytics - Tracking a Goal from 1 Domain to Another
Hi there, I've been combing this page for some answers - https://developers.google.com/analytics/devguides/collection/gajs/gaTrackingSite - but can't seem to figure it out. Scenario: Site A and Site B, currently with 2 different GA accounts, one set up on each. Is it possible to track a goal that happens on Site B but comes from Site A, in Site A's reports? Say Site A has a newsletter, directing readers to Site A but promoting Site B; I'd like to track the effectiveness of the newsletter. Site A is running a white label of Site B's service. Thoughts? Cheers
-
Getting traffic for another site
Hi Everyone, Our website URL/brand is very close to another website URL/brand. We are non-competing entities. It appears as though this other company has begun a marketing program which has resulted in our traffic skyrocketing. However, it seems to have also caused our Pages/Visit and Visit Duration to decrease and our Bounce Rate to increase. Can anyone suggest how to deal with this type of scenario? Thanks,
Robert
-
Adding Something to htaccess File
When I did a Google search for site:kisswedding.com (my website), I noticed that Google is indexing all of the https versions of my site. First of all, I don't get it, because I don't have an SSL certificate. Then, last night I did what my host (Bluehost) told me to do and added the below to my htaccess file:

# Below rule because Google is indexing https version of site - https://my.bluehost.com/cgi/help/758
RewriteEngine On
RewriteCond %{HTTP_HOST} ^kisswedding.com$ [OR]
RewriteCond %{HTTP_HOST} ^kisswedding.com$
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.kisswedding.com [R=301,L]

Tonight, when I did a Google search for site:kisswedding.com, all of those https pages were being redirected to my home page - not the actual page they're supposed to redirect to. I went back to Bluehost and they said a 301 redirect shouldn't work because I don't have an SSL certificate. BUT, I figure since it's sort of working, I just need to add something to that htaccess rule to make sure traffic is redirected to the right page. Someone in the Google Webmaster Tools forums told me to do the below, but I don't really get it:

"To 301 redirect from /~kisswedd/ to the proper root folder you can put this in the root folder .htaccess file as well:
Redirect 301 /~kisswedd/ http://www.kisswedding.com/"

Any help/advice would be HUGELY appreciated. I'm a bit at a loss.
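On the rewrite itself: the pasted rule sends everything to the homepage because the target URL never references the captured path. A hedged sketch of the usual fix, assuming mod_rewrite and that the https requests actually reach this vhost (without a certificate the TLS handshake may fail before any rewrite runs):

```apache
RewriteEngine On
# Only act on requests arriving over the SSL port
RewriteCond %{SERVER_PORT} ^443$
# $1 re-appends the captured path, so /about redirects to /about, not the homepage
RewriteRule ^(.*)$ http://www.kisswedding.com/$1 [R=301,L]
```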