Transitioning to HTTPS: do I have to submit another disavow file?
-
Hi Mozzers!
So we're finally ready to move from http to https. Penguin is finally showing us some love thanks to the recent algorithm updates. I just added the https property to Google Search Console, and before 301 redirecting the whole site to the secure environment: do I upload the same disavow file to the https version?
Moreover, is it best to have both the http and https versions in Google Search Console? Or should I add the disavow file to the https version and delete the http version in a month?
And what about Bing?
Help.
-
If Bing allows you to do it, then yes. I can't add an "other" version for my websites, so Bing probably covers all of them.
-
What about Bing? Do I create 4 variations?
-
So, OK: just for the disavow files.
-
Yes.
-
Just for adding the disavow file. You have your redirects set up correctly, right?
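For reference, a site-wide http-to-https 301 usually looks something like this (a minimal sketch for an Apache .htaccess file, assuming mod_rewrite is enabled; the exact rules depend on your server):

    # Minimal sketch: send every http request to its https equivalent with a 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]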
-
Krzysztof,
Does that mean I have to add sitemaps to all 4 versions as well, including the Data Highlighter schema? Or is the purpose of these 4 just to have redundant disavow files?
Thank you
-
Hi Shawn
Yes. Make sure you have all properties in your Google Search Console: http://www.domain.com, http://domain.com, https://domain.com, and https://www.domain.com, even if you have redirects. If you don't, create them, verify them, and upload the disavow file to all 4 "versions". If you have subdomains, do the same for them too.
Don't delete. Add to all 4.
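The file itself is just plain text, so the exact same disavow.txt can be uploaded to every property. A minimal sketch (the domains below are hypothetical placeholders):

    # Lines starting with # are comments and are ignored
    # Disavow a single spammy page:
    http://spam.example.com/bad-link-page.html
    # Disavow all links, current and future, from an entire domain:
    domain:spammy-directory.example.net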
Krzysztof
-
Related Questions
-
Submitted URL marked 'noindex'
Search Console is reporting this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings: we haven't used a meta robots tag on these pages, nor have these pages been disallowed in robots.txt. Previously this issue affected some 20+ pages; I tried to reindex them by submitting the URLs again, and now the count has risen to 100+. There is also a "Submitted URL blocked by robots.txt" issue for pages which are NOT disallowed in robots.txt. Can anyone please suggest a solution here?
Reporting & Analytics | Reema24
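One way to see where a stray noindex is actually coming from is to check both the HTTP response header and the HTML, since a plugin or the server can set either one. A rough Python sketch (requests and beautifulsoup4 assumed; the URL is a placeholder):

    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/some-page/"  # hypothetical URL
    resp = requests.get(url, timeout=10)

    # A noindex can be sent by the server as an HTTP header...
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))

    # ...or set in the page itself as a meta robots tag.
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup.find_all("meta", attrs={"name": "robots"}):
        print("meta robots:", tag.get("content"))
-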
Redirecting one domain to another using utm tags
I have two live websites, both live for over 10 years, so we have plenty of backlinks to each: domain1.com and domain2.com. Domain 1 and all of its URLs are being merged into domain2.com, so 301 redirects will be set up for every page of the site (domain1.com/abc-1234/ to domain2.com/abc-1234/).

In Google Analytics for domain2.com we want to be able to see which visits we have received as a result of a redirect from domain1.com. This is possible for visits that come in via organic, referrals, social, etc., as those arrive with domain1.com as the referral. However, with direct traffic, i.e. someone typing domain1.com into their address bar, the visit is assigned as direct, and we cannot tell in GA whether those users typed domain2.com or domain1.com to reach our page.

There are some suggestions in forums of adding utm_source tracking to all redirects (and adding canonicals on those URLs pointing to the non-utm_source version), but my concern is that Google will have to go through one extra step to reach the page on the redirected domain.

Without the utm_source code, Google will follow this route: domain1.com/123/ to domain2.com/123/.

With the utm_source code, Google will follow this route: domain1.com/123/ to domain2.com/123/?utm_source... then sees the canonical, so moves to domain2.com/123/.

So essentially I am giving Google one extra step to follow before it gets to the equivalent page on the new site. Is this an issue, and/or are there any other ways to track this redirection without adding extra parameters to the URL?
Reporting & Analytics | Sayers
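For illustration, the redirect-plus-parameter setup being suggested might look something like this (a hypothetical Apache sketch for domain1.com; the QSA flag preserves any existing query string):

    # Hypothetical sketch: 301 every domain1.com URL to domain2.com, tagging the source
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
    RewriteRule ^(.*)$ https://domain2.com/$1?utm_source=domain1&utm_medium=redirect [QSA,R=301,L]

with each destination page then declaring the clean URL as canonical, e.g. <link rel="canonical" href="https://domain2.com/123/" />.
-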
Internal Referral Traffic Issue due to https/http?
Hi Mozzers, we're running a secured https account section on our website, including a messaging center where lots of our own non-secured URLs are shared among the users. Is it possible that a user clicking on one of the shared URLs within the https section triggers another session that's counted as direct traffic? Thanks for your help! Greets,
Manson
Reporting & Analytics | LocalIM
-
Stripping referrer on website with a mix of both http and https
I know going from https to http (usually) strips referrers, but I was wondering if the referrer is also stripped when your website is a mix of both http and https. Say someone browses your site (on http), adds a product, and then goes to your cart (https), then decides to go back to another page on your website which is http. Will this strip the referrer? Any help on this would be great, thanks!
Reporting & Analytics | Fitto
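For what it's worth, browsers drop the Referer header on https-to-http navigations by default, but a page-level referrer policy can override that. A sketch (the value shown sends just the origin, not the full URL, on cross-origin requests, including downgraded ones):

    <!-- Sketch: keep sending the origin even on an https -> http hop -->
    <meta name="referrer" content="origin-when-cross-origin">
-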
Google Analytics Set-Up for site with both http & https pages
We have a client that migrated to https last September. The site uses canonicals pointing to the https version. The client's IT team is reluctant to put 301 redirects from the non-secure to the secure version, and we are not sure why they object. We ran a Screaming Frog report and it is showing both URLs for the same page (http and https). The non-secure version has a canonical pointing to the secure version. For every secure page there is a non-secure version in Screaming Frog, so Google must be ignoring the canonical and still indexing the page; however, when we run a site: search we see that most URLs are the secure version. At the time of migration we did not change the Google Analytics setup option to use "https" instead of "http", BUT GA appears to be recording data correctly. Yesterday we set up a new profile and selected "https", but our question is: does the Google Analytics http/https setting make a difference, and if so, what difference?
Reporting & Analytics | RosemaryB
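Worth noting: Google treats rel=canonical as a hint rather than a directive, which is why a crawler still fetches both versions. On the http pages the tag would look something like this (URL hypothetical):

    <!-- On the http:// version of a page, suggesting the secure version as canonical -->
    <link rel="canonical" href="https://www.example.com/page/" />

As far as the GA setting goes, the property's default URL mainly affects how links open from within reports; the data itself is collected by the tracking snippet on the page.
-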
Robots.txt file issue.
Hi, it's my third thread here, and I have created many like it on many webmaster communities. I know many pros are here, and I badly need help.

My robots.txt file blocked 2k important URLs of my blogging site http://Muslim-academy.com/, especially in my blog area, which was bringing a good number of visitors daily. My organic traffic declined from 1k daily to 350.

I have removed the robots.txt file, resubmitted the existing sitemap, and used all the Fetch-to-index options and the 50-URL submission option in Bing Webmaster Tools. What can I do now to get these blocked URLs back in the Google index?

1. Create a NEW sitemap and submit it again in Google Webmaster Tools and Bing Webmaster Tools?
2. Bookmarking, link building, or sharing the URLs? I did a lot of bookmarking for the blocked URLs.

I fetched the list of blocked URLs using Bing Webmaster Tools.
Reporting & Analytics | csfarnsworth
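For anyone hitting the same problem, the culprit in a robots.txt file is usually a single Disallow rule. A hypothetical before/after sketch:

    # Before (hypothetical): blocks the entire blog area for all crawlers
    User-agent: *
    Disallow: /blog/

    # After: an empty Disallow value permits crawling everything again
    User-agent: *
    Disallow:
-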
Disavowed links
If we disavow spammy links to our site in Google, will the overall number of incoming links reported in Webmaster Tools reflect the drop (assuming Google indeed honors the disavow)?
Reporting & Analytics | joebella
-
Google Links Disavow - Does that preclude new links from a domain?
If you use the Google disavow links tool and you disavow links from a "domain", does that mean that any future or new links from that domain will also be blocked?

Answering "yes" is good if the domain is spammy, but bad if the domain was submitted in error. Answering "no" is good if the domain was submitted in error, but bad if the site is spammy. Does anyone have an answer to this, please? Also, is there a disavow "undo" request process available?

cheers, Mike
Reporting & Analytics | shags38