Disavow File and SSL Conversion Question
-
Moz Community,
So we have a website that we are moving to SSL. It has been four years since we submitted our disavow file to Google via Google Webmaster Tools. Since we are moving to SSL, I understand Google looks at this as a new site. So we went through our backlinks and realized that many of the domains we are currently disavowing are no longer active (after four years this is expected).
Therefore, is it OK to create a new disavow file under the new Google Webmaster Tools profile (the SSL version of our site)? Also, is it OK if the new disavow file doesn't include URLs we previously disavowed for the non-HTTPS version?
We also found that some links in the old disavow file shouldn't have been disavowed in the first place. Moreover, we found new links we want to disavow as well.
Thanks
QL
-
Hi. I think mememax gave a very good answer.
The only thing I would submit for consideration is that making too many changes at one time can be hard to track later. When we did the switch to HTTPS, I was super paranoid we would screw something up and lose rankings, so I chose to leave the disavow file exactly the same. It turned out the switch was not as bad as I thought, and we didn't see any noticeable effect on rankings. Later, once I was convinced the HTTPS switch was not a factor, I could modify the disavow file. I also left the old domains from years ago in there, for the reasons mememax points out.
Good Luck!
-
Hi QuickLearner,
You are actually raising a very interesting point. To be extra safe, you should disavow the links pointing to the current site as well as the ones pointing to any other property you own that is 301ing to it.
Remember that the disavow file should include all URLs/domains pointing to your site that you were not able to get removed yourself or by contacting the webmaster. Based on this:
- in the HTTP site's profile, you should disavow all the links pointing to the HTTP site that you marked as spammy
- since you're going to make many changes to the disavow file, it may be a good moment to re-analyze which links you want to include versus remove. Just make sure you're doing it right.
- the HTTPS site's disavow file should contain all the links from the HTTP file plus the ones pointing to the HTTPS site. Again, only the links you want to disavow, obviously.
- even though expired sites could safely be removed (they're no longer linking to your site), in the past I have always kept them, for two reasons:
- sometimes Google's index is not very up to date, especially with tiny, low-quality sites, which these may well be. The site may have disappeared, but if Google doesn't drop it from the index, it still counts as a link to your site
- you never know the real reason that site is returning 4XX/5XX errors, so in case it reappears I would just keep it in the file. It's like an IP blacklist: I don't know whether an IP is still in use, but I keep it there just in case.
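For anyone assembling the file, here is a minimal sketch of the plain-text format Google documents for disavow files; the domains below are made-up placeholders. Each line is either a full URL or a `domain:` entry, and lines starting with `#` are comments:

```text
# Disavow file for the HTTPS property (placeholder domains for illustration)
# Whole domains judged spammy, including expired ones kept just in case
domain:spammy-directory.example
domain:expired-linkfarm.example
# Individual URLs when only specific pages are bad
http://some-blog.example/comment-spam-thread.html
```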
I hope this helps you!
Related Questions
-
Unsolved I have a "click rate juice" question would like to know.
Hello, I have a "click rate juice" question. For example: I created a noindex site for a few-days event, using a random domain like event.example.com, and I'm expecting 5000+ clicks per day. Is it possible to pass some traffic juice from this event domain, example.com, to my other main site, main.com, without exposing the main site's URL? I thought about 301 redirecting example.com to main.com, but that would reveal main.com to the general public if someone visits example.com. I also thought about using a canonical URL, but that would not work because the event site is noindex. Or would it not matter at all? 🤔 Wondering if there is a way to pass traffic juice to another domain? Thanks
Intermediate & Advanced SEO | Blueli0
Negative SEO & How long does it take for Google to disavow
Following on from a previous problem, with 2 of our main pages completely dropping from the index, we have discovered that 150+ spam/porn domains have been pointed at our pages (sometime in the last 3-4 months; we don't have an exact date). Does anyone have experience with how long it may take Google to take notice of a new disavow list? Any estimates would be very helpful in determining our next course of action.
Intermediate & Advanced SEO | Vuly1
Regex in Disavow Files?
Hi, will regex expressions work in a disavow file? If I include website.com/* will that work, or would you recommend just website.com? Thanks.
Intermediate & Advanced SEO | Fubra0
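For what it's worth, Google's documented disavow format does not include regex or wildcard support, so website.com/* would be read literally. A sketch of the two entry types the format does accept, using the question's domain as a placeholder:

```text
# website.com/* would NOT be interpreted as a wildcard.
# To cover an entire site, use the domain: directive:
domain:website.com
# Otherwise, list specific URLs one per line:
http://website.com/spammy-page.html
```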
XML sitemaps questions
Hi All, My developer has asked me some questions that I do not know the answers to. We have both searched for answers but can't find any... so I was hoping that the clever folk on Moz can help! Here are a couple of questions it would be nice to clarify. What is the actual address/file name for a news XML sitemap? Can XML sitemaps be generated on request? Consider the following scenario: a spider requests http://mypage.com/sitemap.xml, which permanently redirects to the extensionless MVC 4 page http://mypage.com/sitemapxml/ . This page generates the XML. Thank you, Amelia
Intermediate & Advanced SEO | CommT0
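On the first question above: as far as I know there is no required file name for a news sitemap; what matters is that the URL you submit returns valid XML using the Google News namespace, so generating it on request behind a redirect should be fine as long as the response body is the XML. A minimal sketch with placeholder URL, publication, and date values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://mypage.com/news/example-article</loc>
    <news:news>
      <news:publication>
        <news:name>Example Publication</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2013-11-01</news:publication_date>
      <news:title>Example Article Title</news:title>
    </news:news>
  </url>
</urlset>
```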
Duplicate Content Question
Hey Everyone, I have a question regarding duplicate content. If your site is penalized for duplicate content, are just the pages with that content affected, or is the whole site affected? Thanks 🙂
Intermediate & Advanced SEO | jhinchcliffe0
Title Attribute for H2 & h3 Tag SEO Question
Hi, We have an eCommerce site, and looking at a competitor, they seem to rank better than us even though everything about our site is better. We hire out the same products (we are affiliated with the same company), so the product list is the same. Both our category pages score an A grade in SEOmoz reports. This is what I can deduce from the competitor's category page I am trying to compete against: on the cement mixer hire category page, the competitor has put the main keyword "cement mixer hire" in all the title attributes on their H2 and H3 tags, as well as it being the main H1 tag. They have 5 product links in H3 tags, all with the same title attribute ("cement mixer hire"). These links go through to product pages. Am I missing a trick here? I would have thought using the title attribute so much for the main keyword would be a bit spammy. I have copied them on point 1, but I have not done my H3s as yet. Just wondering what the SEO gurus thought. Thanks, Sarah
Intermediate & Advanced SEO | SarahCollins0
Crawl questions
My first website crawl indicated many issues. I corrected the issues, requested another crawl, and received the results. After viewing the Excel file I have some questions. 1. There are many pages with missing titles and meta descriptions in the Excel file. An example is http://www.terapvp.com/threads/help-us-decide-on-terapvp-com-logo.25/page-2 That page clearly has a meta description and title. It is a forum thread, and my forum software does a solid job of always providing those tags. Why would my crawl report not show this information? This occurs on numerous pages. 2. I believe all my canonical URLs are properly set. My crawl report has 3k+ records, largely due to there being 10 records for many pages. These extra records are various sort orders and style differences for the same page, i.e. ?direction=asc. My need for a crawl report is to provide actionable data so I can easily make SEO improvements to my site where necessary. These extra records don't provide any benefit. If the crawl report determined there was not a clear canonical URL, then I could understand, but that is not the case. An example is http://www.terapvp.com/forums/news/ If you look at the source you will clearly see the canonical tag. Where is the benefit of including the 10 other records in the crawl report that show this same page in various sort orders? Am I missing anything? 3. My robots.txt appropriately blocks many pages that I do not wish to be crawled. What is the benefit of including these many pages in the crawl report? Perhaps I am over-analyzing this report. I have read many articles on SEO, but now that I have found SEOmoz, I can see I will need to "unlearn what I have learned". Many things, such as setting meta keyword tags, are clearly not helpful. I wish to focus my energy, and I was looking to the crawl report as my starting point. Either I am missing something, or the report design needs improvement.
Intermediate & Advanced SEO | RyanKent0
Question about "launching to G" a new site with 500000 pages
Hey experts, how are you doing? Hope everything is OK! I'm about to launch a new website; the code is almost done. It's a totally fresh new domain. The site will have around 500,000 pages, fully optimized internally of course. I have my tactics to get Google to "travel" over my site and get things indexed. The problem is: do I release it in "giant mode", or release it "thin" and increase the pages over time? What do you recommend? Release it to the big G at once and let them find the 500k pages (would they think this is spam or something like that)? Or release like 1k-2k pages per day? Does anybody know a good approach to improve my chances of success here? Any word will be appreciated. Thanks!
Intermediate & Advanced SEO | azaiats20