Disavow cache
-
Hey everyone,
I'm currently helping a website that has been penalised, and we've been going through a heavy link removal process as it has a pretty bad link profile.
Our first disavow request has been rejected, and I was wondering...
When submitting a reconsideration request, does Google only know that a link has been removed once the linking page has been re-cached? If so, should I wait a while before submitting the reconsideration request, as it might take a while for the cache to be updated?
Thanks
-
We're seeing some mixed messages from Google in this regard. They have suggested recently that links do have to be re-crawled to be disavowed. For an algorithmic penalty, that should be enough. For a manual penalty, even if the links are disavowed and re-crawled, Google may still require a reconsideration request. So, it can be a bit tricky.
For reconsideration, someone is looking at the request, so we assume they take the disavow into account and try to adjust for that, but it may depend on the person and the situation. It's important that you thoroughly document what you've done in the reconsideration request. Don't just disavow a ton of stuff and then say "Hey, can we get back in?" They won't take that seriously, in most cases.
-
Sorry, I probably explained it wrong.
I meant: during the process of proving your link removal efforts to Google, do the linking websites need to be re-cached for Google to see that the links have been removed?
-
Hey there
The way the disavow tool works is that, once the file has been processed (which happens pretty quickly), Google simply ignores those links - effectively treating them as if they carried a nofollow attribute.
That means the Webmaster Tools link report will still show a link if it exists, even after it has been disavowed.
The disavow process simply depends on the uploaded file being in the right format and being processed. Once that's done, it takes effect straight away; it does not require a recrawl or a cache update.
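For reference, the "right format" is just a plain-text (.txt, UTF-8) file with one entry per line: either a full URL, or a whole domain prefixed with domain:, with # marking comment lines. A minimal sketch (the domains below are placeholders, not real sites):

```
# Example disavow file - spammy-site.example and bad-directory.example are placeholders
# Disavow a single page:
http://spammy-site.example/paid-links/page1.html
# Disavow an entire domain:
domain:spammy-site.example
domain:bad-directory.example
```

For clearly spammy sites, disavowing at the domain level is usually the safer choice, since it covers every page on that site that links to you.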
It's worth pointing out that the file is not cumulative - each upload replaces the previous one - so if you upload a new file and a link you once had in the file is no longer there, that link will be counted again.
Once your disavow file has been uploaded, it's probably a good idea to wait 2-3 days to make sure it has been fully processed. After that, your reconsideration request will take the file into account - particularly as a reconsideration request is a manual review. If the reviewer can see the file has been processed, they will take into account that the links have been disavowed.
Hope this helps