Using the disavow tool for 404s
-
Hey Community,
Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog, coupon, and you-name-it sites linking to our old URL structure (which used underscores and ended in .jsp).
It seems like the webmasters of these sites aren't responding or haven't updated their sites in ages, so those old links now return 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on the disavow tool, but it didn't seem to answer this question.
Feel free to ask any questions that may help you understand the issue more.
Thanks for your help,
-Reed -
Hey Doug, had another question for you. The vast majority (90% of our 18,000+ errors) of our 404 errors are coming from .jsp files on our old website.
Of course, it's not practical to update or redirect these manually, but we could possibly write a script to handle them automatically. Would it be beneficial to add a rule for these .jsp URLs to our robots.txt file?
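For reference, a minimal sketch of what blocking the legacy .jsp URLs in robots.txt might look like (the wildcard syntax below is supported by Google and Bing, though not by every crawler):
```
# Hypothetical robots.txt sketch: stop compliant crawlers from fetching
# any URL that ends in .jsp. This only blocks crawling; it does not remove
# the 404 reports or redirect visitors.
User-agent: *
Disallow: /*.jsp$
```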
-
Thanks Doug, really helpful answer.
I am getting thousands of 404s, but when I dive into them, the majority of the 404 URLs can't be found on any of the "linked from" pages GWT gives me.
I think 301 redirects and/or a good 404 page are the best option, like you said.
Thanks,
-Reed -
The disavow tool isn't going to "fix" these 404s.
404s aren't always a bad thing. The warnings in GWT are just there to make you aware that there's potentially a problem with your site. They don't mean there IS a problem.
Is there content on your site that visitors clicking on these links should be arriving at? If so, you want to implement 301 redirects so that your visitors arrive on the most appropriate page.
If there's nothing relevant on the site any more, a 404 error is perfectly acceptable.
Of course, you want to make sure that your 404 page gives visitors the best chance/incentive to dig into the content on your site. Adding a nice obvious search box and/or links to your most popular content may be a good idea. If you're getting lots of visitors from a particular site, you can maybe tailor your 404 message depending on the referrer.
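As a rough illustration of the referrer idea (a minimal sketch for a custom 404 page; the referring site name is a made-up example):
```html
<!-- Hypothetical snippet for a custom 404 page: swaps in a tailored message
     when the visitor arrived via a link from a particular external site. -->
<p id="not-found-message">Sorry, that page has moved or no longer exists. Try the search box below.</p>
<script>
  // "oldcouponsite.com" is a placeholder referrer used purely for illustration.
  if (document.referrer.indexOf("oldcouponsite.com") !== -1) {
    document.getElementById("not-found-message").textContent =
      "It looks like you followed an outdated coupon link - here are our current offers instead.";
  }
</script>
```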
The drawback here is that links pointing at 404 error pages won't pass link equity. If there is value in the links, and you're happy that they're going to be seen as natural/authentic as far as Google is concerned, then you can always 301 redirect these.
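For example, a one-off rule like this in an Apache .htaccess file would 301 a single legacy URL that still attracts worthwhile links (both paths below are hypothetical):
```
# Hypothetical .htaccess (mod_alias) sketch: permanently redirect one old
# underscore/.jsp URL that still earns good links to its closest new equivalent.
Redirect 301 /old_category/blue_widgets.jsp /widgets/blue-widgets/
```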
Where you really should pay attention is where you have internal links on your site that are reporting 404s. These are under your control, and you really don't want to give your visitors a poor experience with lots of broken links on your site.
-
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
A better solution is to use 301 redirects and redirect the 404'd pages to the new pages that work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en
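If the old URLs follow a predictable pattern (underscores and a .jsp extension, as described above), the redirects can also be handled with mod_rewrite rules in .htaccess rather than one page at a time; a script could even generate one rule per old URL. The example paths below are assumptions, not the poster's actual structure:
```
# Hypothetical .htaccess (mod_rewrite) sketch: 301 a couple of old
# underscore/.jsp URLs to their hyphenated, extensionless replacements.
RewriteEngine On
RewriteRule ^shop/mens_shoes\.jsp$ /shop/mens-shoes/ [R=301,L]
RewriteRule ^shop/womens_shoes\.jsp$ /shop/womens-shoes/ [R=301,L]
```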
Related Questions
-
Wrong redirect used
Hi Folks,
I have a query & am looking for some opinions. Our site migrated to https://.
Somewhere along the line between the developer & the hosting provider, a 302 redirect was implemented instead of the recommended 301 (the 301 rule was not being honoured in the htaccess file).
A week passed and I noticed some of our key phrases disappear from the SERPs 😞 When I investigated, I noticed that the incorrect redirect had been implemented. The correct 301 redirect has now been implemented & is functioning correctly. I have created a new https property in Webmaster Tools, submitted the sitemap, provided a link in the robots.txt file to the https sitemap, and set the canonical tags to the correct https URLs. My gut feeling is that Google will take some time to recognise the change & some time to restore the search results we lost. Has anyone experienced this before, or have any further thoughts on how to rectify it asap?
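For anyone checking their own setup, a typical 301 rule forcing https in .htaccess looks something like this (a generic sketch, not the poster's actual configuration):
```
# Generic .htaccess sketch: permanently (301) redirect every http request to https.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```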
How and When Should I Use Canonical URL Tags?
I'm pretty new to the SEO universe, but I have not used any canonical tags, just because there is no definitive source explaining exactly when and why you should use them. Am I the only one who feels this way?
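For reference, a canonical tag is a single line placed in the <head> of a page, pointing at the URL you want search engines to treat as the definitive version (the URL below is a made-up example):
```html
<!-- Hypothetical example: declares the preferred URL for this page's content. -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widgets/" />
```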
Permanently using 301 for internal link
Hello Folks, I tried going through the 301 answers but could not find any question similar to mine. The issue is that we have a listing page with product URLs like this: /used-peugeot/used-toyota-corolla. As you can see, this URL is not really ideal, and I want to redirect it to /used-toyota/corolla using mod_rewrite, with a 301 redirect. My concern is that the URL on the listing page won't change to /used-toyota/corolla, so the 301 will be 'permanently' in place, and I was wondering if this will lose some of the link juice of the 301ed URL. With a 301 being a 'permanent' redirect, one would assume it shouldn't be an issue, but I just wanted to be sure that I am correct in assuming so. Thank you for your time.
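As a hedged sketch of the kind of mod_rewrite rule being described (the exact pattern would depend on how make and model appear in the old URLs):
```
# Hypothetical .htaccess sketch: 301 the old combined listing URL to the cleaner structure.
RewriteEngine On
RewriteRule ^used-peugeot/used-toyota-corolla$ /used-toyota/corolla [R=301,L]
# A more general pattern could capture the make and model, e.g.:
# RewriteRule ^used-[^/]+/used-([^-]+)-(.+)$ /used-$1/$2 [R=301,L]
```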
Bing Webmaster Tools failed to reach sitemap - any suggestions?
My sitemap was submitted to Bing Webmaster Tools well over a year ago and I have never had any problems. Starting last week it shows as failed; for some reason Bing can't reach it. I have resubmitted several times and it fails every time. I can visit the URL with no problems, and Google Webmaster Tools does not report any problems. We have made no changes in over a year to how the sitemap is generated and submitted. Anyone have any ideas?
My indexed page count is shrinking in Webmaster Tools. Is this normal?
I noticed that our total number of indexed pages recently dropped by a substantial amount (see chart: http://imgur.com/4GWzkph). Is this normal? Also, 3 weeks after the drop started, we got a message about an increased number of crawl errors and found that a site update was causing 300+ new 404s. Could this be related?
WordPress redesign: using posts as pages?
Starting a redesign for an attorney who is currently using WordPress with an old framework that is no longer supported, so I'm going to install a new WP and start from scratch. The site consists of about 30 static pages (practice areas, attorney profiles, etc.) and they write about 5 blog posts per month. I've always differentiated between posts and pages for WP sites I've done in the past, but this time around I thought it might be cleaner (fewer files, and easier for their webmaster to make routine edits) if I just brought over the static pages as posts. However, the recent webinar on the Yoast SEO plugin mentioned using the month/day in the permalink structure for posts to avoid duplicate content issues. That would go against how I was thinking of setting it up, because I would have just generated the URL off the page title and made a separate category for "pages". Just wondering if anyone has used posts as pages before. While this seems like it would make things easier for the webmaster, I'm not sure it maximizes the potential for SEO. Thanks.
Why is Google Webmaster Tools reporting a massive increase in 404s?
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. With the site transition, we broke some of the old URLs, but it didn't seem to be too much of a concern. We blocked the ones we knew should be blocked in robots.txt, 301 redirected as much duplicate data and used canonical tags as far as we could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there. For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with updated URLs only. Then WMT started to report a massive increase in 404s, somewhere around 25,000 404s per day (making it impossible for me to keep up). The sitemap.xml has new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of the 404s (in WMT) don't exist any longer; they are all coming from the old site. I attached a screenshot showing the drastic increase in 404s (wmt-massive-404s.png). What could possibly cause this problem?