Using the disavow tool for 404s
-
Hey Community,
Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you name it sites linking to our old URL structure (which used underscores and ended in .jsp).
It seems like the webmasters of these sites aren't answering back, or haven't updated their sites in ages, so their links keep returning 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on the disavow tool, but it didn't seem to answer this question.
Feel free to ask any questions that may help you understand the issue more.
Thanks for your help,
-Reed -
Hey Doug, had another question for you. The vast majority of our 404 errors (90% of 18,000+) are coming from .jsp files from our old website.
Of course, it's not practical to manually update or redirect these one by one, but we could probably write a script to handle them automatically. Would it be beneficial to block these .jsp URLs in our robots.txt file?
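To make that concrete, I'm picturing something like the rule below, assuming Google's documented support for the * and $ wildcards in robots.txt (the path pattern is just a placeholder for our structure):

```
# Hypothetical robots.txt rule to stop crawling of any legacy .jsp URL.
# Note: this only blocks crawling - it won't clear the 404 reports
# already in GWT, and blocked URLs can't pass any link equity.
User-agent: *
Disallow: /*.jsp$
```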
-
Thanks Doug, really helpful answer.
I am getting thousands of 404s, but when I dive into them, the majority of the 404 URLs can't be found in any of the "linked from" examples GWT gives me.
I think 301 redirects are the best option, like you said, along with a good 404 page.
Thanks,
-Reed -
The disavow tool isn't going to "fix" these 404s.
404s aren't always a bad thing. The warnings in GWT are just there to make you aware that there's potentially a problem with your site. It doesn't mean there IS a problem.
Is there content on your site that visitors clicking on these links should be arriving at? If so, you want to implement 301 redirects so that those visitors arrive on the most appropriate page.
If there's nothing relevant on the site any more, a 404 error is perfectly acceptable.
Of course, you want to make sure that your 404 page gives visitors the best chance/incentive to dig into the content on your site. Adding a nice obvious search box and/or links to your most popular content may be a good idea. If you're getting lots of visitors from a particular site, you can even tailor your 404 message depending on the referrer.
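As a rough sketch of that referrer idea - this assumes a Python/Flask stack, which is almost certainly not what you're running, so treat it as pseudocode for whatever framework you're on (the coupon domain is made up):

```python
# A minimal sketch of a referrer-aware 404 page, assuming Flask;
# the same idea ports to any framework that exposes the Referer header.
from flask import Flask, request

app = Flask(__name__)

@app.errorhandler(404)
def custom_404(error):
    referrer = request.headers.get("Referer", "")
    # Hypothetical referring domain - tailor the copy for visitors
    # arriving via known legacy coupon links.
    if "somecouponsite.example" in referrer:
        headline = "That coupon has expired, but here are our current deals."
    else:
        headline = "Sorry, we couldn't find that page."
    # Keep a search box front and centre so visitors can dig back in.
    html = f"<h1>{headline}</h1><form action='/search'><input name='q'></form>"
    return html, 404
```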
The drawback here is that links pointing at 404 error pages won't pass link equity. If there is value in the links, and you're happy that they're going to be seen as natural/authentic as far as Google is concerned, then you can always 301 redirect these.
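If you do go the 301 route and you happen to be on Apache, something along these lines could handle the old underscore/.jsp pattern in bulk - a sketch only, since the right-hand side depends entirely on what your new URL structure actually looks like:

```apache
# Sketch of bulk mod_rewrite rules (e.g. in .htaccess), assuming the new
# URLs simply swap underscores for hyphens and drop the .jsp extension.
RewriteEngine On

# Replace one underscore at a time; [N] restarts the rule set until
# no underscores remain in the .jsp path.
RewriteRule ^([^_]+)_(.+)\.jsp$ $1-$2.jsp [N]

# Once the underscores are gone, issue a single 301 to the new URL.
RewriteRule ^([^.]+)\.jsp$ /$1/ [R=301,L]
```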
Where you really should pay attention is where you have internal links on your site that are reporting 404s. These are under your control, and you really don't want to give your visitors a poor experience with lots of broken links on your site.
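One quick way to catch those internal 404s, beyond the GWT report, is a small crawl script. Here's a rough single-page sketch, assuming Python with the requests and beautifulsoup4 packages installed (example.com is a placeholder for your own site root):

```python
# Rough sketch: fetch one page and flag internal links that return 404.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START = "https://www.example.com/"  # placeholder - use your own site root

html = requests.get(START, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
links = {urljoin(START, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if urlparse(link).netloc != urlparse(START).netloc:
        continue  # only check internal links
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status == 404:
        print(f"Broken internal link: {link}")
```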
-
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
A better solution is to use 301 redirects and redirect the 404'd pages to the new pages that work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en
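And for completeness, if you ever do need the disavow tool for genuine spam cleanup, the file you upload is just plain text, one URL or domain per line (the domains below are placeholders):

```
# Lines starting with # are comments.
# Disavow a single spammy page:
http://spam.example.com/paid-links.html
# Disavow an entire domain:
domain:spammydirectory.example
```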