Using the disavow tool for 404s
-
Hey Community,
Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you name it sites linking to our old URL structure (which used underscores and ended in .jsp).
It seems like the webmasters of these sites aren't answering back or haven't updated their sites in ages, so the links keep returning 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question.
Feel free to ask any questions that may help you understand the issue more.
Thanks for your help,
-Reed
-
Hey Doug, had another question for you. The vast majority (90% of 18,000+ errors) of our 404 errors are coming from .jsp files from our old website.
Of course, it's not ideal to update or redirect these manually, but we could possibly write a script to change them automatically. Would it be beneficial to block these .jsp URLs in our robots.txt file? Something like the sketch below is what I have in mind.
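Just to show what I mean, here's a rough sketch of the robots.txt rule I'm considering (Google supports the * and $ wildcards; the exact pattern is only an example):

```
User-agent: *
# Block crawling of any URL ending in .jsp (legacy pages from the old site)
Disallow: /*.jsp$
```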
-
Thanks Doug, really helpful answer.
I am getting thousands of 404s, but when I dive into them, the majority of the 404 URLs can't be found in any of the "linked from" examples GWT gives me.
I think 301 redirects are the best option, like you said, and/or having a good 404 page.
Thanks,
-Reed
-
The disavow tool isn't going to "fix" these 404s.
404s aren't always a bad thing. The warnings in GWT are just there to make you aware that there's potentially a problem with your site. They don't mean there IS a problem.
Is there content on your site that visitors clicking on these links should be arriving at? If so, you want to implement 301 redirects so that your visitors arrive on the most appropriate page.
If there's nothing relevant on the site any more - a 404 error is perfectly acceptable.
Of course, you want to make sure that your 404 page gives visitors the best chance/incentive to dig into the content on your site. Adding a nice obvious search box and/or links to your most popular content may be a good idea. If you're getting lots of visitors from a particular site, you could even tailor your 404 message depending on the referrer (see the sketch below).
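As a rough illustration of referrer-tailoring, here's a hypothetical Python/Flask sketch (your stack will differ, and the coupon domain is made up):

```python
from flask import Flask, request

app = Flask(__name__)

@app.errorhandler(404)
def not_found(error):
    referrer = request.referrer or ""
    # Tailor the message for visitors arriving from a known legacy linker
    # ("examplecoupons.com" is a made-up domain)
    if "examplecoupons.com" in referrer:
        message = "That offer has moved - browse our current deals below."
    else:
        message = "We couldn't find that page - try the search box below."
    # Always return a real 404 status code so search engines see the error
    return f"<h1>Page not found</h1><p>{message}</p>", 404
```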
The drawback here is that links pointing at 404 error pages won't pass link equity. If there is value in the links, and you're happy that they're going to be seen as natural/authentic as far as Google is concerned, then you can always 301 redirect these.
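If the old URLs map onto the new ones in a predictable way, you don't need to redirect them one by one. Here's a minimal Apache .htaccess sketch, assuming (and these are assumptions about your site) that the new structure simply swaps underscores for hyphens and drops the .jsp extension:

```
RewriteEngine On

# Replace one underscore with a hyphen per pass; [N] re-runs the rule set
# until no underscores are left (internal rewrite, no redirect yet)
RewriteRule ^([^_]*)_(.*)\.jsp$ $1-$2.jsp [N]

# Once the name is clean, drop the .jsp extension and issue the 301
RewriteRule ^([^_]+)\.jsp$ /$1/ [R=301,L]
```

Either way, test the rules on a handful of URLs before rolling them out across 18,000+ errors.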
Where you really should pay attention is where you have internal links on your site that are reporting 404s. These are under your control, and you really don't want to give your visitors a poor experience with lots of broken links on your site.
-
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
A better solution is to use 301 redirects and redirect the 404'd pages to the new pages that work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en