Using disavow tool for 404s
-
Hey Community,
Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you name it sites linking to our old URL structure (which used underscores and ended in .jsp).
It seems like the webmasters of these sites aren't responding to my outreach, or haven't updated their sites in ages, so those old links now return 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on the disavow tool, but it didn't seem to answer this question.
Feel free to ask any questions that may help you understand the issue more.
Thanks for your help,
-Reed -
Hey Doug, had another question for you. The big majority (90% of our 18,000+ errors) of our 404 errors are coming from .jsp files on our old website.
Of course, it's not ideal to update or redirect these manually, but we could possibly write a script to change them automatically. Would it be beneficial to add this .jsp extension to our robots.txt file?
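For context, the kind of script I have in mind would look something like this (a minimal sketch; the old/new URL patterns here are made-up examples, not our actual structure):

```python
def map_old_url(old_path):
    """Map a legacy underscore/.jsp path to a hypothetical new structure,
    e.g. /product_pages/blue_widget.jsp -> /product-pages/blue-widget."""
    if not old_path.endswith(".jsp"):
        return None  # not one of the legacy URLs
    new_path = old_path[:-len(".jsp")]      # strip the .jsp extension
    new_path = new_path.replace("_", "-")   # underscores -> hyphens
    return new_path

# Each (old, new) pair could then be dumped into a redirect map
# that the web server serves as 301s.
print(map_old_url("/product_pages/blue_widget.jsp"))
```

The idea would be to run this over the full list of 404 URLs exported from GWT and generate the redirect rules in bulk rather than by hand.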
-
Thanks Doug, really helpful answer.
I am getting thousands of 404's but when I dive into them the majority of the 404 URLs can't be found in any of the "linked from" examples GWT gives me.
I think 301 redirects are the best option like you said and/or having a good 404 page.
Thanks,
-Reed -
The disavow tool isn't going to "fix" these 404s.
404s aren't always a bad thing. The warnings in GWT are just there to make you aware that there's potentially a problem with your site. It doesn't mean there IS a problem.
Is there content on your site that visitors clicking on these links should be arriving at? If so, you'll want to implement 301 redirects so that your visitors arrive on the most appropriate page.
If there's nothing relevant on the site any more - a 404 error is perfectly acceptable.
Of course, you want to make sure that your 404 page gives visitors the best chance/incentive to dig into the content on your site. Adding a nice obvious search box and/or links to your most popular content may be a good idea. If you're getting lots of visitors from a particular site, you can maybe tailor your 404 message depending on the referrer.
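As a rough illustration of tailoring the 404 message by referrer (the domain and messages here are hypothetical; your framework's request object would supply the actual Referer header):

```python
from urllib.parse import urlparse

# Hypothetical referrer-specific messages; anything else gets the generic one.
REFERRER_MESSAGES = {
    "oldcouponsite.example": "That coupon page has moved - try searching our current offers below.",
}

def message_for_404(referrer_url):
    """Pick a 404 message based on the referring site's hostname."""
    host = urlparse(referrer_url or "").hostname
    return REFERRER_MESSAGES.get(host, "Sorry, that page is gone - try the search box below.")

print(message_for_404("http://oldcouponsite.example/deals"))
```

You'd wire this into whatever renders your 404 template, so visitors from a known old link source get a more helpful nudge toward the content they were after.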
The drawback here is that links pointing at 404 error pages won't pass link equity. If there is value in the links, and you're happy that they're going to be seen as natural/authentic as far as Google is concerned, then you can always 301 redirect them.
Where you really should pay attention is where you have internal links on your site that are reporting 404s. These are under your control, and you really don't want to give your visitors a poor experience with lots of broken links on your site.
-
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
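For the record, if you ever do need the disavow tool for genuinely spammy links, the file you upload in Webmaster Tools is just plain text: one full URL or `domain:` entry per line, with `#` lines as comments. Something like this (the .example domains are placeholders):

```text
# Disavow every link from this domain
domain:spammy-directory.example.com
# Disavow an individual page
http://spam-blog.example.com/some-page/
```

But again, that file tells Google to ignore those links for ranking purposes; it won't make the 404 reports go away.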
A better solution is to use 301 redirects and redirect the 404'd pages to the new pages that work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en