Using the disavow tool for 404s
-
Hey Community,
Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you-name-it sites linking to our old URL structure (which used underscores and ended in .jsp).
It seems like the webmasters of these sites aren't answering back or haven't updated their sites in ages, so those links now return 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question.
Feel free to ask any questions that may help you understand the issue more.
Thanks for your help,
-Reed -
Hey Doug, had another question for you. The vast majority (90% of 18,000+ errors) of our 404 errors are coming from .jsp files from our old website.
Of course, it's not practical to update or redirect these manually, but we could possibly write a script or bulk rewrite rule to handle them automatically. Would it be beneficial to add these .jsp URLs to our robots.txt file?
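For illustration, the kind of bulk rewrite I'm imagining, rather than a robots.txt block, would be something like this, assuming the site runs on Apache and the new URLs simply drop the .jsp extension (the paths here are made-up examples, and our real mapping may be messier):

# Hypothetical sketch only: bulk 301 each old .jsp URL to the same path
# without the extension, e.g. /widgets/blue_widget.jsp -> /widgets/blue_widget.
# A fuller mapping (underscores to hyphens, renamed sections) would need a
# generated list of one-to-one Redirect 301 lines rather than a single pattern.
RewriteEngine On
RewriteRule ^(.+)\.jsp$ /$1 [R=301,L]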
-
Thanks Doug, really helpful answer.
I am getting thousands of 404s, but when I dive into them, the majority of the 404 URLs can't be found in any of the "linked from" examples GWT gives me.
I think 301 redirects and/or a good 404 page are the best option, like you said.
Thanks,
-Reed -
The disavow tool isn't going to "fix" these 404s.
404s aren't always a bad thing. The warnings in GWT are just there to make you aware that there's potentially a problem with your site. It doesn't mean there IS a problem.
Is there content on your site that visitors clicking on these links should be arriving at? If so, you want to implement 301 redirects so that your visitors arrive on the most appropriate page.
If there's nothing relevant on the site any more, a 404 error is perfectly acceptable.
Of course, you want to make sure that your 404 page gives visitors the best chance/incentive to dig into the content on your site. Adding a nice, obvious search box and/or links to your most popular content may be a good idea. If you're getting lots of visitors from a particular site, you can maybe tailor your 404 message depending on the referrer.
The drawback here is that links pointing at 404 error pages won't pass link equity. If there is value in the links, and you're happy that they're going to be seen as natural/authentic as far as Google is concerned, then you can always 301 redirect these.
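As a rough sketch, assuming an Apache server and using made-up paths, a few targeted redirects for old URLs that still attract worthwhile links could be as simple as one line each in an .htaccess file:

# Hypothetical examples only: map each old URL that still earns good links
# to the most relevant current page on the site.
Redirect 301 /old_coupon_page.jsp https://www.example.com/coupons/
Redirect 301 /blog/widget_roundup.jsp https://www.example.com/blog/widget-roundup/

Old URLs without a genuinely relevant destination are better left returning a 404 than all pointed at the homepage.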
Where you really should pay attention is where you have internal links on your site that are reporting 404s. These are under your control, and you really don't want to give your visitors a poor experience with lots of broken links on your site.
-
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
A better solution is to use 301 redirects and point the 404'd URLs to the equivalent pages that now work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en