Weird 404 Errors in Webmaster Tools
-
Hi,
During a regular check of Webmaster Tools, I noticed a sudden increase in the number of "not found" (404) errors. So I have been looking at them, and noticed something weird has been going on.
There are well over 100 pages with 404 errors. The funny thing is, none of the URLs are correct. For example, if the actual URL is something like www.domain.com/latest-reviews, the 404 error points to a non-existent URL like www.domain.com/latest-re. And when I checked where they were linked from, they are all from spammy sites.
Does anyone know what could be causing these links? Why would anyone link to a non-existent page on purpose?
cheers,
-
I have a similar problem: dozens of 404 errors in Webmaster Tools, like these:
http://domain.ru/ka...tino-akcia-trexkomnatnaja
http://domain.ru/Sa...e-novosti-za-oktyabr-2012
And there are no links to these pages from anywhere. It's a strange situation, because I have lots of pages with URLs of different lengths, but not all of them come up with errors.
-
Thanks. I have actually been adding 301 redirects but didn't want to spend too much time on it. Some of the links were not even links; they were just text, and Google still treated them as links.
-
Thanks. I've already got canonical tags in place, so I guess I don't have to do anything.
-
Hi,
When I compare the URLs you gave, it seems someone has posted shortened versions of them. On some websites the actual URL is shortened and used as anchor text.
For example, http://www.seomoz.org/q/wei.. may be displayed as truncated anchor text while still linking to the correct page. But some users with less knowledge simply copy the anchor text and post it in blog posts or other places, because the anchor text looks like a URL.
It can also happen because of some other site's activity.
Anyway, 404 not-found errors will not affect your rankings, so you do not have to worry about this problem. I'd also suggest reading this help document about 404 errors.
But I can see another problem that can arise from this kind of activity: you may get traffic on a URL with a suffix you never created. For example, a URL like
www.domain.com/latest-reviews/?refferer=some_reffer
can create a duplicate content issue, since the same page is served at a different URL. So I strongly recommend adding a rel canonical URL to your pages.
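A minimal sketch of what that looks like in the page's head section, using the placeholder domain from the example above:

    <!-- Point every variant (e.g. ?refferer=... suffixes) at the one true URL -->
    <link rel="canonical" href="http://www.domain.com/latest-reviews/" />

With that tag in place, any suffixed variants Google crawls get consolidated to the clean URL instead of being treated as duplicates.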
Regards
Prasad
-
Google is finding text URLs on sites that truncate them to a character limit. It's a Google crawl problem.
SiteX refers to your article, http://yourdomain.com/blog/austin/steve-rides-to-the-alamo, but hits a character limit of, say, 40 characters, so it prints the URL as "http://yourdomain.com/blog/austin/steve" while linking it correctly. Even with a correct link, Google will read the text and crawl the URL the way it is printed, not the way it is linked. The same thing happens if it's not linked at all and is just a shortened text URL.
To sum it up: Google has a crawl problem, and scraper sites that chop up URLs are feeding the bots junk. If, however, the linking domain is a good one and you'd like to take advantage of this little error, you can create a redirect rule on your website for the 404'd URL, as sketched below.
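For instance, on an Apache server the rule could live in .htaccess. This is a minimal sketch, assuming the hypothetical URLs from the example above and that mod_alias is enabled:

    # Permanently (301) redirect the chopped-off URL Google keeps crawling
    # to the real article, so any link value is passed along
    Redirect 301 /blog/austin/steve http://yourdomain.com/blog/austin/steve-rides-to-the-alamo

The first argument is the path of the broken URL; the second is the full destination URL. Using a 301 (rather than a 302) tells Google the move is permanent, which is what consolidates the link value.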
-
Related Questions
-
Sitemap issue - Tons of 404 errors
We've recreated a client's site in a subdirectory (mysite.com/newsite) of his domain, and when it was ready to go live, we added code to the .htaccess file in order to display the revamped website on the main URL. These are the directions that were followed to do this: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory and http://codex.wordpress.org/Moving_WordPress#When_Your_Domain_Name_or_URLs_Change. This has worked perfectly, except that we are now receiving a lot of 404 errors, and I'm wondering if this isn't the root of our evil. This is a self-hosted WordPress website, and we are actively using the WordPress SEO plugin, which creates multiple sitemap files with only 50 links in each. The sitemap_index.xml file tests well in Google Webmaster Tools but is pulling a number of links from the subdirectory folder. I'm wondering if it really is the manner in which we made the site live that is our issue, or if there is another problem that I cannot see yet. What is the best way to attack this issue? Any clues? The site in question is www.atozqualityfencing.com https://wordpress.org/plugins/wordpress-seo/
Technical SEO | JanetJ
-
Increase 404 errors or 301 redirects?
Hi all, I'm working on an e-commerce site that sells products that may only be available for a certain period of time. E.g., a product may only be sold for a year and then be permanently out of stock. When a product goes out of stock, the page is removed from the site regardless of any links it may have earned over time. I am trying to figure out the best way to handle these permanently out-of-stock pages. At the moment, the site is set up to return a 404 page for each of these products. There are currently 600 (and increasing) instances of this appearing in Google Webmaster Tools. I have read that too many 404 errors may have a negative impact on your site, and so thought I might 301 redirect these URLs to a more appropriate page. However, I've also read that too many 301 redirects may have a negative impact on your site. I foresee this being an issue several years down the road, when the site has thousands of expired products resulting in thousands of 404 errors or 301 redirects, depending on which route I take. Which would be the better route? Is there a better solution?
Technical SEO | Oxfordcomma
-
Webmaster Tools crawl stats
Hi, I have a client's site that was having approx. 30-50 pages crawled regularly from site launch up until the end of January. On the 21st of January, the crawled pages dropped significantly from this average to about 11-20 pages per day. This also coincided with a massive rankings drop on the 22nd, which I thought was something to do with Panda, although it later turned out the hosts had changed the DNS; exactly a week after fixing it, the rankings returned, so I think that was the cause, not Panda. However, I note that the crawl rate still hasn't returned to the previous average and is still following the new average of 10-20 pages per day rather than 30-50 pages per day. Does anyone have any ideas why this is? I have since added a sitemap, but that hasn't increased the crawl rate. A bit of further info, if it helps: the indexed status section says 48 pages ever crawled, with 37 pages indexed. There are 48 pages on the site. The sitemap section says 37 submitted, with 35 indexed. I would have thought that a dynamic sitemap would submit all URLs. Any clarity on the above much appreciated. Cheers, Dan
Technical SEO | Dan-Lawrence
-
What does this error mean?
We recently merged our Google+ and Google Local pages and sent a request through Webmaster Tools to connect the Google+ page to our website. The message was sent successfully. However, when clicking the 'Approve or reject this request' link, the following error message appears: 'Can't find associate request'. Does anyone know what we are doing incorrectly? Thanks in advance for any insight.
Technical SEO | SEOSponge
-
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site, and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs I used to use, or if for some reason one of them still shows up in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them, so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | Prime85
-
Suggested crawl rate in Google Webmaster Tools?
Hey Moz peeps, got a general question: what is the suggested custom crawl rate in Google Webmaster Tools? Or is it better to "Let Google determine my crawl rate (recommended)"? If you guys have any good suggestions on this, and can cite why, that would be very helpful. Thanks guys!
Technical SEO | david305
-
Tracking Links Tool
I think someone may be trying to harm my site by adding spammy links, so I want to track the links pointing to my site on a daily basis. Any tool suggestions? Majestic SEO is great for getting an overall picture of my links, but it is not updated daily. Thanks!
Technical SEO | theLotter
-
Crawl Errors In Webmaster Tools
Hi guys, I've searched the web for an answer on the importance of crawl errors in Webmaster Tools but keep coming up with different answers. I have been working on a client's site for the last two months (and have just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous guy who did the site. The site is currently at page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6. The only issue I can seem to find with the site in Webmaster Tools is crawl errors; here are the stats: In sitemaps: 123. Not found: 2,079. Restricted by robots.txt: 1. Unreachable: 2. I have read that e-commerce sites can often give off false positives in terms of crawl errors from Google; however, these not-found crawl errors are being linked from pages within the site. How have others solved the issue of crawl errors on e-commerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche where I need to be patient? Kind regards, Neil
Technical SEO | optimiz1