Thousands of 404s
-
Hi there,
I'm working on a site that has a ridiculous number of 404s reported in Webmaster Tools. We believe this was caused by an on-page error that was amending the URLs, appending folders that shouldn't have been there in a big spiral, i.e. /salons/uk/teeth became something like /salons/uk/teeth/salons/edinburgh/hair/teeth...
Anyway, we think the issue is now sorted, but it seems these pages were indexed, so it looks like Google is still looking for them when it crawls the site. What's my best move? It's the sheer volume (over 13,000) that has me concerned, so I thought it best to seek some expert advice before continuing.
Thanks in advance!
-
As it's all sorted now, I really wouldn't worry about them too much. You can use the URL removal feature in WMT, but it's a manual process, so I wouldn't bother. If I were in your position, I'd probably just let the pages keep 404ing. After a while, Google will usually stop trying to recrawl the 404 pages; right now it's probably retrying in case the 404 was an accident.
If it's causing a bandwidth problem, you can solve it with robots.txt, as suggested elsewhere in this thread.
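If you do go the robots.txt route, it's worth sanity-checking the rule before deploying it. Here's a minimal sketch using Python's standard-library parser; the disallow prefix and example domain are assumptions based on the URL pattern described above (note that this parser only does prefix matching, no wildcards, so the rule uses the literal prefix where the loop begins):

```python
from urllib import robotparser

# Hypothetical rule: block everything under the point where the path
# starts repeating, without touching the legitimate page above it.
rules = """\
User-agent: *
Disallow: /salons/uk/teeth/salons/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The spiralled URL is blocked...
print(rp.can_fetch("*", "https://example.com/salons/uk/teeth/salons/edinburgh/hair/teeth"))  # → False
# ...while the legitimate page stays crawlable.
print(rp.can_fetch("*", "https://example.com/salons/uk/teeth"))  # → True
```

Google's own parser does support `*` wildcards, so a broader pattern is possible there, but a plain prefix like this behaves the same in both.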
-
Hi Philip!
If these URLs are already indexed, you should 301 redirect them to the right URL (in case they happen to have some inbound links). You could also try Google's URL removal tool (see https://support.google.com/webmasters/answer/1663416) if all you want is to get rid of them.
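Since the bad URLs described above are just the good URL with extra folders spiralled onto the end, the 301 target can be computed rather than mapped by hand. A hypothetical helper (the function name and the assumption that the loop always re-starts with the original top-level folder are mine):

```python
def redirect_target(path: str) -> str:
    # Hypothetical helper: collapse a "spiralled" path back to the part
    # before the top-level folder repeats, e.g.
    # /salons/uk/teeth/salons/edinburgh/hair/teeth -> /salons/uk/teeth
    first = path.split("/")[1]            # top-level folder, e.g. "salons"
    repeat = path.find(f"/{first}/", 1)   # second occurrence of "/salons/"
    return path[:repeat] if repeat != -1 else path
```

You'd then wire this into whatever issues the 301 (a rewrite rule, middleware, or a redirect plugin), so one rule covers all 13,000 variants instead of listing them individually.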
Good luck, hope this helps.
//Anders
-
Hi Philip,
If the bad URLs all follow the same pattern, I would try adding that pattern to robots.txt to prevent Google from crawling them. Even better would be adding a noindex tag to those pages.
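One caveat worth noting on combining the two suggestions above: if robots.txt blocks crawling, Googlebot can never fetch the page to see an on-page noindex, so it's usually best to pick one approach. A minimal noindex tag, assuming you can edit the template for the affected pages, looks like this:

```html
<!-- In the <head> of each page you want dropped from the index -->
<meta name="robots" content="noindex">
```

For non-HTML responses, the same signal can be sent as an `X-Robots-Tag: noindex` HTTP header.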
Hope this helps!
Related Questions
-
Huge number of crawl anomalies and 404s - non-existent URLs
Hi there, Our site was redesigned at the end of January 2020. Since the new site was launched we have seen a big drop in impressions (50-60%) and also a big drop in total and organic traffic (again 50-60%) compared to the old site. I know in the current climate some businesses will see a drop in traffic; however, we are a tech business and some of our core search terms have increased in search volume as a result of remote working. According to Search Console there are 82k URLs excluded from coverage - the majority of these are classed as 'crawl anomaly' - and there are 250+ 404s. Almost all of the URLs are non-existent: they have our root domain with a string of random characters on the end. Here are a couple of examples:
root.domain.com/96jumblestorebb42a1c2320800306682
root.domain.com/01sportsplazac9a3c52miz-63jth601
root.domain.com/39autoparts-agency26be7ff420582220
root.domain.com/05open-kitchenaf69a7a29510363
Is this a cause for concern? I'm thinking that all of these random fake URLs could be preventing genuine pages from being indexed, or they could be having an impact on our search visibility. Can somebody advise please? Thanks!
Technical SEO | nicola-10
-
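The four examples above do share a recognisable shape: two leading digits, a lowercase slug, and a digit-bearing alphanumeric tail. A pattern along these lines could help separate the junk from genuine pages in a crawl export - but the regex below is inferred from only those four samples, so it's a sketch to test against real URLs, not a definitive rule:

```python
import re

# Hypothetical pattern inferred from the four sample paths: two leading
# digits, a lowercase letter, then letters/digits/hyphens ending in a digit.
FAKE_PATH = re.compile(r"\d{2}[a-z][a-z0-9-]*\d")

samples = [
    "96jumblestorebb42a1c2320800306682",
    "01sportsplazac9a3c52miz-63jth601",
    "39autoparts-agency26be7ff420582220",
    "05open-kitchenaf69a7a29510363",
]

print(all(FAKE_PATH.fullmatch(s) for s in samples))  # → True
```

Once you've confirmed the pattern only matches junk, you could filter the Search Console export with it, or serve those paths a 404/410 deliberately so Google drops them faster.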
Internal link is creating duplicate content issues and generating 404s from website crawl.
Not sure of the best way to describe it, but the site is built with the Elementor page builder. We are finding that a feature included with a popup modal window renders HTML like so: Click. When crawled, the crawler seems to link back to itself for some reason, so the crawl returns something like this:
xyz.com/builder/listing/ - what we want
What we don't want:
xyz.com/builder/listing/%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9/
xyz.com/builder/listing/%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9//%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9/
You'll notice how the string in the href is appended each time, and it loops a couple of times. Could I 301 this? What's the best way to go about handling something like this? It's causing duplicate meta description/content errors for some listing pages we have. I did add rel="nofollow" to the anchor tag with JavaScript, but I'm not sure if that'll help.
Technical SEO | JoseG-LP0
-
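It can help to see what that appended string actually is. Decoding it (this sketch uses only the strings from the question itself) shows it's a percent-encoded `#elementor-action:` URL fragment carrying base64-encoded popup settings; because the `#` is encoded as `%23`, crawlers treat it as part of the path instead of a fragment, which is why it stacks up:

```python
import base64
from urllib.parse import unquote

encoded = ("%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3D"
           "eyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9")

fragment = unquote(encoded)
print(fragment)
# → #elementor-action:action=popup:open&settings=eyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9

settings = base64.b64decode(fragment.split("settings=")[1]).decode()
print(settings)  # → {"id":"760","toggle":false}
```

So each "duplicate" URL is really the listing page plus a harmless popup trigger, which supports handling it with canonical tags or a crawl rule rather than content changes.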
Thousands of links coming from an iframe
We have an iframed calculator on one website (www.renewablesguide.co.uk) which has a text link to another of our websites (www.solarguide.co.uk), which is where the calculator originates. We allow other sites to embed the calculator, which gives us the benefit of a followed link back to our site. However, in the case of renewablesguide (which we own), we've added a tab to the calculator on every page, which GWT shows as 24,000 links from this site hitting the Solar Guide homepage. As the link is held within an iframe, would this number of links be seen as spammy?
Technical SEO | holmesmedia0
-
1,300,000 404s
Just moved a WordPress site over to a new host and reskinned it. Found out after the fact that the site had been hacked - the database is now clean. I noticed at first there were a lot of 404s being generated, so I set up a script to capture them and return a 410 Gone, and the plan was then to submit them to have them removed from the index, thinking there was a manageable number. But when I looked at Google Webmaster Tools there were over 1,300,000 404 errors (see attachment). My puny attempt to solve this problem seems to need more of an industrial-size solution. My question is: what would be the best way to deal with this? Not all of the pages are indexed: Google reports 637 indexed but you can only see about 150 in the index, while Bing is another story, saying over 2,700 pages are indexed but showing only about 200. How is this affecting any future rankings? They do not rank well, as I found out, because of very slow page load speed and of course the hacks. The link profile looking at Google is OK, and there are no messages in Google Webmaster Tools.
Technical SEO | Runner20090
-
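At that scale, a per-URL script will struggle; if the hacked URLs share a recognisable prefix or pattern, the 410 can be issued at the web-server level instead. A hypothetical Apache sketch (the `/hacked-pattern/` prefix is a placeholder - substitute whatever the real hacked URLs actually look like):

```apache
# Return 410 Gone in bulk for the hacked URL pattern, instead of
# generating each response from a script. The [G] flag means "Gone".
RewriteEngine On
RewriteRule ^hacked-pattern/ - [G]
```

Google treats 410 slightly more decisively than 404, and either way it will drop the URLs over time without you submitting 1.3M removal requests.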
Hundreds of Thousands of Spammy Backlinks Overnight
Hello,
I have a client who unfortunately got breached (not sure how), and as a result six HTML files promoting Gucci bags and Louis Vuitton bags were put in the root.
I found the files within a week of them being put there, but what I didn't realise (and only found yesterday when looking at the backlink profile) was that there are literally hundreds if not thousands of spammy domains pointing at these files. Some of the sites 404, but some are posts on other bloggers' sites who auto-accept comments, and they total 10,000 links, so they're impossible to remove. Questions: Will Google understand what has happened and ignore these links (especially because the pages no longer exist on the server)? Should I use the Disavow tool to block these 1,000-odd domains (can it do any harm?), and since more links are being found every day, do I just keep doing it? Is there another way to explain to Google what has happened? Your help would be greatly appreciated. Thanks,
James
Technical SEO | JDLitchfield
-
Webmaster Tools finding phantom 404s?
We recently (three months ago now!) switched a site over from .co.uk to .com, and all old URLs are redirecting to the new site. However, Google Webmaster Tools is flagging up hundreds of 404s from the old site, yet it doesn't report where the links were found, i.e. there is no data in the 'Linked from' tab, and the old links are not in the sitemap. SEOmoz crawls do not report any 404s. Any ideas?
Technical SEO | Switch_Digital0
-
Thousands of 503 Errors
I was just checking Google Webmaster Tools for one of the first times (I know this should be a regular habit). I noticed that on Feb 8th we had almost 80K errors of type 503. This is obviously very alarming because, as far as I know, our site was up and available that whole day. It makes me wonder if there is a firewall issue or something else I'm not aware of. Any ideas on the best way to determine what's causing this? Thanks, Chris
Technical SEO | osports0
-
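Before suspecting the firewall, the raw access logs can confirm whether the 503s really happened and exactly when. A hypothetical sketch, assuming an Apache-style common/combined log format (field positions are assumptions - adjust for your server's format):

```python
from collections import Counter

def status_by_hour(log_lines):
    # Tally HTTP status codes per hour from an Apache-style access log,
    # to pinpoint when the 503s spiked. Assumes the status code is the
    # 9th whitespace-separated field and the timestamp is the 4th.
    counts = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) > 8:
            hour = parts[3][1:15]          # e.g. "08/Feb/2013:10"
            counts[(hour, parts[8])] += 1
    return counts
```

If the log shows no 503s for Feb 8th, the errors were likely served to Googlebot specifically (e.g. by a firewall or rate limiter upstream of the logging tier), which narrows the search considerably.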
Google indexing thousands of crazy search results with %25253
In GWT I started seeing very strange pages indexed a few weeks ago, and Google is now reporting over 21,000 pages (blocked by robots.txt) with weird URLs like these:
http://www.francesphotography.com/?s=no-results:no-results%25252525252525253Ano-results%2525252525252525253Ano-results%252525252525252525253Ano-results%252525252525252525253Ano-results%252525252525252525253Ano-results%252525252525252525253Ano-results%25252525252525252525253Ano-results%25252525252525252525253Ano-results%2525252525252525252525253Adanna&cat=no-results
http://www.francesphotography.com/?s=no-results:no-results%2525253Ano-results%25252525253Ano-results%25252525253Ano-results%25252525253Ano-results%2525252525253Ano-results%25252525252525253Ano-results%25252525252525253Ano-results%25252525252525253Adanna&cat=no-results
The current robots.txt looks like this:
User-agent: *
Disallow: /wp-content
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /data
Disallow: /slideshows
Disallow: /page/*/?s=
Disallow: /?s=
Disallow: /search
This website is running an up-to-date WP install with Yoast's Google Analytics and SEO plug-ins. I can't point to anything specific that happened with the site when these URLs started appearing, even after I modified the robots.txt. What can be done to try and stop Google from creating and indexing these goofy URLs? I see lots of sites having this issue when I search in Google, but no one seems to have a solution.
Technical SEO | BoulderJoe
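For what it's worth, those long `%2525...` runs are just a colon (`%3A`) that has been percent-encoded over and over (`%25` decodes to `%` itself), which points at something repeatedly re-encoding search URLs. A hypothetical helper to demonstrate, using a shortened string of the same shape as the samples above:

```python
from urllib.parse import unquote

def fully_decode(s, max_rounds=20):
    # Percent-decode repeatedly until the string stops changing. Each
    # pass turns one layer of %25 back into %, eventually revealing the
    # original character (here, a ':').
    for _ in range(max_rounds):
        decoded = unquote(s)
        if decoded == s:
            break
        s = decoded
    return s

print(fully_decode("no-results%25253Ano-results"))  # → no-results:no-results
```

So the underlying URL is an ordinary `?s=no-results:no-results:...` search query that has passed through an encoding loop many times - consistent with the existing `Disallow: /?s=` rule blocking the crawl while the URLs themselves keep being generated.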