Best way to fix a whole bunch of 500 server errors that Google has indexed?
-
I got a notification from Google Webmaster tools saying that they've found a whole bunch of server errors. It looks like it is because an earlier version of the site I'm doing some work for had those URLs, but the new site does not.
In any case, there are now thousands of these pages in their index that error out.
If I wanted to simply remove them all from the index, which is my best option:
1. Disallow all 1,000 or so pages in robots.txt?
2. Put a meta noindex in the headers of each of those pages?
3. Rel canonical to a relevant page?
4. Redirect to a relevant page?
5. Wait for Google to just figure it out and remove them naturally?
6. Submit each URL to the GWT removal tool?
7. Something else?
Thanks a lot for the help...
-
If you've already fixed the errors, then just wait for Google to figure things out on their end. Having those errors in GWT isn't going to hurt you.
-
Wouldn't you be showing 404s instead of 500s in the first place?
If the old URLs are still showing in the index, I'd reckon you'd want those 301'd to relevant pages anyway, or at worst have a proper 404 page popping up rather than a 500.
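If you're not sure what those old URLs actually return right now, it's worth checking before picking an approach. Here's a minimal sketch in Python using the requests library; the urls.txt filename is just a placeholder for wherever you keep the list of old URLs:

```python
# Minimal sketch: check what status code each old URL actually returns.
# Assumes a plain-text file "urls.txt" (hypothetical name) with one URL per line.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD is cheaper than GET; allow_redirects=False so existing 301s show up directly
        resp = requests.head(url, allow_redirects=False, timeout=10)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERR  {url}  ({exc})")
```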
-
Options 4 and 5, with a bit of 7.
What you need to do is return the correct response code (I'm guessing that's either 404 or 410), then let Google recrawl those URLs. That way Google knows those URLs are no longer valid. However, if those URLs have links pointing at them or still get traffic, you might want to 301 them instead.
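As a rough sketch of that logic, assuming a Python/Flask stack (the paths and redirect targets below are made up for illustration), it might look something like this:

```python
# Minimal sketch, assuming a Python/Flask stack; the path lists and
# redirect targets below are hypothetical placeholders.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Old URLs that still have backlinks or traffic: 301 them to a relevant page.
REDIRECTS = {
    "/old-widgets": "/widgets",
    "/old-about": "/about",
}

# Old URLs with no links or traffic: tell Google they're gone for good.
GONE = {"/old-page-1", "/old-page-2"}

@app.route("/<path:path>")
def handle(path):
    path = "/" + path
    if path in REDIRECTS:
        return redirect(REDIRECTS[path], code=301)
    if path in GONE:
        abort(410)  # 410 Gone: a stronger "permanently removed" signal than 404
    abort(404)
```

A 410 is a slightly stronger "gone for good" signal than a 404, and Google tends to drop 410s from the index a little faster.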
Let's look at a couple of the other options though, as it's interesting.
-
Disallowing the pages in robots.txt (option 1) will stop Google re-visiting those URLs, so it will always think they are there.
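You can see why with any robots.txt parser: once a path is disallowed, a compliant crawler simply won't fetch it again, so it never discovers the pages are gone. A quick sketch with Python's standard-library robotparser (the /old-section/ path is a hypothetical placeholder):

```python
# Sketch: a disallowed URL is never fetched, so a compliant crawler
# can never learn that the page behind it no longer exists.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /old-section/",
])

print(rp.can_fetch("Googlebot", "https://example.com/old-section/page"))  # False
```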
-
A meta noindex (option 2) confirms the pages are there but tells Google not to return them in results. Again, this isn't correct, and Google will keep coming back to re-check those URLs.
-
Rel canonical (option 3) is unlikely to work unless the content is very close. It is also wrong, because presumably the old and new pages are not the same thing.
-
For the removal tool (option 6): if the URLs share a common (and exclusive) directory, it may be an option to submit that. Submitting lots of URLs individually is probably not a good idea, though - Matt Cutts has suggested as much in the past.
-
Related Questions
-
Unsolved: Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site. I also removed several ancient sitemaps that listed content deleted years ago which Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky
Product Listings - is it worth indexing the whole product catalogue?
I'm working on a site that has around 500 product listings. This is for a rental company without any sort of ecommerce platform, so there are no prices, no adding a product to a cart, etc. Also, there are no different sizing/color options for each product, so each product is the canonical version. After some restructuring, we're starting to see a lot of 404s and just some general mess. I have a couple of thoughts. My first is to just noindex each product. We hardly get any direct traffic to an individual product page, and when visitors land anywhere related to products, it's usually a category page. If I noindex the products, I don't have to worry about the 404s. My second is to implement the rel=canonical tag on each product, pointing to its primary category. While this is sort of a liberal use of the canonical tag, I'm thinking it could help drive more organic traffic to the category pages. Does anyone have any insight or thoughts on this? Thank you very much!
Technical SEO | Savage-Solutions
Can Google index the text content in a PDF?
I really, really thought the answer was always no. There are plenty of other things you can do to improve search visibility for a PDF, but I thought the nature of the file type made the content itself unparsable by search engine crawlers... But now, my client's competitor is ranking for my client's brand name with a PDF that contains comparison content. Thing is, my client's brand isn't in the title, the alt text, or the URL... it's only in the actual text of the PDF. Did I miss a major update? Did I always have this wrong?
Technical SEO | LindsayDayton
Did anyone else notice the Google index bug?
I noticed a page indexation drop in Search Console for most of my sites. The folks at Search Engine Land seem to know about it: http://selnd.com/1YqiOoQ Did anyone else notice something weird?
Technical SEO | solvid
Google index graph duration in Google Webmaster Tools
Hello guys. My sites are currently being indexed every 7 days exactly, according to the Index Status page in GWT. However, this new site gets updated almost every day; how can I ask Google to index it faster and more frequently, almost daily? Is it about the sitemap.xml frequency? I changed it today to Daily. Thanks!
Technical SEO | mdmoz
Missing files in Google and Bing Index
We uploaded our sitemap a while back, and we no longer see around 8 out of 33 pages. We tried submitting the sitemap again about 1-2 weeks ago, but no additional pages show up when I use the site: operator in either search engine. I reviewed the sitemap and it includes all the pages. I am not seeing any errors in SEOmoz for these pages. Any ideas what I should try?
Technical SEO | EZSchoolApps
Will a drop in indexed pages significantly affect Google rankings?
I am doing some research into why we were bumped from Google's first page onto the third, fourth, and fifth pages in June of 2010. I always suspected Caffeine, but I just came across some data that indicates a drop in indexed pages from 510 in January of that year to 133 by June. I'm not sure what happened, but I believe our blog pages were de-indexed somehow. What I want to know is: could that significant drop in indexed pages have had an effect on our rankings at that time? We are back up to over 500 indexed pages, but have not fully recovered our first-page positions.
Technical SEO | rdreich49
Duplicate Page Content and Title for product pages. Is there a way to fix it?
We were doing pretty well with our SEO until we added product listing pages. The errors are mostly Duplicate Page Content/Title, e.g. the title "Masterpet | New Zealand Products" appearing on both MasterPet product page 1 and MasterPet product page 2. Because the list of products is displayed across several pages, the crawler detects that these URLs have the same title. We've gone from 0 errors two weeks ago to 14k+ errors. Is this something we could fix, or even bother fixing? Will our SERP ranking suffer because of this? Hoping someone could shed some light on this issue. Thanks.
Technical SEO | Peter.Huxley59