Best way to fix a whole bunch of 500 server errors that Google has indexed?
-
I got a notification from Google Webmaster Tools saying that they've found a whole bunch of server errors. It looks like it's because an earlier version of the site I'm doing some work for had those URLs, but the new site does not.
In any case, there are now thousands of these pages in their index that error out.
If I wanted to simply remove them all from the index, which is my best option:
1. Disallow all 1,000 or so pages in the robots.txt?
2. Put the meta noindex in the headers of each of those pages?
3. Rel canonical to a relevant page?
4. Redirect to a relevant page?
5. Wait for Google to just figure it out and remove them naturally?
6. Submit each URL to the GWT removal tool?
7. Something else?
Thanks a lot for the help...
-
If you already fixed the error, then just wait for Google to figure things out on their end. Having those errors in GWT isn't going to hurt you.
-
Wouldn't you be showing 404s instead of 500s in the first place?
If the old URLs are still showing in the index, I'd reckon you'd want those 301'd to relevant pages anyway; at worst, at least have a resource-rich 404 page popping up rather than a 500.
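If the server happens to run Apache, a minimal .htaccess sketch of both moves with mod_alias - the paths here are hypothetical placeholders, not the asker's real URLs:

```apache
# Hypothetical paths -- map each old URL to its closest new equivalent.
# 301 where a relevant replacement exists:
Redirect 301 /old-products/widget.html /products/widget/

# 410 (gone) where nothing replaces the page, so crawlers learn the page
# was removed deliberately rather than hitting a broken 500:
Redirect gone /old-products/discontinued.html
```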
-
Options 4/5 with a bit of 7.
What you need to do is return the correct response code (I'm guessing that is either 404 or 410) and then let Google reindex those URLs. That way Google knows that those URLs are no longer valid. However, if those URLs have links or get traffic then you might want to 301 them.
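Before relying on Google to sort it out, it's worth confirming what each dead URL actually returns now. A quick sketch, assuming the list of erroring URLs has been exported from GWT into a file (old_urls.txt is a hypothetical name):

```python
# Check what status code each old URL actually returns now.
# "old_urls.txt" is a hypothetical file -- one URL per line,
# e.g. exported from the GWT crawl errors report.
import requests

with open("old_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD is enough to read the status; don't follow redirects,
        # so a 301 shows up as 301 rather than the target's 200.
        resp = requests.head(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
    except requests.RequestException as exc:
        print("ERR", url, exc)
```

Anything still printing 500 is worth fixing before expecting the index to clean itself up.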
Let's look at a couple of the other options though - it is interesting.
-
Disallowing in robots.txt: this will stop Google re-visiting those URLs, therefore it will always think they are there.
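For reference, the disallow would look something like this (the directory is a hypothetical placeholder) - and blocked URLs can linger in the index precisely because Google never re-fetches them:

```
User-agent: *
# Hypothetical path covering the old, now-erroring section:
Disallow: /old-section/
```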
-
Meta noindex: noindex confirms they are there, but tells Google not to return them in results. Again, this isn't correct, and Google will continue to return to and re-check those URLs.
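The tag itself, for reference - note that Google has to successfully fetch the page (a 200, not a 500) to see it at all:

```html
<!-- Must be served on a page Google can actually fetch (200), not a 500 -->
<meta name="robots" content="noindex">
```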
-
Rel canonical: unless the content is very close, this is unlikely to work. It is also wrong (because presumably they are not the same thing).
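For completeness, the tag being discussed; the target URL is a hypothetical placeholder:

```html
<!-- Only appropriate when the two URLs are genuinely the same content -->
<link rel="canonical" href="http://www.example.com/relevant-page/">
```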
-
The removal tool: if the URLs have a common (and exclusive) directory, it may be an option to submit that. It might not be a good idea, though, to submit lots of URLs individually - Matt Cutts has suggested as much in the past.
-
Related Questions
-
HTTP 500 Internal Server Error, Need help
Hi, for a few days now Google crawlers have been getting 500 errors from our dedicated server whenever they try to crawl the site. Using the "Fetch as Google" tool under Health in Webmaster Tools, I get "Unreachable page" every time I fetch the homepage. Here is exactly what the Google crawler is getting: <code>HTTP/1.1 500 Internal Server Error Date: Fri, 21 Jun 2013 19:52:27 GMT Server: Apache/2.2.15 (CentOS) X-Powered-By: PHP/5.3.3 X-Pingback: http://www.communityadvocate.com/xmlrpc.php Connection: close Transfer-Encoding: chunked Content-Type: text/html; charset=UTF-8</code> My URL is http://www.communityadvocate.com and here's the screenshot from Google Webmaster: http://screencast.com/t/FoWvqRRtmoEQ How can I fix that? Thank you
-
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links coming into these deleted pages, but for some reason, WMT continues to report them. By last August, we had cleared over 10k 404s on our site, but this lasted only for about 2 months and they started coming back. The "linked from" gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap and I've confirmed that they're not really being linked to from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks -Juanita
-
Which factors affect Google indexing?
My website has 455 URLs submitted, but only 77 URLs are indexed. How can I get more URLs indexed?
-
Webmaster Tools Server Error
We recently did a build on our site, and after the build, one of the pieces of software that we are using changed. This caused our server errors to go into the thousands. Right now Google Webmaster Tools gave us a list of the top 1,000 pages with errors, and we fixed them all. Is there a way to see the rest of the errors?
-
How to fix duplicate page content error?
SEOmoz's Crawl Diagnostics is complaining about a duplicate page error. Examples of links that have the duplicate page content error are http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855 and http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348852. These are not duplicate pages. There are some values that are different on the two pages, like the listing #, EquipNet tag #, and price. I am not sure how to highlight the things the two pages differ on, like the "Equipment Tag #" and listing #. Would the errors resolve if I used some style attribute to highlight such values on the page? Please help me with this, as I am not really sure why SEOmoz thinks that both pages have the same content. Thanks!!!
-
When doing the ranking report I see my site showing up on Google without the www in front, so the report is not picking it up. How do I fix that?
The ranking report is not picking up my site even though it's there. It would seem that the www is missing from the site on Google, so it's not registering in the report. How do I fix this?
-
Disappeared from Google within 2 hours of Webmaster Tools error
Hey guys, I'm trying not to panic, but... we had a problem with Google indexing some of our secure pages; then visitors hit those pages and browsers fired up security warnings, so I asked our web dev to have a look at it. He made the changes below, and within 2 hours the site has dropped off the face of Google: "in Webmaster Tools I asked it to remove any https://freestylextreme.com URLs"; "I cancelled that before it was processed"; "I then set up the robots.txt to respond with a disallow all if the request was for an https URL"; "I've now removed robots.txt completely"; "and resubmitted the main site from Webmaster Tools". I've read a couple of blog posts, and all say to remain calm, test the fetch bot in Webmaster Tools (which is all good), and just wait for Google to reindex. Do you guys have any further advice? Ben
-
Google has not indexed my site in over 4 weeks, what's the problem?
We recently put in permanent redirects to our new URL, but Google seems to not want to index the new URL. There were no problems with the old URL, and the new URL is brand new, so it should have no 'black marks' against it. We have done everything we can think of in terms of submitting sitemaps, telling Google our URL has changed in Webmaster Tools, mentioning the new URL on social sites, etc... but still nothing. It has been over 4 weeks now since we set up the redirects to the URL; any ideas why Google seems to be choosing not to index it? Thanks