Best way to fix a whole bunch of 500 server errors that Google has indexed?
-
I got a notification from Google Webmaster Tools saying they've found a whole bunch of server errors. It looks like it's because an earlier version of the site I'm doing some work for had those URLs, but the new site does not.
In any case, there are now thousands of these pages in their index that error out.
If I wanted to simply remove them all from the index, which is my best option:
-
1. Disallow all 1,000 or so pages in the robots.txt?
2. Put the meta noindex in the headers of each of those pages?
3. Rel canonical to a relevant page?
4. Redirect to a relevant page?
5. Wait for Google to just figure it out and remove them naturally?
6. Submit each URL to the GWT removal tool?
7. Something else?
Thanks a lot for the help...
-
-
If you already fixed the error, then just wait for Google to figure things out on their end. Having those errors in GWT isn't going to hurt you.
-
Wouldn't you be showing 404s instead of 500s in the first place?
If the old URLs are still showing in the index, I'd reckon you'd want those 301'd to relevant pages anyway, or at worst a helpful, resource-rich 404 page popping up rather than a 500.
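As a rough way to check what those old URLs are actually returning, a small script like the one below works. This is only a sketch using the Python standard library; the URL list is hypothetical (in practice you'd export it from the GWT crawl errors report):

```python
# Sketch: check what the old URLs actually return.
# The URL list is hypothetical; export the real one from the GWT crawl errors report.
import urllib.request
import urllib.error

old_urls = [
    "https://example.com/old-page-1",
    "https://example.com/old-page-2",
]

for url in old_urls:
    try:
        # Note: urlopen follows redirects, so a 301 shows up as the final status.
        resp = urllib.request.urlopen(url, timeout=10)
        print(url, resp.status)            # 200 on a missing page is a "soft 404"
    except urllib.error.HTTPError as e:
        print(url, e.code)                 # 404/410 is expected; 500 is the bug to fix
    except urllib.error.URLError as e:
        print(url, "connection error:", e.reason)
```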
-
Options 4 and 5, with a bit of 7.
What you need to do is return the correct response code (I'm guessing that's either 404 or 410), then let Google recrawl those URLs. That way Google knows those URLs are no longer valid. However, if any of those URLs have links pointing at them or still get traffic, then you might want to 301 them instead.
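Here is a minimal sketch of that logic, assuming a Python/Flask front end; the framework, paths, and redirect map are illustrative assumptions, not the actual site:

```python
# Sketch: 410 Gone for retired URLs, 301 only for the ones that still earn
# links or traffic. Flask and all paths shown are assumptions for illustration.
from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Old URLs worth keeping: map them to their closest new equivalents (301).
REDIRECT_MAP = {
    "/old-catalog/blue-widget": "/products/blue-widget",
}

# Old URL prefixes that are simply gone. 410 says "removed on purpose";
# a plain 404 works too, it just tends to be dropped from the index more slowly.
GONE_PREFIXES = ("/old-catalog/", "/legacy/")

@app.before_request
def handle_retired_urls():
    path = request.path
    if path in REDIRECT_MAP:
        return redirect(REDIRECT_MAP[path], code=301)
    if path.startswith(GONE_PREFIXES):
        abort(410)
    # Returning None lets the request fall through to the normal routes.
```

The same mapping could just as easily live in the web server config (rewrite rules) rather than in application code; the point is only that the retired URLs stop answering 500.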
Let's look at a couple of the other options though, as it's interesting.
-
Robots.txt disallow: this will stop Google re-visiting those URLs, so it will always think they are still there.
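A quick illustration of the mechanics (a sketch only; the directory name is hypothetical): a disallowed URL is simply never fetched, so Google never gets to see the 404/410 behind it.

```python
# Sketch of why robots.txt doesn't get URLs removed: a disallowed URL is never
# fetched, so the crawler never sees the error status behind it.
# The directory name is hypothetical.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /old-catalog/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/old-catalog/page-123"))
# -> False: a well-behaved crawler skips the request entirely, so the fact
#    that the page is gone is never observed.
```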
-
Meta noindex: this confirms to Google that the pages are there, but tells it not to return them in results. Again, that isn't really accurate, and Google will continue to come back and re-check those URLs.
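For reference, noindex is normally delivered either as a meta tag in the page or as an X-Robots-Tag response header, and it only takes effect on pages Google can actually crawl and that return a normal response. A minimal sketch (Flask and the route are assumptions for illustration):

```python
# Sketch of the two usual ways to deliver noindex (meta tag vs. response header).
# Flask and the route are assumptions for illustration only.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/old-catalog/<slug>")
def old_catalog(slug):
    html = (
        "<html><head>"
        '<meta name="robots" content="noindex">'
        "</head><body>This page is no longer available.</body></html>"
    )
    resp = make_response(html)
    # Header form: equivalent, and usable for non-HTML resources too.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```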
-
Rel canonical: unless the content is very close, this is unlikely to work. It is also technically wrong, because presumably the old and new pages are not the same thing.
-
Removal tool: if the URLs share a common (and exclusive) directory, it may be an option to submit that directory for removal. Submitting lots of URLs individually is probably not a good idea, though; Matt Cutts has suggested as much in the past.
-
Related Questions
-
What server issues might cause temporary, repeated Soft 404/500 errors on pages that appear to be working correctly when checked later from Google Webmaster Tools?
We are experiencing unknown server issues (we think) which are causing Soft 404/500 errors at unpredictable times on two websites. When we check the pages, they're fine, but they still show errors in Moz/Search Console. What measures can we take to protect against this, or to figure out what is causing it?
Example URL for Soft 404 error: https://www.advancedtraveltherapy.com/jobs/any/occupational-therapist/any/
Example URL for 500 error: https://www.advancedtraveltherapy.com/job-detail/ms/physical-therapist/87529740/
Example URL for Soft 404 error: https://www.advancedtravelnursing.com/search/searchresults.php?jobState=CA&tempType=g&specialties=
Example URL for 500 error: https://www.advancedtravelnursing.com/job/ma/registered-nurse/emergency-room/87108662/
Technical SEO | StaffingRobot0 -
How to fix a duplicate page content error?
SEOmoz's Crawl Diagnostics is complaining about a duplicate page content error. Examples of links flagged with the duplicate page content error are http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855 and http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348852. These are not duplicate pages; some values differ between them, such as the listing #, EquipNet tag #, and price. I am not sure how to highlight the things that differ between the two pages, like the Equipment Tag # and listing #. Would it help if I used some style attribute to highlight those values on the page? Please help me with this, as I am not really sure why SEOmoz thinks both pages have the same content. Thanks!
Technical SEO | RGEQUIPNET0 -
HTTP vs HTTPS and Google crawling and indexing?
Is it true that HTTPS pages are not crawled and indexed by Google and other search engines as well as HTTP pages are?
Technical SEO | sherohass0 -
Best source to keep abreast of Google algorithm changes
Good morning from a 13-degrees-C, about-to-chuck-it-down Wetherby, UK... I want to make sure I keep my finger on the pulse regarding Google algorithm changes. What is the best source to make sure you're kept up to date with Google's ongoing tweaks to its algorithm (apart from SEOmoz)? Thanks, David
Technical SEO | Nightwing0 -
Best way to set up a large site for multiple languages
Hello, I am setting up a new site which is going to be very large, over 250,000 products. Most of our customers are in the UK (45%); the rest are from various European countries and the USA. Unfortunately, we only have a team of two people writing content for these pages, in English. I would value some input on the best way to set up my website structure for ranking. Obviously the best approach would be individual country-oriented domains, i.e. domain.fr, domain.de, domain.co.uk. However, we wouldn't have the time to create content for every page, and most pages would contain the same content as the English domain. Would I get a penalty from Google for this? The second choice is to follow the example of overstock.com and pull in information relating to each country, i.e. currency and delivery time. This would be a lot easier, but I am concerned that the lack of geo focus would affect my rankings. Does anyone have any ideas?
Technical SEO | DavidLenehan0 -
Best way to handle redirection for products that come in and out of inventory.
We have a large volume of products that rotate seasonally. From an SEO perspective, we are looking for the best method of handling this. Currently, when a crawler or user requests a URL for a product that is no longer in inventory, we are considering two options: one, return a 200 with a page that says ITEM NOT FOUND; two, simply send them to a 404. The product may or may not be put back into production. What is the best way to handle this?
Technical SEO | CC_Dallas0 -
How to remove a subdomain from Google's index!
Hello, I have a website with many subdomains carrying the same copy of the content, and I think it's harming my SEO for that site, since the abc and xyz subdomains have the same content. I have already deleted the DNS records for those subdomains; now how do I get those pages removed from Google's index as well? The DNS records for those subdomains no longer exist.
Technical SEO | anand20100 -
How to fix 404 (Client Error) errors in a WordPress blog?
Hey, a very quick question... after analyzing my WordPress blog I've found 34 404 (Client Error) errors and I don't know how to fix them. Do you know how? *I have renewed the HTML code of the 404 page of my WordPress blog.
Technical SEO | akitmane1