5XX (Server Error) on all URLs
-
Hi
I created a couple of new campaigns a few days back and waited for the initial crawl to be completed. I have just checked and both are reporting 5XX (Server Error) on all the pages they tried to look at (on one site I have 110 of these; on the other it only crawled the homepage). This is very odd. I have checked both sites on my local PC, an alternative PC, and via my Windows VPS browser, which is located in the US (I am in the UK), and everything works fine.
Any idea what could be the cause of this failure to crawl? I have pasted a few examples from the report
http://everythingforthegirl.co.uk/index.php/accessories.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/bags.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/gloves.html | 500 | 1 | 0 | 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/purses.html | 500 | 1 | 0 | 500 : TimeoutError
http://everythingforthegirl.co.uk/index.php/accessories/sunglasses.html | 500 | 1 | 0 | 500 : TimeoutError
I'm extra puzzled that the messages say timeout. The dedicated server has 8 cores and 32 GB of RAM, and the pages load for me in about 1.2 seconds. What is the Rogerbot crawler timeout?
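A quick way to see exactly what status code and response time a crawler gets is to request the page yourself with a bot-style User-Agent. This is a sketch using only Python's standard library; the User-Agent string and the 10-second timeout are illustrative assumptions, since Moz doesn't publish Rogerbot's exact values:

```python
import time
import urllib.error
import urllib.request

def check_status(url, user_agent="rogerbot/1.0", timeout=10):
    """Fetch a URL with a crawler-style User-Agent and return the
    HTTP status code plus how long the response took, in seconds."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    start = time.monotonic()
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status, time.monotonic() - start
    except urllib.error.HTTPError as e:
        # The server answered, but with an error status such as 500.
        return e.code, time.monotonic() - start

# Example with one of the URLs from the report above:
# print(check_status("http://everythingforthegirl.co.uk/index.php/accessories.html"))
```

If this prints a 500 while the page looks fine in a browser, the problem is the status code being sent, not the page itself.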
Many thanks
Carl
-
You're absolutely correct. Hopefully this answered your question!
-
Thanks, I will check out that plugin. So, in other words, the pages are loading fine for users but returning an error status code to the bots instead of a 200 OK. That doesn't sound good!
On the plus side, at least it has stopped Roger from noticing that some of the pages have up to 600 links on them because of all the retailer and manufacturer filtering options!
Many thanks, Carl
-
Hi Carl,
You're a lucky man (sarcasm intended): your pages are loading normally but are indeed returning the wrong status code; I get a 500 as well. This is probably caused by a setting in Magento or on your server, as the normal status code for a working page should be 200 OK.
That's probably also why Rogerbot didn't actually time out on the pages but got a 500 even though the pages were working. Good luck fixing this!
Btw, I highly recommend the Redirect Path plugin for Chrome by Ayima.
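One way to confirm the "works in a browser, 500 for bots" theory is to request the same page with different User-Agent strings and compare the status codes. A minimal stdlib sketch, assuming illustrative User-Agent strings (not Rogerbot's exact header):

```python
import urllib.error
import urllib.request

def status_for(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def compare_user_agents(url, user_agents):
    """Map each User-Agent string to the status code it receives."""
    return {ua: status_for(url, ua) for ua in user_agents}

# Example: if a browser UA gets 200 and a bot UA gets 500, the server
# (or a Magento extension / firewall rule) is treating bots differently.
# print(compare_user_agents(
#     "http://everythingforthegirl.co.uk/",
#     ["Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "rogerbot/1.0"],
# ))
```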