Crawl Diagnostics: 57 5XX (Server Error) Responses
-
Hi guys,
My name is Rob, and I work at Burst! Creative Group in Vancouver, BC. We are having issues with 5XX (Server Error) responses: we have 57 of them and can't figure out why! We have never had this problem before, but we recently added a blog to our website and figure that this addition may have caused some problems. I'm hoping I can get some helpful information from you Mozzers to eliminate these crawl errors. You can take a look at our website at www.burstcreativegroup.com. Thank you in advance for all of your input!
Rob
-
Also, have your traffic levels increased noticeably? It could be a server/hosting issue.
-
Thank you for your input, Lavellester. I will try this and let you know the outcome!
-
You could try fetching a few of the pages manually as Googlebot to see how they are performing. 5xx errors can often be random/intermittent glitches.
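One way to do that kind of spot check outside of Webmaster Tools is a quick script. A minimal sketch, assuming Python with the requests library; the URLs listed are placeholders and the user-agent string simply mimics Googlebot:

```python
# Spot-check a few URLs from the crawl report with a Googlebot-style user
# agent and print the HTTP status code for each. Intermittent 5xx errors
# often only appear on some requests, so it is worth running this a few times.
import requests

URLS = [
    "http://www.burstcreativegroup.com/",
    "http://www.burstcreativegroup.com/blog/",  # placeholder path
]

HEADERS = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

for url in URLS:
    try:
        resp = requests.get(url, headers=HEADERS, timeout=15)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
```

If a page returns 200 here but 5xx in the crawl report, the errors are more likely intermittent load or hosting issues than a broken page.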
Related Questions
-
I'm doing a crawl analysis for a website and finding duplicate URLs with "null" appended to them, and I have no clue what could be causing this.
Does anyone know what could be causing this? Our dev team thinks it's caused by mobile pages they created a while ago, but it is adding thousands of additional URLs to the crawl report, and they are being indexed by Google. They don't see it as a priority, but I believe these could be very harmful to our site. Examples from the URL strings:
uruguay-argentina-chilenullnull/days
rainforests-volcanoes-wildlifenullnull/reviews
of-eastern-europenullnullnullnull/hotels
Web Design | julianne.amann
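A minimal sketch of how those URLs could be isolated from a crawl export before escalating to the dev team, assuming a CSV export with a `url` column (the filename and column name here are hypothetical):

```python
# Filter a crawl export down to the URLs containing "null" so the scale of the
# problem and the templates generating it are easy to demonstrate.
import csv
from collections import Counter

suspect = []
with open("crawl_export.csv", newline="", encoding="utf-8") as f:  # hypothetical filename
    for row in csv.DictReader(f):
        url = row["url"]  # hypothetical column name
        if "null" in url:
            suspect.append(url)

print(f"{len(suspect)} URLs contain 'null'")

# Group by the path segment before the first "null" to see which templates
# are producing the malformed links.
prefixes = Counter(url.split("null")[0] for url in suspect)
for prefix, count in prefixes.most_common(10):
    print(count, prefix)
```
-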
What are the downsides and/or challenges to putting page paths (www.example.com/pagepath) on a different server?
Hi, our company is organized into three different segments, and our development team recently needed to switch a portion of the business to a subdomain because they wanted to move to a different server platform. We are now seeing the impact of moving this segment of the business to a subdomain on the main domain: SEO is hurting and our Moz score has dropped significantly. One fix they are debating is moving everything back to one domain, but placing segments of the business on different page paths and hosting specific paths on different servers. I.e., the main domain could be www.example.com hosted in one location, and then www.example.com/segment1 would be hosted on a different server. They are hoping to accomplish this using some sort of proxy/caching redirection solution. The goal of this change would be to recapture our domain strength. Is this a good option or not? If not, what challenges and issues do you see arising from a setup like that? I don't know of any other site set up like this. Thanks in advance.
Web Design | bradgreene
-
Moving servers means moving the IP address but keeping the same URL. Would it harm the website's SEO?
Hello everyone, the in-house server we use to host our website is a bit old. We are using CDN77 for our static content. What if I moved our entire website to the CDN service, meaning I use their storage capability and just have our URL point to the IP address they provide? Would that hurt our rankings?
Web Design | Edgar-Cerecerez
-
Duplicate Content? Designing a new site, but all the content got indexed on the developer's sandbox
An ecommerce site I'm helping is getting a complete redesign. Their developer had a sandbox version of the new site for design and testing, and several thousand products were loaded into it. Then Google/Bing crawled and indexed the site (because the developer didn't have a robots.txt), picking up and caching about 7,200 pages. There were even 2-3 orders placed on the sandbox site, so people were finding it. So what happens now?
When the sandbox site is transferred to the final version on the proper domain, is there a duplicate content issue?
How can the developer fix this?
Web Design | trafficmotion
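One common fix is to 301-redirect every sandbox URL to its counterpart on the production domain (and keep the sandbox behind robots.txt or authentication going forward). A minimal sketch of spot-checking such redirects, with hypothetical sandbox and production hostnames and placeholder paths:

```python
# Spot-check that sandbox URLs now 301 to the matching production URL, so the
# pages cached from the sandbox consolidate onto the proper domain.
import requests

SANDBOX = "http://sandbox.example.com"     # hypothetical sandbox host
PRODUCTION = "http://www.example.com"      # hypothetical production host
PATHS = ["/product/widget-123", "/category/widgets"]  # placeholder paths

for path in PATHS:
    resp = requests.get(SANDBOX + path, allow_redirects=False, timeout=15)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == PRODUCTION + path
    print(f"{'OK ' if ok else 'FIX'}  {resp.status_code}  {SANDBOX + path} -> {location}")
```
-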
AJAX endpoints returning 404 errors in GWT. Why!?!?
Hi guys, I'm working through a large dataset of 404 errors and trying to clean up the site's crawlability. A piece of the puzzle I can't seem to wrap my head around has to do with AJAX endpoints. It looks like GWT thinks these are URLs that don't exist and is therefore reporting them as 404 errors. Has anyone experienced this before?
Web Design | brad_dubs
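One common cleanup, if the endpoints' responses don't need to be indexed, is to keep Googlebot away from them via robots.txt so the 404s stop accumulating in the report. A minimal sketch of verifying such a rule with Python's standard-library robots.txt parser; the hostname and endpoint paths are hypothetical:

```python
# Confirm that a robots.txt Disallow rule actually blocks Googlebot from the
# AJAX endpoints that keep showing up as 404s in the crawl-error report.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # hypothetical site
rp.read()

endpoints = [
    "http://www.example.com/ajax/get-results",   # hypothetical endpoint paths
    "http://www.example.com/ajax/autocomplete",
]
for url in endpoints:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'blocked'}  {url}")
```
-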
How to put 'Link to this article' HTML code at the bottom of an article, and is it helpful?
Hello, I was thinking about putting a box at the bottom of my client's main articles that lets the reader easily copy the HTML code it takes to link to the article they're reading. Maybe I'd put it after the author bio. Do any of you do this? If so, what format do you use? It has to look nice, of course. This is a non-techie industry. Thanks.
Web Design | BobGW
-
Hard Lessons Learned... What's yours?
So I got a whole lot of help these past two weeks, and my rankings have been skyrocketing. Then I decided to start working on the on-page SEO in the lowest category of meaning, specifically on long-tail URLs. I shortened a few of my best keyword pages so they could be fully indexed... Let's just say I neglected to remember that, over two years, I had built some 30+ PR4-6 links to those pages. Rankings for those keywords dropped from 1-2 US listings to non-existent. Lesson learned. But I'm still smirking 🙂 What was your big lesson/mistake in the past week?
Web Design | HMCOE
-
The use of foreign characters and capital letters in URLs?
Hello all, we have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs:
http://www.twago.es/expert/Diseño-Web/Diseño-Web
However, when I simply copy and paste a URL that contains a special character, it is automatically translated and encoded:
http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone
(When written out longhand it appears as http://www.twago.es/expert/Aplicaci%C3%B3n-iPhone/Aplicaci%C3%B3n-iPhone.)
My first question: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even Spanish/German characters are usually written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most of our competitors). Does the anchor text have to match exactly? I know most web browsers can understand the special characters, especially when returning search results to users who type the special characters in their search query (or not). But we keep wondering: if we are doing the right thing, why does everyone else do it differently?
My second question is the same, but focusing on the use of capital letters in our URL structure.
NOTE: When we do a broken link check with some link tools (such as Xenu), the URLs that contain the Spanish special characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | wdziedzic
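The "automatic translation" described above is percent-encoding of the URL's UTF-8 bytes: browsers display the readable form, but what is actually transmitted (and what crawlers and link tools see) is the encoded form. A minimal sketch showing both forms of the same path with Python's standard urllib.parse:

```python
# Show how a URL path containing ñ is percent-encoded. The readable form and
# the encoded form refer to the same resource; tools differ in which they show.
from urllib.parse import quote, unquote

path = "/expert/Diseño-Web/Diseño-Web"
encoded = quote(path, safe="/")

print(encoded)           # /expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
print(unquote(encoded))  # /expert/Diseño-Web/Diseño-Web  (round-trips cleanly)
```

A link checker that flags these URLs as broken may simply be mishandling the encoding rather than finding genuinely dead pages.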