Crawl Diagnostics: 57 5XX (Server Error) responses
-
Hi guys,
My name is Rob, and I work at Burst! Creative Group in Vancouver, BC. We are having issues with 5XX (Server Error) responses: we have 57 of them and can't figure out why! We have never had this problem before, but we recently added a blog to our website and suspect this addition may have caused the problems. I'm hoping to get some helpful information from you Mozzers to eliminate these crawl errors. You can take a look at our website at www.burstcreativegroup.com. Thank you in advance for all of your input!
Rob
-
Also, have your traffic levels increased noticeably? It could be a server/hosting issue.
-
Thank you for your input, Lavellester. I will try this and let you know the outcome!
-
You could try fetching a few of the pages manually as Googlebot to see how they respond. 5xx errors can often be random/intermittent glitches.
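Alongside manual fetches, the server access logs usually show exactly which URLs are throwing the errors and how often. Here's a minimal sketch that tallies 5xx responses per URL, assuming the common Apache/Nginx combined log format (the sample lines are made up):

```python
import re
from collections import Counter

# Match the request path and status code in a combined-format log line, e.g.:
#   1.2.3.4 - - [17/Jun/2016:10:00:00 +0000] "GET /blog/ HTTP/1.1" 503 1234
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def count_5xx(log_lines):
    """Return a Counter mapping request path -> number of 5xx responses."""
    errors = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status").startswith("5"):
            errors[m.group("path")] += 1
    return errors

# Made-up sample lines; in practice, iterate over open("access.log").
sample = [
    '1.2.3.4 - - [17/Jun/2016:10:00:00 +0000] "GET /blog/ HTTP/1.1" 503 1234',
    '1.2.3.4 - - [17/Jun/2016:10:00:05 +0000] "GET / HTTP/1.1" 200 5678',
    '1.2.3.4 - - [17/Jun/2016:10:00:09 +0000] "GET /blog/ HTTP/1.1" 500 0',
]
print(count_5xx(sample).most_common(5))
```

If one path (say, the new blog) dominates the tally, that points at a plugin or resource-limit issue rather than random glitches.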
Related Questions
-
404 errors & old unused pages
I am using Shopify, and I need to delete some old pages which are coming up as 404 errors (product no longer available!). Does anyone know where you go to delete these pages which are no longer needed?
Web Design | | carleyb0 -
Have you changed 100s of links on your site? Tell me the whys, the hows, and the whats!
Hello there. If you've changed hundreds of links, then I'd like you to contribute to this thread. I've created a new URL structure for a website with 500+ posts in an effort to make it more user-friendly and more accessible to crawlers. I was just about to pull the trigger when I started reading up on the subject and found that I might have a few surprises waiting for me around the corner.

The status of my site:
- 500 posts
- 10 different categories
- 50+ tags
- No backlinks
- No recent hits (according to Google Analytics)
- No rankings

I'm going to keep roughly 75% of the posts and put them in different (new) categories to strengthen SEO for the topic I'd like to rank multiple categories for, and I've also sorted a list of content which I'd like to 410. So far I have:
- Created the new structure
- Created new categories
- Compiled a list of old URLs and new URLs
- Written new H1s, meta titles & descriptions
- Added new tags

It looks simple on paper, but I've got problems executing it.

**Question 1.** What do I need to keep in mind when deleting posts, categories, and tags - besides the 410s and Google URL removal?

**Question 2.** What do I do with all the old posts that I am going to redirect? Each post has between 10-15 internal links. I've started manually removing each link in old posts before 301'ing them. The reason I'm doing this is to control the UX, as well as internal link juice, to strengthen the main categories. Am I on the right path?

On a side note, I've prepared for the 301'ing by changing the H1s and meta data and adding alt text to images. But I can't help but think that just deleting the old posts and copying the content over to the new URLs (with the original dates set) would be a better alternative.

Any contribution to this thread would be greatly appreciated. Thank you in advance.
Web Design | | Dan-Louis1 -
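For a migration like the one described above, with a compiled list of old and new URLs, it usually pays to generate the server rules from that list rather than type them by hand. A minimal sketch (the paths are made-up examples, not from the actual site) that emits Apache mod_alias rules, using `Redirect gone` for the posts destined for a 410:

```python
# Hypothetical old-URL -> new-URL mapping; None means the post is being
# retired for good and should return 410 Gone instead of redirecting.
url_map = {
    "/old-category/my-post": "/guides/my-post",
    "/old-category/another-post": "/guides/another-post",
    "/deleted-post": None,
}

def to_htaccess(mapping):
    """Render the mapping as Apache mod_alias directives for .htaccess."""
    lines = []
    for old, new in sorted(mapping.items()):
        if new is None:
            lines.append(f"Redirect gone {old}")       # serves 410 Gone
        else:
            lines.append(f"Redirect 301 {old} {new}")  # permanent redirect
    return "\n".join(lines)

print(to_htaccess(url_map))
```

Generating the rules also gives you a single file to spot-check for redirect chains (an old URL pointing at another old URL) before going live.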
What's the world's best hosting?
Hello folks, I'm looking at hosting options. In your opinion, what's the best provider out there and why? Cheers, Gill.
Web Design | | Cannetastic0 -
Any second opinions as to why our organic search traffic hasn't recovered from a website rebrand (domain change, website redesign)?
I am hoping someone in the Moz community can help troubleshoot or lend advice on a major organic search traffic issue we've been experiencing over the last 8 months. In a nutshell, we decided our ~4.5-year-old business needed to undergo a rebrand in October 2015. After changing domains and redesigning our website (more below), our search-driven sessions have dropped 20% in 2016 vs. 2015. We made quite a few on-site modifications (with some success) post-redesign, but we are still deep in a rut and not sure what more we can do to recover. I've listed my theories below as to why we're still suffering this hit. If anyone could weigh in on these and/or share any other troubleshooting ideas, I would greatly, greatly appreciate it (and owe you a lunch/beverage of your choice the next time I'm in your city!).

- Backlinks - despite our efforts to 301 all links, I sense we have lost many backlinks. According to Open Site Explorer, our old domain has 1,172 backlinks (some from very authoritative domains), 1,068 of which are passing link equity. In contrast, our new domain has 367 backlinks, 321 of which are passing link equity, and very few overlap with our old domain.
- Domain age - we may have lost much of our reputation with Google, as our new domain is much younger than our old domain (1 year old vs. 5.5 years old).
- Domain name - although I thought having common keywords in one's domain was a myth, I am now questioning that belief. Our old domain contained a popular, topical keyword, and our new domain is derived from a term that is topical but very uncommon.
- New URLs - our developer insists all links were moved to the new domain, but I have a hunch they were not. When conducting a site search (i.e. "site:websitename.com"), the new domain returns 7,740 results. Prior to our switch, a site search on the old domain yielded 30,000+ results.
- 404s - we found and fixed 100-200 404'd links after the domain switch. We still see a few pop up today, and I'm wondering if this is a red flag in Google's eyes.

For a little more background, here are the nitty-gritty details with a rough timeline:

- Pre-October 12, 2015 - registered the new domain and designed the new website on WordPress, while researching a range of articles and resources for a successful site migration (e.g. this and this Moz guide).
- October 12, 2015 - flipped the switch on the website design, domain, minor content reorganization, and social handles. We announced the change to our audience via an article, newsletter, and social; informed Google Webmaster Tools (GWT) of the new address, 301'd all links from the old domain to the new, and submitted a new sitemap in GWT.
- October 12-16, 2015 - traffic is normal; everything seems to be okay.
- October 17, 2015 - search traffic drops by 54% vs. the same day of week pre-rebrand.
- October 26, 2015 - search traffic rises, now down only 30% vs. the same day of week pre-rebrand.
- November/December 2015 - re-added numerous elements from the old website, such as category, tag, and page pagination, plus a few sidebar modules that linked to other important pages and tags. Search traffic rises slightly in November (down 27% year-on-year), then dips again in December (down 31% year-on-year).
- January 2016 - today (June 17, 2016) - we published more content on a daily basis, and search traffic fluctuates around 20% down versus the same period in 2015:
  - January 2016 - down 23% year-on-year
  - February 2016 - down 17% year-on-year
  - March 2016 - down 20% year-on-year
  - April 2016 - down 21% year-on-year
  - May 2016 - down 21% year-on-year
  - June 2016 (until the 17th) - down 23% year-on-year

Thank you all in advance for your time and help. Please let me know if you have any questions!
Web Design | | nick490 -
Duplicate Content? Designing new site, but all content got indexed on developer's sandbox
An ecommerce site I'm helping is getting a complete redesign. Their developer had a sandbox version of the new site for design & testing. Several thousand products were loaded into the sandbox site. Then Google/Bing crawled and indexed it (because the developer didn't have a robots.txt), picking up and caching about 7,200 pages. There were even 2-3 orders placed on the sandbox site, so people were finding it. So what happens now?
Web Design | | trafficmotion
When the sandbox site is transferred to the final version on the proper domain, is there a duplicate content issue?
How can the developer fix this?0 -
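For a sandbox like the one described above, the usual first step is to keep engines away from the development host entirely. A sketch of the robots.txt that should have been on the sandbox host (never the live domain!):

```
# robots.txt on the sandbox host only -- never on the live site
User-agent: *
Disallow: /
```

One caveat: blocking crawling this way won't remove the ~7,200 pages already in the index, because engines can no longer crawl them to see any change. To deindex, it's generally better to serve a `noindex` (via a meta tag or an `X-Robots-Tag` response header) while leaving crawling open, or simply to 301 every sandbox URL to its counterpart on the proper domain at launch, which also resolves the duplicate-content question.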
What does it mean when "too many links" shows up in my report - but I'm not seeing them?
I've noticed that on the crawl report for my site, www.imageworkscreative.com, "too many links" is showing up as a chronic problem. Reviewing the pages cited as having this issue, I don't see more than 100 links. I've read that sometimes, websites are unintentionally cloaking their links, and I am concerned that this is what might be happening on my site. Some example pages from my crawl report are: http://www.imageworkscreative.com/blog/, http://www.imageworkscreative.com/blog/10-steps-seo-and-sem-success/index.html, and http://www.imageworkscreative.com/blog/business-objectives-vs-user-experience/index.html. Am I having a cloaking issue or is something else going on here? Any insight is appreciated!
Web Design | | ScottImageWorks0 -
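One thing worth checking for the question above: crawlers count every `<a href>` in the raw HTML, including links hidden by CSS/JS and links repeated in navigation, sidebars, and footers, so the count you see on screen can understate what a crawler sees. A quick sketch with Python's stdlib parser (the sample HTML is invented for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect every <a href="..."> in raw HTML, visible or not."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

# Made-up page fragment: note the link inside a display:none container.
sample_html = """
<nav><a href="/">Home</a> <a href="/blog/">Blog</a></nav>
<div style="display:none"><a href="/hidden-page">Hidden</a></div>
<p>No link in this paragraph.</p>
"""

counter = LinkCounter()
counter.feed(sample_html)
print(len(counter.hrefs))
```

Running this against the flagged pages' actual source (view-source, not the rendered page) should settle whether the count is real or a cloaking-style discrepancy.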
Image URLs and naming
We're re-platforming on Magento and wondering about our images. 1. Should I be concerned about 301 redirects for my images? 2. Is there a "best practice" path for images, or is just the name important? Right now, all our images are in /meta/images/sm, /lg, or /xlg. Since we're re-platforming, we're wondering if we should change the URLs. But I'm assuming this would require all of them to have 301 redirects, and with all the other redirects, I'm not sure this is really feasible. Thanks for any suggestions on this.
Web Design | | centralvacuumstores0 -
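On feasibility: because the old image URLs above follow a fixed pattern, a handful of pattern-based rules can redirect every file without a per-image entry. A sketch of the mapping logic (the new `/media/...` layout is a placeholder assumption, not Magento's actual structure):

```python
import re

# Map old size directories to hypothetical new ones.
SIZE_MAP = {"sm": "small", "lg": "large", "xlg": "xlarge"}
OLD_IMAGE = re.compile(r"^/meta/images/(sm|lg|xlg)/(.+)$")

def redirect_target(path):
    """Return the new URL for an old image path, or None if it doesn't match."""
    m = OLD_IMAGE.match(path)
    if not m:
        return None
    return f"/media/{SIZE_MAP[m.group(1)]}/{m.group(2)}"

print(redirect_target("/meta/images/sm/product-123.jpg"))
```

On Apache, the same idea collapses to one directive per size, e.g. `RedirectMatch 301 ^/meta/images/sm/(.*)$ /media/small/$1`, so three rules would cover the whole catalog.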
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters - most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs: http://www.twago.es/expert/Diseño-Web/Diseño-Web. However, when I simply copy-paste a URL that contains a special character, it is automatically translated and encoded, e.g. http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone.

My first question: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even Spanish/German characters are simply written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages, we USE the special characters in the anchor text (as do most competitors). Does the anchor text have to match the URL exactly? I know most web browsers can understand the special characters, especially when returning search results to users who type the special characters within their search query (or not). But we keep thinking: if we were doing the right thing, then why does everyone else do it differently?

My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken-link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | | wdziedzic0
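On the Xenu note above: what actually travels over the wire is not "ñ" but its percent-encoded UTF-8 bytes, which is why copied URLs come out encoded and why older tools that don't handle the raw characters may flag them as broken. A quick illustration with Python's stdlib:

```python
from urllib.parse import quote, unquote

# "Diseño-Web" as it travels over the wire: ñ (U+00F1) becomes the
# percent-encoded UTF-8 byte pair %C3%B1.
encoded = quote("Diseño-Web")
print(encoded)           # Dise%C3%B1o-Web
print(unquote(encoded))  # round-trips back to Diseño-Web
```

So both forms name the same resource; the question for SEO is mainly consistency - picking one canonical form (and one letter case) and redirecting the variants to it.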