WordPress 503 errors
-
We've launched a new website and I'm seeing a lot of 503 errors when I run Screaming Frog SEO Spider. Does anyone know how to solve this problem?
wwww.besteluieraanbiedingen.nl
Thanks!
-
It's indeed a paid option. I think the most important thing is that the server is responding correctly at the moment and that you find out what caused the errors. If you check any pages with a 503 status manually, do they respond?
Another option is to check the server logs to see if you can find anything there, or to call your hosting provider and ask why the server is giving 503 errors. If you didn't cause the problems, it's important to know what did, since they may come back and cause your website to stop working correctly again.
Webmaster Tools will only give you an overview of what Google finds at the moment it visits your website, so it might not give you the most accurate answer about what caused the problem.
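As a rough illustration of the log check suggested above, here is a short Python sketch that counts 503 responses per URL. It assumes a common Apache/Nginx combined log format; the sample lines are made up for illustration:

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format access log line.
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_503s(log_lines):
    """Return a Counter of request paths that received a 503 response."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "503":
            hits[m.group("path")] += 1
    return hits

sample = [
    '1.2.3.4 - - [01/Jan/2015:10:00:00 +0100] "GET / HTTP/1.1" 200 512',
    '1.2.3.4 - - [01/Jan/2015:10:00:01 +0100] "GET /shop HTTP/1.1" 503 0',
    '1.2.3.4 - - [01/Jan/2015:10:00:02 +0100] "GET /shop HTTP/1.1" 503 0',
]
print(count_503s(sample))  # shows which URLs are failing, and how often
```

If the 503s cluster around the time of a crawl, that points at the crawler rather than the host.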
-
I tried, but it's only available for licensed users.
Wait for what Webmaster Tools gives back, I guess?
-
Hi Happy SEO,
A 503 error is a server error, which can indicate server overload.
It's possible that you created these errors yourself by using Screaming Frog: the tool makes a lot of requests in a very short period. I would try running it again with the speed adjusted to 1 thread at 2 URLs/s.
You can find the speed settings under Configuration. If this doesn't resolve the problem, feel free to post a response.
PS: When I use the tool I only get a response from the homepage, which does suggest the server is blocking access.
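The effect of that speed setting can be shown with a minimal rate limiter. This is a hypothetical sketch of the idea, not Screaming Frog's actual implementation: at 2 URLs/s, each request must wait at least 0.5 s after the previous one.

```python
import time

class RateLimiter:
    """Enforce a minimum interval between requests (e.g. 2 URLs/s -> 0.5 s)."""

    def __init__(self, urls_per_second):
        self.min_interval = 1.0 / urls_per_second
        self.last_request = None

    def wait(self):
        """Sleep just long enough to respect the configured request rate."""
        now = time.monotonic()
        if self.last_request is not None:
            elapsed = now - self.last_request
            if elapsed < self.min_interval:
                time.sleep(self.min_interval - elapsed)
        self.last_request = time.monotonic()

limiter = RateLimiter(urls_per_second=2)
# Hypothetical crawl loop: one request roughly every 0.5 seconds.
# for url in urls_to_crawl:
#     limiter.wait()
#     fetch(url)
print(limiter.min_interval)  # 0.5
```

Throttling like this keeps the crawler from overwhelming a small server, which is exactly what a burst of 503s suggests is happening.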
Related Questions
-
Help recover lost traffic (70%) from robots.txt error.
Our site is a company information site with 15 million indexed pages (mostly company profiles). Recently we had an issue with a server that we replaced, and in the process mistakenly copied the robots.txt block from the staging server to a live server. By the time we realized the error, we had lost 2/3 of our indexed pages and a comparable amount of traffic. Apparently this error took place on 4/7/19 and was corrected two weeks later. We submitted new sitemaps to Google and asked them to validate the fix approximately a week ago. Given the close to 10 million pages that need to be validated, so far we have not seen any meaningful change. Will we ever get this traffic back? How long will it take? Any assistance will be greatly appreciated.
On another note, these indexed pages were never migrated to SSL for fear of losing traffic. If we have already lost the traffic and/or if it is going to take a long time to recover, should we migrate these pages to SSL? Thanks,
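The kind of staging block described above can be reproduced with Python's built-in robots.txt parser. The rules and URL below are hypothetical, for illustration only:

```python
from urllib.robotparser import RobotFileParser

# A typical staging-server robots.txt that blocks everything.
staging_rules = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(staging_rules.splitlines())

# With this file live, every profile page becomes uncrawlable.
print(rp.can_fetch("Googlebot", "/company/acme-profile"))  # False
```

Running the live robots.txt through a check like this before a server swap would catch the staging block before Google does.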
On-Page Optimization | akin671
-
Website server errors
I launched a new website at www.cheaptubes.com and had recovered my search engine rankings as well after the Penguin & Panda devastation. I was continuing to improve the site on Sept 26th by adding caching of images and W3 cache, but Moz Analytics is now saying I went from 288 medium issues to over 600, and I see the warning "45% of site pages served 302 redirects during the last crawl". I'm not sure how to fix this. I'm on WP using Yoast SEO, so all the redirects I set up are 301s, not 302s. I do have SSL; could it be HTTP vs. HTTPS? I've asked this question before and two very nice people replied with suggestions which I tried to implement but couldn't; I got the WP white screen of death several times. They suggested the code below. Does anyone know how to implement this code or some other way to reduce the errors I'm getting? I've asked this at Stack Overflow with no responses.
"You have a lot of http & https issues, so you should fix these with a bit of .htaccess code, as below:
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
You also have some non-www to www issues. You can fix these in .htaccess at the same time:
RewriteCond %{HTTP_HOST} !^www.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
You should find this fixes a lot of your issues. Also check in your WordPress general settings that the site is set to www.cheaptubes.com for both instances."
When I tried to do as they suggested it gave me an internal server error. Please see the code below from .htaccess and the server error. I took it out for now.
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
RewriteCond %{HTTP_HOST} !^www.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
</IfModule>
# END WordPress
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator, webmaster@cheaptubes.com, and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log. Additionally, a 500 Internal Server Error was encountered while trying to use an ErrorDocument to handle the request.
On-Page Optimization | cheaptubes
Robots.txt file issue on WordPress site.
I'm facing an issue with the robots.txt file on my blog. Two weeks ago I did some development work on my blog and added a few pages to the robots file. Now my complete site seems to be blocked. I have checked and updated the file and am still having the issue. The search result shows "A description for this result is not available because of this site's robots.txt – learn more." Any suggestions for overcoming this issue?
On-Page Optimization | Mustansar
-
Duplicate content errors
I have multiple duplicate content errors in my crawl diagnostics. The problem, though, is that I already took care of these with the canonical tag, but Moz keeps saying there is a problem. For example, the page http://www.letspump.dk/produkter/56-aminosyre/ has a canonical tag, but Moz still says it has an error. Why is that?
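One way to double-check that the canonical tag is actually being served is to parse the page source. A quick sketch using Python's standard HTML parser follows; the markup here is a made-up stand-in for the real page:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> in the page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Stand-in page source; in practice you would fetch the live URL's HTML.
html = """<html><head>
<link rel="canonical" href="http://www.letspump.dk/produkter/56-aminosyre/">
</head><body></body></html>"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)
```

If the tag is present in the served HTML, the remaining question is whether the crawler re-crawled the page since it was added.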
On-Page Optimization | toejklemme
-
Duplicate Content with ?Page IDs in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem that I have due to the page IDs that WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem is I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to page IDs. For example, this is how a page's URL should look on my site. Moz is telling me there are 50 duplicate content errors for this page. The page ID for this page is 82, so the duplicate content errors appear as follows, and so on, for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not an issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
On-Page Optimization | SpaMedica
-
Post URL not matching post title (WordPress)
I have this site called searchoflife.com on which I have noticed that the post URLs are not matching the post titles. For example:
Post title: A Dialogue With Nature
Post URL: http://searchoflife.com/dialogue-nature-2013-09-12
Words like 'A' and 'With' are not present in the post URL. This has been the trend for a few days now. After investigating, I found out that it was due to my plugin SEO Ultimate, which is automatically creating post slugs for the post URLs. So my question is whether it is advisable to use post slugs instead of the full post URL. Does it affect the SERPs for my site?
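The behaviour described above (short stop words dropped from the slug, a date appended) can be mimicked with a small slug generator. This is an illustrative guess at what such a plugin does, not SEO Ultimate's actual code:

```python
import re

# Hypothetical stop-word list; real plugins ship their own.
STOP_WORDS = {"a", "an", "the", "with", "and", "or", "of"}

def make_slug(title, date=None):
    """Lowercase the title, drop stop words, join with hyphens, append a date."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOP_WORDS]
    slug = "-".join(kept)
    return f"{slug}-{date}" if date else slug

print(make_slug("A Dialogue With Nature", date="2013-09-12"))
# dialogue-nature-2013-09-12
```

Dropping stop words generally doesn't hurt rankings by itself; the bigger concern is changing slugs on already-indexed posts without 301 redirects.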
On-Page Optimization | toxicpls
-
20 x '400' errors on site, but URLs work fine in browser...
Hi, I have a new client set up in SEOmoz and the crawl completed this morning. I am picking up 20 x '400' errors, but the pages listed in the crawl report load fine. Any ideas? Example: http://www.morethansport.co.uk/products?sortDirection=descending&sortField=Title&category=women-sports clothing
On-Page Optimization | Switch_Digital
-
Can you suggest a good on-page optimization tool for WordPress-based websites?
On-Page Optimization | dineshameh