How do I fix the 500 error when trying to use the page optimization tool?
-
I keep getting an error when using the page optimization tool. Moz staff replied when I used the chatbot and said that they're receiving a 500 error from my server and asked me to whitelist pagella; however, my server is not blocking anything. I don't know how to fix this issue. Any ideas?
I've attached a picture of the error message I'm receiving for reference.
-
Start a new Moz campaign with https as the URL.
-
This is the best guide.
Use this setup; you don't want your website duplicated across HTTP and HTTPS.
-
Put your http URL in Search and your https URL in Replace.
I hope this will help:
https://devnet.kentico.com/articles/url-redirection
https://devnet.kentico.com/articles/real-world-examples---part-ii
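After the search & replace, a quick way to spot any leftover plain-http references is to scan a page's HTML for them. Here is a rough, non-Kentico-specific sketch in Python (it assumes the requests package is installed; the URL is only a placeholder taken from this thread):
import re
import requests

# Rough sketch: list href/src values on a page that still point at plain http://.
# The URL below is only a placeholder -- swap in whichever page you want to check.
def find_http_references(page_url):
    html = requests.get(page_url, timeout=30).text
    return re.findall(r'(?:href|src)="(http://[^"]+)"', html)

for link in find_http_references("https://business.gogoair.com/"):
    print(link)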
-
Yes, we're using Kentico.
-
Hi Brittany,
Are you using a CMS?
Here is some information on MS Azure.
-
Hi Tom,
Yes, we just moved to https. When I look at the Moz campaign it just says it's tracking the subdomain business.gogoair.com. It doesn't say whether it's http or https, so I started a new campaign using https, tried using the page optimization tool, and got the same error as before.
We are hosted on Azure using a Web Application Gateway; are there any specific things we need to do in Moz to set this up properly?
Thanks,
Brittany
-
You just made the move to https, right?
I ran a 301 check: http 301s to https.
Google shows that the site is not yet in the SERPs as https.
What I think is happening is that you need to run a search & replace for all http URLs in the database, making them https URLs, so the bot won't have to crawl http and follow a 301 to https.
I will test using DeepCrawl if you say it's OK.
Is the Moz campaign set up to crawl http or https?
The site should work just fine after a search & replace.
I will send you the DeepCrawl.
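If it helps, here is a minimal Python sketch (assuming the requests package) that confirms both halves of the move: that the http version returns a 301 pointing at https, and that the https version itself answers 200:
import requests

# Minimal sketch: check that http 301s to https and that https responds with 200.
domain = "business.gogoair.com"
http_resp = requests.get("http://" + domain + "/", allow_redirects=False, timeout=30)
https_resp = requests.get("https://" + domain + "/", timeout=30)

print("http status:", http_resp.status_code, "-> Location:", http_resp.headers.get("Location"))
print("https status:", https_resp.status_code)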
Hope that helps,
Tom
-
This error is occurring on all our URLs.
-
Thanks, I will test it. Did this happen on all URLs or just one or two?
-
Sure, our domain is https://business.gogoair.com
-
If it's a 301, perhaps you'll get lucky and the .htaccess will be the cause.
-
This might be a 301. Could you please send the URL via PM?
-
Can you send me your domain? Via PM, or here if you're OK with that.
-
Hi Tom,
Thanks for your response. We ran our site through the tools you linked and didn't have any issues. My developer is still not seeing any issues in our error logs on our side, either. Any other ideas on how to fix this?
Thanks,
Brittany
-
Try running your site through https://technicalseo.com/seo-tools/fetch-render/, https://redbot.org, or https://www.screamingfrog.co.uk/seo-spider/ (only for the pages returning 500).
I have never seen a web application firewall generate 500 errors; it sounds to me like your server is underpowered or hitting some sort of code issue. Definitely look through your logs.
If you can't get through redbot.org, call your developer.
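If you would rather script the check, here is a rough Python sketch (assuming the requests package; the "rogerbot" string below is only a stand-in, not Moz's exact User-Agent) that fetches the same page with a browser-like and a crawler-like User-Agent and prints the status codes, which may help reproduce the 500 that Moz is seeing:
import requests

# Rough sketch: compare the status code returned to a browser-like User-Agent
# with the one returned to a crawler-like one. "rogerbot" is only a stand-in
# string here, not Moz's exact User-Agent.
url = "https://business.gogoair.com/"
user_agents = {
    "browser-like": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler-like": "rogerbot",
}

for label, ua in user_agents.items():
    resp = requests.get(url, headers={"User-Agent": ua}, timeout=30)
    print(label, "->", resp.status_code)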
Hope this helps,
Tom
-
In my experience, a 500 error on the WordPress platform has in most circumstances been caused by the memory limit set by WordPress. Assuming you're using WordPress, the way to increase your memory limit is within wp-config.php, by changing the memory limit value to something along these lines:
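// Raise the memory limit WordPress is allowed to use (this goes near the top of wp-config.php)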
define( 'WP_MEMORY_LIMIT', '256M' );
I'm pretty sure even an AWS EC2 micro instance has enough RAM to cover 256 MB. If it's not that, I would just run down this list of diagnostics:
https://www.lifewire.com/500-internal-server-error-explained-2622938
-
Hi there,
Tawny from Moz's Help Team here. I can confirm that we are getting a 500 response from your server. Unfortunately, other than making sure you're not blocking AWS and that Rogerbot is allowed, I'm not sure what else you can do. Have you checked over your server logs with your web developer to see if you can find any additional information as to why your server responded with a 500 error when we tried to crawl your site?
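One quick way to double-check the robots.txt side of that is Python's standard-library robots parser; this rough sketch only tells you whether robots.txt allows rogerbot, and says nothing about firewall or WAF rules:
from urllib.robotparser import RobotFileParser

# Rough check (standard library only): does robots.txt allow Moz's crawler,
# which identifies itself as rogerbot? This reads robots.txt and nothing else.
site = "https://business.gogoair.com/"
parser = RobotFileParser(site + "robots.txt")
parser.read()

for agent in ("rogerbot", "*"):
    print(agent, "->", "allowed" if parser.can_fetch(agent, site) else "blocked")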
If you've got any more questions, feel free to reach out to us at help@moz.com and we'll do our best to sort things out!