How do I fix the 500 error when trying to use the page optimization tool?
-
I keep getting an error when using the page optimization tool. Moz staff replied when I used the chatbot and said that they're receiving a 500 error from my server and to whitelist pagella; however, my server is not blocking anything. I don't know how to fix this issue. Any ideas?
I've attached a picture of the error message I'm receiving for reference.
-
Start a new Moz campaign with https as the URL.
-
This is the best guide.
Use this setup; you don't want your website duplicated across HTTP and HTTPS.
-
Put your http URL in the search field and https in the replace field.
I hope this helps:
https://devnet.kentico.com/articles/url-redirection
https://devnet.kentico.com/articles/real-world-examples---part-ii
-
Yes we're using Kentico.
-
Hi Brittany,
Are you using a CMS?
Here is some information on MS Azure.
-
Hi Tom,
Yes, we just moved to https. When I look at the Moz campaign it just says it's tracking the subdomain business.gogoair.com. It doesn't say whether it's http or https, so I started a new campaign using https, tried the page optimization tool, and got the same error as before.
We are hosted on Azure using a Web Application Gateway, are there any specific things we need to do in moz to set this up properly?
Thanks,
Brittany
-
You just made the move to https, right?
I ran a 301 check; http 301s to https.
Google shows that the site is not yet in the SERPs as https.
What I think is happening is that you have to run a search & replace over all http URLs in the database, making them https URLs, so the bot will not have to crawl http and follow a 301 to https.
I will test using DeepCrawl if you say OK.
Is the Moz campaign set up to crawl http or https?
The site should work just fine after a search & replace.
I will send you the DeepCrawl results.
Hope that helps,
Tom
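The search & replace described above — turning stored http links into https ones — can be sketched in a few lines. This is a hypothetical illustration only: a real replacement would run against the CMS database, and the `force_https` helper and the sample markup below are made up for this thread's domain:

```python
import re

def force_https(content, host="business.gogoair.com"):
    """Rewrite absolute http:// links pointing at the given host to https://."""
    # Only replace "http://" when it is immediately followed by our host,
    # so links to third-party http sites are left untouched.
    return re.sub(r"http://(?=%s)" % re.escape(host), "https://", content)

page = '<a href="http://business.gogoair.com/plans">Plans</a>'
print(force_https(page))
# → <a href="https://business.gogoair.com/plans">Plans</a>
```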
-
This error is occurring on all our URLs
-
Thanks, I will test it. Did this happen on all URLs or just one or two?
-
Sure our domain is https://business.gogoair.com
-
If it's a 301 issue, perhaps you'll get lucky and the .htaccess will be the cause.
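For reference, a typical http-to-https rule block in .htaccess looks like the sketch below. This assumes an Apache server (note the site in this thread is hosted on Azure, where .htaccess does not apply); the point is that a malformed rule in a file like this is a classic cause of 500 errors:

```apache
# Sketch of a standard http→https redirect. A syntax error in rules
# like these can itself surface as a 500 Internal Server Error.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```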
-
This might be a 301 issue. Can you please send the URL via PM?
-
Can you send me your domain? PM it, or post it here if you're OK with that.
-
Hi Tom,
Thanks for your response, we ran our site through the links you provided and didn't have any issues. My developer is still not seeing any issues in our error logs on our side either. Any other ideas on how to fix this?
Thanks,
Brittany
-
Try running your site through https://technicalseo.com/seo-tools/fetch-render/ or https://redbot.org, or use https://www.screamingfrog.co.uk/seo-spider/ on just the pages returning 500s.
I have never seen a web application firewall produce 500 errors. It sounds to me like your server is underpowered or has some sort of code issue; definitely look through your logs.
If you can't get through redbot.org, call your developer.
Hope this helps,
Tom
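A quick way to reproduce what those tools (and Moz's crawler) see is to request just the page headers yourself and inspect the status code. A minimal sketch, assuming Python is available; the `rogerbot` string here is simply a stand-in user agent, not Moz's exact header:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url, user_agent="rogerbot"):
    """Fetch only the headers for a URL and return the HTTP status code."""
    req = Request(url, headers={"User-Agent": user_agent}, method="HEAD")
    try:
        return urlopen(req, timeout=10).status
    except HTTPError as err:
        return err.code  # 4xx/5xx responses still carry a status code

def is_server_error(code):
    """True for any 5xx response, like the 500 Moz is reporting."""
    return 500 <= code < 600

# e.g. is_server_error(status_of("https://business.gogoair.com/")) being
# True would confirm the error is reproducible outside Moz's crawler.
```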
-
My experience with 500 errors on the WordPress platform is that in most circumstances they are caused by the memory limit set by WordPress. Assuming you're using WordPress, the way you would increase your memory limit is in wp-config.php, by changing the value to something along these lines:
define( 'WP_MEMORY_LIMIT', '256M' );
I'm pretty sure even an AWS EC2 micro instance has enough RAM to cover 256 MB. If it's not that, I would just run down the list of diagnostics:
https://www.lifewire.com/500-internal-server-error-explained-2622938
-
Hi there,
Tawny from Moz's Help Team here. I can confirm that we are getting a 500 response from your server. Unfortunately, other than making sure you're not blocking AWS and that Rogerbot is allowed, I'm not sure what else you can do. Have you checked over your server logs with your web developer to see if you can find any additional information as to why your server responded with a 500 error when we tried to crawl your site?
If you've got any more questions, feel free to reach out to us at help@moz.com and we'll do our best to sort things out!
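On the point above about making sure Rogerbot is allowed: you can check your robots.txt rules against a given user agent with Python's standard library. A sketch — the robots.txt content below is hypothetical, not the actual file on business.gogoair.com:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Rogerbot is allowed everywhere,
# while every other bot is blocked from /admin/.
robots_txt = """\
User-agent: rogerbot
Disallow:

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
print(parser.can_fetch("rogerbot", "https://business.gogoair.com/admin/"))  # True
print(parser.can_fetch("somebot", "https://business.gogoair.com/admin/"))   # False
```

Note this only checks robots.txt rules; a firewall or IP block would not show up here, which is why Tawny also suggests checking the server logs.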