Moz Crawl: Can't check page optimization on an https site
-
Help needed: when I try to run a Page Optimization check, I get the following error:
The URL you entered does not appear to be returning a page successfully. Please make sure that you've entered the URL of a valid, working page.
But I can run a Site Crawl without any issues, so what could be the problem?
I checked with Screaming Frog SEO Spider and had no problem, and robots.txt is also clean.
Does anyone know what could be wrong?
Thanks
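For anyone debugging this themselves: the error message suggests the On-Page tool needs the URL to return a successful (2xx) response, and some servers answer differently depending on the requesting user agent, which could explain why Site Crawl and Screaming Frog succeed while this check fails. Here is a rough sketch of the same basic "is this page returning successfully" test; the user-agent string and the exact behavior of Moz's checker are assumptions, not Moz's actual implementation:

```python
import urllib.request
import urllib.error

def is_success(status: int) -> bool:
    """A 2xx status is what an on-page checker needs in order to analyze a page."""
    return 200 <= status < 300

def check_page(url: str, timeout: float = 10.0):
    """Fetch the URL and report the final status code after redirects.

    Returns (status, ok); status is None if the host could not be reached.
    """
    # Hypothetical user agent: some servers block unfamiliar bots outright.
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0 (diagnostic check)"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status, is_success(resp.status)
    except urllib.error.HTTPError as e:
        return e.code, False          # server answered, but with an error status
    except urllib.error.URLError:
        return None, False            # DNS failure, TLS error, timeout, etc.
```

If `check_page` reports a redirect loop, a 4xx/5xx, or a TLS error for the https URL while the http version works, that narrows down what the tool is choking on.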
-
Thanks Kristina,
Working fine. Keep up the good work.
-
Hi again!
I just wanted to follow up with you here, as well as in your support ticket, to let you know that this has been resolved. We were experiencing some issues with On-Page reports for https:// sites, but those have since been fixed.
Thank you for bringing this to our attention. If you're still experiencing issues, please don't hesitate to report them to our team by sending a note to help@moz.com and we'll be happy to assist further!
Have a great day.
-Kristina
-
Hi Kristina,
Thank you for your feedback.
I hope you'll find out what's happening; I'll wait for the ticket.
-
Hi there! I apologize for the delay.
I've had a chance to speak to our technical team about this, and we're definitely seeing the same error on our end, but at first glance we're not able to tell WHY it's happening.
I'm going to open a support ticket on your behalf and then escalate this to be triaged further to see why it's happening. I apologize we weren't able to resolve this quickly for you, as I would have hoped.
You should get an email shortly with the details of your support ticket, and I'll keep you updated through that ticket as we work through this.
Sorry again for the inconvenience but hopefully we'll be able to sort this one out for you soon.
Thank you,
- Kristina
-
Hi there! Kristina from Moz's Help Team here. Sorry to hear about the trouble!
Can you provide the URL and the keyword you were plugging into the Page Optimization tool so I can take a closer look at the issue you're experiencing?
In the future, you can contact my team directly with any technical issues by emailing help@moz.com. That typically yields a faster turnaround than posting technical questions in the Q&A forum.
I look forward to helping you solve this mystery!
Thank you,
-Kristina
Related Questions
-
When I click into the MozBar on our website www.essexparts.com, it shows that the URL is Canadian and displays a Canada flag. The website and company are in the US. Can someone please explain this? Also, does the location have any bearing on how the site shows up in Google?
API | | Tindol0 -
Moz Not Accounting for Google Tags
The main problem I have is that the most serious issues facing my site have been dealt with using Google Tags, but Moz doesn't take this into account (this has been verified by a Moz employee) so I can't get an accurate account of what crawl issues need fixing and which ones are resolved. Does anyone know a way to make Moz Tools account for Google Tags?
API | | moon-boots0 -
September's Mozscape Update Broke; We're Building a New Index
Hey gang, I hate to write to you all again with more bad news, but such is life. Our big data team produced an index this week but, upon analysis, found that our crawlers had encountered a massive number of non-200 URLs, which meant this index was not only smaller, but also weirdly biased. PA and DA scores were way off, coverage of the right URLs went haywire, and the metrics we use to gauge quality told us this index simply was not good enough to launch. Thus, we're in the process of rebuilding an index as fast as possible, but this takes, at minimum, 19-20 days, and may take as long as 30 days.

This sucks. There's no excuse. We need to do better, and we owe all of you, and all of the folks who use Mozscape, better, more reliable updates. I'm embarrassed and so is the team. We all want to deliver the best product, but we continue to find problems we didn't account for, and have to go back and build systems into our software to look for them.

In the spirit of transparency (not as an excuse), the problem appears to be a large number of new subdomains that found their way into our crawlers and exposed us to issues fetching robots.txt files that timed out and stalled our crawlers. In addition, some new portions of the link graph we crawled exposed us to websites/pages that we need to find ways to exclude, as these abuse the metrics we use for prioritizing crawls (aka PageRank, much like Google, though they're obviously much more sophisticated and experienced with this) and bias us toward junky stuff, which keeps us from getting to the good stuff we need. We have dozens of ideas to fix this, and we've managed to fix problems like this in the past (prior issues like .cn domains overwhelming our index, link wheels, and webspam holes plagued us and have been addressed, but every couple of indices it seems we face a new challenge like this).

Our biggest issue is one of monitoring and processing times. We don't see what's in a web index until it's finished processing, which means we don't know whether we're building a good index until it's done. It's a lot of work to rebuild the processing system so there can be visibility at checkpoints, but that appears to be necessary right now. Unfortunately, it takes time away from building the new, realtime version of our index (which is what we really want to finish and launch!). Such is the frustration of trying to tweak an old system while simultaneously working on a new, better one. Tradeoffs have to be made.

For now, we're prioritizing fixing the old Mozscape system, getting a new index out as soon as possible, and then working to improve visibility and our crawl rules. I'm happy to answer any and all questions, and you have my deep, regretful apologies for once again letting you down. We will continue to do everything in our power to improve and fix these ongoing problems.
API | | randfish11 -
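An aside on the robots.txt failure mode described above: a crawler that fetches robots.txt without a hard timeout can be stalled indefinitely by a single slow host. A minimal sketch of the defensive pattern, fail fast and defer the host rather than block, is below. This is an illustration of the general technique, not Moz's actual crawler code:

```python
import socket
import urllib.request
import urllib.error
from urllib.parse import urlsplit

def robots_url_for(page_url: str) -> str:
    """robots.txt always lives at the root of the page's scheme://host."""
    parts = urlsplit(page_url)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

def fetch_robots(page_url: str, timeout: float = 5.0):
    """Fetch robots.txt with a hard timeout.

    Returns the robots.txt body, or None on any error or timeout, so the
    caller can skip or defer the host instead of stalling the whole crawl.
    """
    try:
        with urllib.request.urlopen(robots_url_for(page_url), timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, socket.timeout, TimeoutError):
        return None
```

The 5-second timeout is an arbitrary illustrative value; a production crawler would also cache results per host and back off on repeat offenders.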
In lieu of the canceled Moz Index update
Hey Moz, Overall we love your product and are using it daily to help us grow. Part of that has been relying on the Moz Index for DA and PA, as well as for places where we are doing positive linking through genuine partnerships and reviews of clients. We were really excited to see the results for this month, as we have been partner-linked from lots of high-reputation sites, and Google seems to agree, as our rankings are moving up weekly.

The question from our marketing team is: since a significant part of Moz will not be available to us this month, will there be any compensation handed out to the paying community?

PS: I am an engineer and I know you have probably lost a very large set of data which can't simply be re-crawled overnight, but Moz Pro is not a cheap product and we do expect it to work. Source: https://moz.com/products/api/updates Kind Regards.
API | | SundownerRV0 -
How much attention should I pay to Moz's DA/PA?
Hola! I've been optimising a site since October and our hard work has yielded a sizeable increase in organic traffic, revenue, quality, relevant links and Search Metrics scoring since commencing the campaign. After yesterday's Moz update, the DA has dropped slightly and a number of pages' PAs have dropped significantly (i.e. from 27 to 17). So here are my questions:

1. My 'white hat' optimisation is clearly working. The site is enjoying a more than 100% year-on-year increase in organic traffic and we're currently pulling in more organic visitors than ever before. Why is Moz's score not reflecting this?
2. Some of the pages that have seen sizeable PA drops have had their URLs changed since the last Moz update. For example, I've optimised a URL from www.mysite.com/cases-covers to www.mysite.com/phone-cases to coincide with search volumes. I've added optimised content to this page too, but the PA has dipped from 27 to 17. A 301 redirect has been correctly added, and this is evident by a PA of 17 and not zero, which is what a brand new page would have.
3. Am I paying too much attention to Moz's scores? It's a bit disheartening to see a drop after a lot of hard work. However, I guess the only thing that really counts is an increased volume of search traffic and revenue, right?

Cheers, Lewis
API | | PeaSoupDigital0 -
Does on-page grader have an API?
Hi, I would very much like to include the on-page grader output in my SEO tools. Is there an API for that? Thanks, James
API | | KMdayJob0 -
Top Pages metrics in OSE
Not sure if this is an API question or a feature request, but I'm wondering if other folks have a way to do this: in OSE there is the dashboard for a specific URL that is entered into the search bar, giving you metrics at a glance. But I often find myself going to the Top Pages tab to get a sense of the domain as a whole.

First off, wouldn't it be nice/is there a way to build my own "dashboard" based on info from that section? Specifically, I'd love to see at a glance the number of "top pages" that exist (many websites are well under the 10,000-page limit for this section, but there's no quick-glance metric showing that).

One thing that would be very handy for me would be a breakdown of HTTP status info across the whole domain, as being able to see the raw total of different statuses (and the percentage of each based on the total number of pages) would be really helpful, giving me a sense of whether I should dig into any issues before exporting the list to CSV.

I've found myself needing this type of info for multiple domains at once, so what would be REALLY cool would be a Google Doc where I could paste different domains into one column and have this info returned in other columns. I've searched through the Q&A and didn't find anything like this, and I don't know how easy or hard any of this would be to do, but I was wondering if anyone else had a sense of how to solve this problem and how feasible it would be to tackle. Thanks!
API | | John-E-Turner0 -
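The HTTP status breakdown asked about above can already be computed from a Top Pages CSV export. A rough sketch, where the CSV column names ("URL", "HTTP Status Code") are assumptions, so check the header row of your own export:

```python
import csv
from collections import Counter
from io import StringIO

def status_breakdown(rows):
    """Tally HTTP statuses across pages.

    rows is an iterable of (url, status) pairs; returns a dict mapping
    each status to (count, percentage of total), largest bucket first.
    """
    counts = Counter(status for _, status in rows)
    total = sum(counts.values())
    return {s: (n, round(100 * n / total, 1)) for s, n in counts.most_common()}

def breakdown_from_csv(text, url_col="URL", status_col="HTTP Status Code"):
    """Parse a Top Pages CSV export and summarize its status codes."""
    reader = csv.DictReader(StringIO(text))
    return status_breakdown((row.get(url_col, ""), row[status_col]) for row in reader)
```

Pasting the output for several domains side by side gets close to the multi-domain spreadsheet described above, albeit manually rather than live.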
Batch URL Error 413
Hello, I am using the free Mozscape API to get the Domain and Page Authority of URLs. I am batching the URLs as shown in the sample code; however, I have over 500 URLs to check, and when running the request I get the following:

stdClass Object ( [status] => 413 [error_message] => Too many urls in batch. Batches must be less than or equal to 200 urls. )

When I change it to 200 URLs the request works fine. Is there a way to batch all 500 URLs at once? I did read that the beta API is capable of batching more URLs in one request: http://moz.com/blog/400-higher-throughput-mozscape-api-now-in-beta-and-seeking-testers

Has this been implemented in the current API yet? Thanks
API | | pauledwards0
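Until a higher batch cap is available, the usual workaround is to split the 500 URLs into chunks of at most 200 and issue one request per chunk. The question above is in PHP, but the chunking logic is language-independent; here is a sketch in Python. The 10-second pause between batches is an assumption based on the free tier's historical rate limit, so adjust it for your plan:

```python
import time

MOZSCAPE_BATCH_CAP = 200  # from the 413 error message above

def chunked(items, size=MOZSCAPE_BATCH_CAP):
    """Yield successive slices no larger than the batch cap."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def fetch_in_batches(urls, fetch_batch, pause_s=10.0):
    """Call fetch_batch (your existing API request function) once per chunk
    and concatenate the results, pausing between requests to respect
    rate limits."""
    results = []
    for batch in chunked(urls):
        results.extend(fetch_batch(batch))
        time.sleep(pause_s)
    return results
```

With 500 URLs this produces three requests of 200, 200, and 100, each comfortably under the cap, and the combined results come back in the original order.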