Unsolved Performance Metrics crawl error
I am getting an error:
Crawl Error for mobile & desktop page crawl - The page returned a 4xx; Lighthouse could not analyze this page.
I have Lighthouse whitelisted. Is there any other site I need to whitelist? Is there anything else I need to do in Cloudflare or Datadome to allow this tool to work?
@bhsiao-0 said in Performance Metrics crawl error:
I am getting an error:
Crawl Error for mobile & desktop page crawl - The page returned a 4xx; Lighthouse could not analyze this page.
I have Lighthouse whitelisted. Is there any other site I need to whitelist? Is there anything else I need to do in Cloudflare or Datadome to allow this tool to work?

You're encountering a crawl error with Lighthouse on your website: the page returned a 4xx and couldn't be analyzed. Even though you've whitelisted Lighthouse, Cloudflare or Datadome may still be blocking the request. Double-check whether anything else related to Lighthouse needs whitelisting, and review your Cloudflare and Datadome settings (bot protection and firewall rules in particular) to make sure they aren't the source of the 4xx. If the issue persists, consider contacting the support teams for further troubleshooting.
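Not an official Moz diagnostic, but a quick way to see whether user-agent filtering is the culprit is to request the page yourself with and without a Lighthouse-style user agent. This is only a sketch: the URL is a placeholder, and the assumption that Moz's Performance Metrics crawl sends a user agent containing the "Chrome-Lighthouse" token should be confirmed with Moz support.

```python
import requests

URL = "https://www.example.com/"  # placeholder: the page your campaign crawls

BROWSER_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/120.0 Safari/537.36"
)
# Lighthouse normally appends a "Chrome-Lighthouse" token to its user agent;
# whether Moz's crawl sends the same token is an assumption.
LIGHTHOUSE_UA = BROWSER_UA + " Chrome-Lighthouse"

for label, ua in (("browser-like", BROWSER_UA), ("lighthouse-like", LIGHTHOUSE_UA)):
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=15)
    print(f"{label:16s} HTTP {resp.status_code}")

# A 200 for the browser-like request but a 4xx for the lighthouse-like one points
# at user-agent filtering in Cloudflare or Datadome rather than the page itself.
```

If both requests return a 4xx, the block is more likely IP- or challenge-based, which is also configurable in both Cloudflare and Datadome.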
#PerformanceMetrics #CrawlError #LighthouseTroubleshooting
Related Questions
Unsolved Moz crawler not crawling on my site
Hi all, I'm facing an issue where the Moz crawler is unable to crawl my site. The following error keeps showing: "Our crawler was banned by a page on your site, either through your robots.txt, the X-Robots-Tag HTTP header, or the meta robots tag." My robots.txt file is at https://www.wearefutureheads.com/robots.txt and I'm not sure what else I'm missing. Can anyone help?
Product Support | teikh
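As a first check, Python's standard library can report whether the robots.txt itself disallows Moz's crawler (rogerbot, as named elsewhere in these threads). This is only a sketch using the robots.txt URL quoted in the question; it will not catch a block coming from the X-Robots-Tag header or a meta robots tag, which the error message also mentions.

```python
from urllib.robotparser import RobotFileParser

robots_url = "https://www.wearefutureheads.com/robots.txt"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()

# Substitute any page URL from the site you want to test.
page = "https://www.wearefutureheads.com/"
print(page, "allowed for rogerbot:", parser.can_fetch("rogerbot", page))

# If this prints True but Moz still reports a ban, inspect the X-Robots-Tag
# response header and the <meta name="robots"> tag rather than robots.txt.
```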
Track SEO performance for specific sub-directories
How can I track performance metrics for a group of subdirectories, e.g.:
domain.com/de/en_uk
domain.com/de/de_de
domain.com/at/en_uk
domain.com/at/de_de
SEO Tactics | Miradoro
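If you end up exporting URL-level metrics (for example as a CSV) and rolling them up yourself, grouping by the first two path segments is straightforward. A minimal sketch, with made-up URLs that follow the /market/locale/ pattern above:

```python
from collections import defaultdict
from urllib.parse import urlsplit

# Hypothetical export of URLs following the /<market>/<locale>/ pattern.
urls = [
    "https://domain.com/de/en_uk/pricing",
    "https://domain.com/de/de_de/pricing",
    "https://domain.com/at/en_uk/",
    "https://domain.com/at/de_de/blog/post-1",
]

groups = defaultdict(list)
for url in urls:
    segments = urlsplit(url).path.strip("/").split("/")
    market, locale = segments[0], segments[1]  # e.g. ("de", "en_uk")
    groups[(market, locale)].append(url)

for (market, locale), members in sorted(groups.items()):
    print(f"/{market}/{locale}/ -> {len(members)} URL(s)")
```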
Unsolved Strange "?offset" URL found with content crawl issues
I recently received a slew of content crawl issues via Moz for URLs that I have never seen before. For example:
Standard URL: https://skilldirector.com/news
Newly identified URL: https://skilldirector.com/news?offset=1469542207800&category=Competency+Management
Does anyone know where the URL comes from and how to fix it?
Moz Pro | HannahPalamara
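For what it's worth, the query string can be decoded to see what it carries: the offset value parses as a millisecond Unix timestamp, which is typical of timestamp-based blog pagination links. Whether that is what this particular platform generates is a guess worth verifying; the snippet only shows the decoding.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs, urlsplit

url = "https://skilldirector.com/news?offset=1469542207800&category=Competency+Management"
params = parse_qs(urlsplit(url).query)

offset_ms = int(params["offset"][0])
print(datetime.fromtimestamp(offset_ms / 1000, tz=timezone.utc))  # a date in July 2016
print(params["category"][0])  # "Competency Management"
```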
Unsolved Rogerbot blocked by Cloudflare and not displaying the full user agent string
Hi, we're trying to get Moz to crawl our site, but when we use Create Your Campaign we get the error: "Ooops. Our crawlers are unable to access that URL - please check to make sure it is correct. If the issue persists, check out this article for further help."
robots.txt is fine, and we can actually see that Cloudflare is blocking the crawler with Bot Fight Mode. We've added some rules to allow rogerbot, but these seem to be getting ignored. If we use a robots.txt test tool (https://technicalseo.com/tools/robots-txt/) with rogerbot as the user agent, it gets through fine and we can see our rule has allowed it. When viewing the Cloudflare activity log (attached), it seems Create Your Campaign is trying to crawl the site with the user agent simply set as rogerbot 1.2, whereas the robots.txt testing tool uses the full user agent string rogerbot/1.0 (http://moz.com/help/pro/what-is-rogerbot-, rogerbot-crawler+shiny@moz.com), albeit version 1.0. So it seems as if Cloudflare doesn't like the simple user agent. Is it correct that when Moz tries to crawl the site it now uses the simple string of just rogerbot 1.2? Thanks, Ben
[Attachment: Cloudflare activity log showing the differences in user agent strings]
Moz Pro | BB_NPG
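One way to test the theory that the short user agent is what Cloudflare dislikes is to request the site yourself with both strings and compare the responses. The URL below is a placeholder, the two user agent strings are the ones quoted in the question, and this only approximates what Moz's crawler actually sends.

```python
import requests

URL = "https://www.example.com/"  # placeholder: your campaign's start URL

SHORT_UA = "rogerbot 1.2"  # what the Cloudflare activity log showed
FULL_UA = (
    "rogerbot/1.0 (http://moz.com/help/pro/what-is-rogerbot-, "
    "rogerbot-crawler+shiny@moz.com)"
)  # the full string the robots.txt testing tool used

for label, ua in (("short UA", SHORT_UA), ("full UA", FULL_UA)):
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=15)
    print(f"{label}: HTTP {resp.status_code}")

# If the short UA is challenged or blocked (e.g. 403/503) while the full UA gets
# through, the Cloudflare allow rule needs to match a substring such as "rogerbot"
# rather than the exact full user agent string.
```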
Unsolved /%25s
Hi Community, has anyone else had a 404 error reported by Moz where the end of the URL is /%25s? The error comes from my blog home page, https://kaydee.net/blog/, but when I look at the source code, I can't see anything that has a space at the end of the URL. I wonder if it is to do with the WordPress search? Thanks in advance for any insight.
Moz Pro | kaydeeweb
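One small clue: %25 is the percent-encoding of the % character, so the path decodes to a literal /%s, i.e. an unreplaced printf-style placeholder. That is consistent with a search or permalink template (such as the WordPress search form mentioned in the question) emitting its placeholder unfilled, though that interpretation is a guess.

```python
from urllib.parse import unquote

print(unquote("/%25s"))  # -> "/%s", an unsubstituted placeholder
```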
Solved Site Crawl Won't Complete
How can I start/restart a new site crawl? I requested one 2 days ago on one of my sites, and it won't complete. It's only 150 pages.
Product Support | PaulBarrs
How to block Rogerbot From Crawling UTM URLs
I am trying to block rogerbot from crawling some UTM URLs we have created, but having no luck. My robots.txt file looks like:
User-agent: rogerbot
Disallow: /?utm_source*
This does not seem to be working. Any ideas?
Product Support | Firestarter-SEO
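A rough illustration of why the current rule may match very little, assuming rogerbot interprets * as a Google-style wildcard (worth confirming with Moz's documentation): /?utm_source* only matches URLs whose path starts with /?utm_source, i.e. UTM-tagged homepage URLs, while a pattern like /*utm_source= matches the parameter on any path. The sketch below translates the two patterns into regexes purely to show the difference; it is not an official robots.txt parser.

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Google-style matching: '*' matches any sequence of characters,
    # '$' anchors the end; everything else is a literal prefix match.
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.compile(regex)

urls = [
    "/?utm_source=newsletter",
    "/some-page/?utm_source=newsletter",
    "/some-page/?ref=x&utm_source=newsletter",
]

for pattern in ("/?utm_source*", "/*utm_source="):
    rule = robots_pattern_to_regex(pattern)
    matched = [u for u in urls if rule.match(u)]
    print(pattern, "->", matched)
# The original pattern matches only the first URL; the wildcard form matches all three.
```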
HTTPS (SSL) Error Encountered.
Hi guys, we received the following error from our Moz report. Our hosting company says it isn't an issue, but I wanted to get your feedback as it seems a bit odd. We recently moved eventfull.co.nz to a VPS and set up an EV SSL certificate. We moved another site to the same VPS, added an EV SSL certificate, and it isn't reporting any issue.

Error message:
"Crawl Error - Error Code 804: HTTPS (SSL) Error Encountered. Your page requires an SSL security certificate to load (using HTTPS), but the Moz Crawler encountered an error when trying to load the certificate. Our crawler is pretty standard, so it's likely that other browsers and crawlers may also encounter this error. If you have this error on your homepage, it prevents the Moz crawler (and some search engines) from crawling the rest of your site."

Hosting company's feedback:
"I am just running some tests on the SSL cert installation now, but so far all appears to be fine. I've checked your .htaccess file and there is nothing that should be blocking them, and the logs show nothing unexpected or any SSL failures. SSL Labs test returns an A rating with no errors. One possibility (though it's a long shot) is that we never increased the HSTS time beyond 3600, so it's possible that the crawler is failing because of this perceived insecurity. However, if there is some other issue going on, increasing the HSTS time would be a bad idea, so I suggest some further monitoring and testing. You may need to contact Moz about the issue and see if they can help. Is there any other evidence of SSL failing to load? Have you experienced any other issues, or is it only the Moz report that is indicating any problem?"
Product Support | ModowestNZ
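If you want to reproduce what a strict client sees (browsers sometimes tolerate an incomplete certificate chain that stricter clients and crawlers reject), a TLS handshake with default verification is a quick test. A minimal sketch, using the hostname mentioned in the question; the incomplete-chain explanation is only one possible cause of an 804 error, not a confirmed diagnosis.

```python
import socket
import ssl

host = "eventfull.co.nz"  # hostname from the question

context = ssl.create_default_context()  # system CA store, strict verification

try:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            print("Handshake OK, certificate subject:", cert.get("subject"))
except ssl.SSLError as exc:
    # A failure here (e.g. "unable to get local issuer certificate") often means
    # the server is not sending its full intermediate chain, which browsers may
    # tolerate but stricter clients and crawlers may not.
    print("TLS handshake failed:", exc)
```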