"Moz encountered an error on one or more pages on your site" Error
-
I have been receiving this error for a while:
"Moz encountered an error on one or more pages on your site"
It's a multilingual WordPress website, the robots.txt is set to allow crawlers on all URLs, and I have followed the same process as for other websites I've done, yet I'm receiving this error for this site.
-
If you want to be doubly sure, you can try Fetch as Google (under Crawl) in Search Console.
You can also try crawling with a third-party tool such as Screaming Frog (https://www.screamingfrog.co.uk/seo-spider/).
If these respond appropriately, it may be worth trying a crawl again later (as the Crawl Test suggests) - it may just be a temporary glitch in the matrix.
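If you'd rather check from the command line, below is a rough PHP sketch that fetches a page the way a crawler might and peeks at robots.txt. It assumes Moz's crawler identifies itself as "rogerbot" (the exact user-agent string may differ) and the file/function names are just placeholders, so treat it as a quick sanity check rather than a definitive test.

```php
<?php
// Rough crawl sanity check. Run from the command line:
//   php crawl-check.php https://www.example.com/
// Assumes the crawler identifies as "rogerbot"; adjust the user agent to taste.

$url       = $argv[1] ?? 'https://www.example.com/';
$userAgent = 'rogerbot';

function fetch_as(string $url, string $userAgent): array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_USERAGENT      => $userAgent,
        CURLOPT_TIMEOUT        => 30,
    ]);
    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return [$status, $body];
}

// 1. Does the page return a 200 for that user agent?
[$status] = fetch_as($url, $userAgent);
echo "Page status: {$status}\n";

// 2. Is robots.txt reachable, and does it single out the crawler?
$robotsUrl = parse_url($url, PHP_URL_SCHEME) . '://' . parse_url($url, PHP_URL_HOST) . '/robots.txt';
[$robotsStatus, $robots] = fetch_as($robotsUrl, $userAgent);
echo "robots.txt status: {$robotsStatus}\n";
if (is_string($robots) && stripos($robots, 'rogerbot') !== false) {
    echo "robots.txt mentions rogerbot - review those rules manually.\n";
}
```

If this shows a 200 and nothing odd in robots.txt, the pages themselves are probably fine and the Moz error is more likely a temporary issue with that particular crawl.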
Update us if things don't settle down or you get unexpected results following the above.
Don't forget the HELP! button is there if you need it (it's right next to the PANIC! button on your keyboard): https://moz.com/help/contact
-
Can you elaborate? Which Moz tool were you using at the time (it sounds like a Crawl Test)?
Can you expand the message to reveal the error?
Is your site hosted on a CMS such as SquareSpace, for example?
Is it crawlable using similar external tools?
Do you have any errors within Google Search Console? It's always good to start there.
Please get back to us with some more details and we'll be better positioned to help.
Cheers.
Related Questions
-
Google Search Console "Text too small to read" Errors
What are the guidelines / best practices for clearing these errors? Google has some pretty vague documentation on how to handle this sort of error. User behavior metrics in GA are pretty much in line with desktop usage and don't show anything concerning. Any input is appreciated! Thanks.
Technical SEO | Digital_Reach -
Rel="canonical" again
Hello everyone, I need to rel="canonical" my two-language website's /en URLs to the original version without /en. Can I do this from header.php? Should I add rel="canonical" to each /en page (e.g. en/contatti, en/pagina) separately, or can I do it all from the general header, before the website title? Thanks if someone can help.
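As a rough illustration, a single snippet in header.php (inside the head, before the title) could cover every /en page at once. This is a minimal sketch, assuming every /en URL mirrors a non-/en URL one-to-one and relying on the standard WordPress helpers home_url() and esc_url(); whether you should canonicalize translations at all is a separate question (hreflang is the usual mechanism for language variants).

```php
<?php
// Sketch for header.php, inside <head>, assuming /en/whatever always
// mirrors /whatever (e.g. /en/contatti -> /contatti). Adjust the pattern
// to your permalink structure before relying on it.
$request_path = $_SERVER['REQUEST_URI'];

if (preg_match('#^/en(/|$)#', $request_path)) {
    // Strip the /en prefix to get the original-language URL.
    $canonical_path = preg_replace('#^/en#', '', $request_path) ?: '/';
    echo '<link rel="canonical" href="' . esc_url(home_url($canonical_path)) . '" />' . "\n";
}
```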
Technical SEO | socialengaged -
Rel="canonical"
Hi, I have a site named www.cufflinksman.com related to cufflinks. I have also installed WordPress on the subdomain blog.cufflinksman.com. I am getting a duplicate content issue: the site and blog have the same categories but different content. Now I would like to rel="canonical" the blog categories to the site categories. http://www.cufflinksman.com/shop-cufflinks-by-hobbies-interests-movies-superhero-cufflinks.html http://blog.cufflinksman.com/category/superhero-cufflinks-2/ Is this possible, and would there be any problem with Google with this approach?
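As a rough illustration only: a cross-domain canonical like this is usually done with an explicit mapping in the blog theme. Below is a minimal sketch, assuming a standard WordPress setup on the blog subdomain; the function name and the $map array are placeholders you'd fill in per category.

```php
<?php
// Sketch for the blog theme's functions.php: map blog category slugs to
// the matching shop category URLs and print a canonical on those archives.
function blog_to_shop_canonical() {
    if (!is_category()) {
        return;
    }
    $map = [
        'superhero-cufflinks-2' => 'http://www.cufflinksman.com/shop-cufflinks-by-hobbies-interests-movies-superhero-cufflinks.html',
        // add one entry per blog category...
    ];
    $term = get_queried_object();
    $slug = $term ? $term->slug : '';
    if (isset($map[$slug])) {
        echo '<link rel="canonical" href="' . esc_url($map[$slug]) . '" />' . "\n";
    }
}
add_action('wp_head', 'blog_to_shop_canonical');
```

Bear in mind that rel="canonical" is a hint rather than a directive, and Google may ignore it where the two pages aren't substantially duplicate content.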
Technical SEO | cufflinksman -
Noindex search result pages on a classifieds site
Dear all, is it a good idea to noindex the search result pages of a classifieds site? Taking into account that category pages are also search result pages, I would say it is not a good idea, but all the information is in the sitemap and Google can index the individual listings (which are index, follow) anyway. What would you do? And what effect does marking the search result pages as "search results" with schema.org microdata have on the indexing of the site? Many thanks for your help. Best regards, Daniel
Technical SEO | te_c
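For the internal-search noindex question above, the usual pattern (if you decide to go that way) is a robots meta tag or an X-Robots-Tag header on the search results template only, leaving category pages indexable. A generic PHP sketch, not tied to any particular platform, assuming search result URLs carry a "q" query parameter - adjust to your own URL scheme:

```php
<?php
// Mark only internal search result pages as noindex, follow.
// Category archives are left untouched so they stay indexable.
$is_search_results = isset($_GET['q']) && $_GET['q'] !== '';

if ($is_search_results) {
    // Header form: must run before any output is sent.
    header('X-Robots-Tag: noindex, follow');
}
?>
<!-- ...later, inside <head> of the search results template: -->
<?php if ($is_search_results): ?>
    <meta name="robots" content="noindex, follow">
<?php endif; ?>
```

The schema.org SearchResultsPage type only describes what the page is; it doesn't control indexing, so robots directives remain the lever for that.
-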
My pages say they have 16 errors, need help
My pages say they have 16 errors, and all of them are due to duplicate content. How do I fix this? I believe it's only due to my meta description tag.
Technical SEO | gaji -
Should XML sitemaps include *all* pages or just the deeper ones?
Hi guys, OK, this is a bit of a sitemap 101 question but I can't find a definitive answer: when we're running out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in mentioning the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just send a map of the bottom-level pages. What do you think?
Technical SEO | timwills -
Help: Google Time Spent Downloading a Page, My Site is Slow
All, my site, http://www.nationalbankruptcyforum.com, shows an average time spent downloading a page of 1,489 milliseconds. We've had spikes of well over 3,000 and lows of around 980 (all according to WMT). I understand that this is really slow. Does anyone have suggestions as to how I could improve load times? Constructive criticism is welcomed and encouraged.
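The usual levers for a figure like that are page caching, gzip compression, image optimization, trimming plugins, and better hosting or a CDN. To track where you stand before and after changes, here's a rough PHP sketch that measures raw download time for a handful of URLs; it only reflects what you see from your own location, not exactly what Googlebot records in WMT.

```php
<?php
// Measure raw page download time in milliseconds for a list of URLs.
// Run several samples and look at the spread, not a single number.
function time_download(string $url): float {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_TIMEOUT        => 30,
    ]);
    curl_exec($ch);
    $seconds = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    curl_close($ch);
    return $seconds * 1000; // milliseconds, to match the WMT report
}

foreach (['http://www.nationalbankruptcyforum.com/'] as $url) {
    printf("%-50s %.0f ms\n", $url, time_download($url));
}
```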
Technical SEO | JSOC