"Sufficient Words in Content" error, despite having more than 300 words
-
My client has just moved to a new website, and I now receive a "Sufficient Words in Content" error on every page of the site, although those pages contain far more than 300 words. For example:
- https://www.assuta.co.il/category/assuta_sperm_bank/
- https://www.assuta.co.il/category/international_bank_sperm_donor/
I also see warnings for "Exact Keyword Used in Document at Least Once", although the keywords are used on the pages.
The question is: why can't the Moz crawler see the pages' content?
-
Well, shoot. Apparently, I'm not quite as familiar with UTF-8 characters and that distinction as I thought. Totally my mistake.
The line for our tools falls closer to English character sets. Even seemingly small modifiers like accents can throw off our ability to accurately count words or detect matches on pages. So when it comes to Hebrew characters, we simply don't have a good way to handle them.
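To see that gap concretely, here is a minimal sketch (the Hebrew sentence is invented for the example, and Moz's actual tokenizer is not public): a Latin-only word pattern finds nothing in Hebrew text, while Python 3's Unicode-aware `\w` counts the words fine.

```python
import re

# Invented Hebrew sentence (8 words) standing in for the page copy.
hebrew = "בנק הזרע של אסותא מציע שירותי תרומת זרע"

latin_only = re.findall(r"[A-Za-z]+", hebrew)  # what a Latin-centric counter sees
unicode_words = re.findall(r"\w+", hebrew)     # \w matches Hebrew letters on str in Python 3

print(len(latin_only))     # 0 -- the page looks "empty" to the counter
print(len(unicode_words))  # 8 -- the words are really there
```

The page encoding is irrelevant here: once the bytes are decoded, it is the word-matching pattern, not UTF-8 itself, that decides whether Hebrew text is counted.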
-
But the page is UTF-8. Take a look at the source code: view-source:https://www.assuta.co.il/category/assuta_sperm_bank/
Moz did a pretty good job with the old website before it was changed, but now all my grades are 50, even though the site is UTF-8 (on the .NET Framework, if that matters).
Can you re-check that, please?
Michal
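For what it's worth, checking what encoding a server actually declares is straightforward. Here's a sketch using only Python's standard library (the helper names are mine, and the header parsing is deliberately simple):

```python
from urllib.request import Request, urlopen

def charset_from_content_type(content_type: str) -> str:
    """Pull the charset out of a Content-Type header, e.g. 'text/html; charset=utf-8'."""
    for part in content_type.split(";"):
        part = part.strip()
        if part.lower().startswith("charset="):
            return part.split("=", 1)[1].strip().lower()
    return ""  # no charset declared in the header

def declared_charset(url: str) -> str:
    """Fetch a URL and report the charset its server declares."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req) as resp:
        return charset_from_content_type(resp.headers.get("Content-Type", ""))

# declared_charset("https://www.assuta.co.il/category/assuta_sperm_bank/")
# should report "utf-8" if the server sends it in the Content-Type header.
```

Note that a page can also declare its charset in a `<meta>` tag; the response header is just the first place a crawler looks.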
-
Hey Michal!
I'm really sorry you're running into this issue! Unfortunately, it looks to be cropping up because of the non-Latin characters used on the page. Our crawler has a very difficult time interpreting non-Latin characters, and often reports counts and matches poorly when looking at pages comprised of them.
I'm terribly sorry for the inconvenience! It's definitely something that we're looking to address down the road, but I'm afraid we don't have the resources to improve that functionality at the moment.
Related Questions
-
How to Fix Repeating 404 Error on Blog
I've been getting this same 404 error for a ton of pages on my blog (blog.twowayradiosfor.com) out of nowhere, and I can't figure out how to fix it. I have about 500 pages experiencing the same issue (as shown in the image I've attached/linked to). Each has the correct link, but the part that gets flagged as a 404 adds /TwoWayRadiosFor.com at the end, which is apparently the issue. Is there a reason these have just now appeared even though the blog posts are from years ago? Is there an easy way to fix them? Thanks, Sawyer
-
404 Error - Please Help Us
When we checked our valuable Top Pages, we noticed two types of 404 pages listed for our domain. Example 1: www.test.com/www.test.com/ecommerce.html. Example 2: www.test.com/test/ecommerce.html. But we do not see any such 404 errors in the Google Webmaster tool; only the Moz Top Pages section shows these as errors. So please advise whether these are major errors or not. If they are, please help us fix this, as we do not have such URLs on our domain. Awaiting your urgent help.
-
612 : Page banned by error response for robots.txt
Hi all,
I ran a crawl on my site https://www.drbillsukala.com.au and received the following error: "612 : Page banned by error response for robots.txt." Before anyone mentions it, yes, I have been through all the other threads, but they did not help me resolve this issue. I am able to view my robots.txt file in a browser: https://www.drbillsukala.com.au/robots.txt.
The permissions are set to 644 on the robots.txt file, so it should be accessible.
My Google Search Console does not show any issues with my robots.txt file.
I am running my site through the StackPath CDN, but I'm not inclined to think that's the culprit. One thing I did find odd is that even though I entered my website with the https protocol (I double checked), the Moz spreadsheet listed my site with the http protocol. I'd welcome any feedback you might have. Thanks in advance for your help.
Kind regards
-
How Do You Deal With Duplicate Content On A Retail Site
Hi Guys, we make custom portfolios and boxes, and every time we have a Moz site crawl, Rogerbot returns a number of duplicate content issues relating to different products. As per the image below (which I hope is visible!), Rogerbot has flagged duplicate content on one product that relates to 5 other different products. For instance, there is duplicate content for an A4 Leather Portfolio, an A3 Leather Portfolio, and an 11"x17" Leather Portfolio. I can't redirect or canonicalise to just the A4 Portfolio, as they are all individually different products. The information on each page, although similar, is relevant to each of the products, so rewriting a different blurb on each product page would not be user friendly for our customers. I could ignore the duplicate content issues, but that isn't good practice (and also makes for a very unsatisfactory-looking dashboard!) Any ideas? Nick
-
How do I fix 885 Duplicate Page Content Errors appearing in my Moz Report due to categories?
Hi there, I want to set up my Moz report to send directly to a client; however, there are currently 885 duplicate page content errors displaying on the report. These are mostly caused by an item being listed in multiple categories, where each category is a new page/URL. I guess my questions are: 1. Does Google see these as duplicate page content, or does it understand the categories are there for navigation purposes? 2. How do I clear these off my Moz report so that the client doesn't panic that there are major issues on the site? Thanks for your advice.
-
Why is Moz telling me I have duplicate content, but neither the content nor the URLs are duplicates?
I just upgraded our website to a new one. This is the first crawl of the new website. It is telling me I have 24 critical issues, all of which are duplicate content errors. Thing is, the URLs are not duplicates, and the content on the pages is not duplicated either. For example, here is one error where it says there are two duplicates:
http://winterguardtarps.com/portfolio-item/props-10
http://winterguardtarps.com/portfolio-item/props-2
http://winterguardtarps.com/portfolio-item/props-7
These are photos in our portfolio, and none of them are the same. I'm a bit of a noob, but what am I missing here?
-
Error getting your data in Moz OSE
Why am I receiving a "There was an error getting your data" message in Moz OSE? Everything worked fine yesterday, but now I'm having trouble getting link metrics for my site.
-
Learn how to use Open Site Explorer's Top Pages report to help inform your content marketing efforts. Get your Daily SEO Fix!
With the Top Pages report, you can see the pages on your site (and your competitors') that are top performers. The pages are sorted by Page Authority - a prediction of how well a specific page will rank in search engines - and also show metrics for linking root domains, inbound links, HTTP status, and social shares. Be sure to watch today's Daily SEO Fix video tutorial to learn how to use Open Site Explorer's Top Pages report to analyze the competition's content marketing efforts and to inform your own. This video is part of the Moz Daily SEO Fix tutorial series: Moz tool tips and tricks in under 2 minutes. To watch all of our videos so far, and to subscribe to future ones, make sure to visit the Daily SEO Fix channel on YouTube.