Search Console - Mobile Usability Errors
-
A site I'm looking at for a client had hundreds of pages flagged with Mobile Usability errors in Search Console.
I found that the theme appends query-string parameters to the URLs of some of its resources (.js/.css) as version strings.
These were then being blocked by a rule in the robots.txt: "Disallow: /*?"
I've removed this rule, and when I inspect the URLs and test the live versions of the pages, they are now reported as mobile friendly.
I then submitted validation requests in Search Console for both of the errors ("Text too small to read" and "Clickable elements too close together").
My problem now is that the validation has completed, but the pages are still being reported as having the errors.
I've double-checked, and they're fine if I inspect them individually.
Does anyone else have experience clearing these issues in Search Console? Any idea what's going on here?
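For reference, Google treats `*` in robots.txt rules as a wildcard, so `Disallow: /*?` blocks any URL containing a query string, including versioned theme assets. A rough sketch of that matching behaviour (a simplification of Google's full parser, with hypothetical paths):

```python
import re

def google_rule_matches(rule_path: str, url_path: str) -> bool:
    """Rough sketch of Google-style robots.txt path matching:
    '*' matches any run of characters, a trailing '$' anchors the end."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The rule that was blocking the theme's versioned assets:
print(google_rule_matches("/*?", "/wp-content/theme/style.css?ver=1.2"))  # True: blocked
print(google_rule_matches("/*?", "/wp-content/theme/style.css"))          # False: crawlable
```

So every .css/.js file fetched with a ?ver= parameter was invisible to Googlebot, which is why the pages rendered as broken on mobile.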
-
Just to follow this up: we're now seeing the mobile usability errors gradually being cleared from pages at approximately 100 pages/day.
It seems to me that the validation request process didn't actually do anything, and we just had to wait for the site to be recrawled?!
-
Thanks for your response, Daniel. The steps you outlined are exactly what I had done, which is why I was surprised that the pages still came back with errors after requesting "Validate Fix"!
I've submitted a validate fix request again so I'll see what happens...
-
I've had the same issue when one of my clients disallowed a directory that contained the CSS and some other scripts.
What you should do is first make 100% sure that you have removed any conflicting line from the robots.txt file. Then go to Google's robots.txt testing tool and check whether it has picked up the updated file; if not, request an update, which is usually processed within 30 minutes. After that, run the Google Mobile-Friendly Test and see whether any issues remain.
If both tests come back clean, request "Validate Fix" in Search Console. From there, it usually takes up to a week for the errors to be cleared from the affected pages.
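The "make sure the conflicting line is gone" step can be sanity-checked with a quick script that scans the robots.txt content for Disallow rules still matching query strings (a simple sketch; fetching the live file is omitted so the example stays self-contained, and the sample rules are hypothetical):

```python
def find_conflicting_rules(robots_txt: str) -> list[str]:
    """Return Disallow lines that would block URLs containing a query
    string (e.g. versioned assets like style.css?ver=1.2)."""
    conflicts = []
    for line in robots_txt.splitlines():
        stripped = line.strip()
        if stripped.lower().startswith("disallow:") and "?" in stripped:
            conflicts.append(stripped)
    return conflicts

sample = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /*?
"""

print(find_conflicting_rules(sample))  # ['Disallow: /*?']
```

An empty list means no rule is left that could block parameterised CSS/JS, and it's safe to move on to the Mobile-Friendly Test and the "Validate Fix" request.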
If it's a small number of URLs, you can also "Test Live URL" within Search Console and then "Request Indexing". As mentioned, if everything goes right, it should take up to one week for the errors to be removed.
Daniel Rika - Dalerio Consulting
https://dalerioconsulting.com/
info@dalerioconsulting.com