How to check readability in testing mode
-
Hi,
Is there a way to test whether my content is readable while it is still in testing mode (meaning there obviously won't be a cache)?
Thanks!
-
Well, text on a page is going to be readable by Google unless you hide it in some way. You can check by having a look at the page in a text-only browser such as Lynx.
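A quick way to approximate that text-only view yourself is to strip a page down to what a text browser would render. A minimal Python sketch (the sample HTML is made up; `lynx -dump <url>` at a shell gives a similar dump for a live page):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of script/style tags."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

html = "<html><head><script>var x=1;</script></head><body><h1>Hello</h1><p>World</p></body></html>"
print(visible_text(html))
```

Anything that survives this kind of extraction is plain text in the HTML and therefore readable by a crawler; anything that disappears (text drawn in images, injected by scripts, etc.) is what you need to worry about.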
-
That is a cool tool that tells you about the quality of your website text, but I want to know if specific text on my page is readable by Google.
-
I use this site when testing content:
http://www.readability-score.com/
You do have to copy and paste your text in, but I like it because you can type on the fly and watch the score change.
Andy
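If you want the same kind of score without pasting into a site, the Flesch reading-ease formula that many of these tools build on is easy to approximate. A rough Python sketch (the syllable counter is a crude heuristic, so numbers will drift a little from readability-score.com's):

```python
import re

def count_syllables(word):
    # Rough heuristic: count runs of vowels, dropping a trailing silent 'e'.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    groups = re.findall(r"[aeiouy]+", word)
    return max(1, len(groups))

def flesch_reading_ease(text):
    # Flesch reading ease: 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word).
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    word_count = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (word_count / sentences) - 84.6 * (syllables / word_count)

print(round(flesch_reading_ease("The cat sat on the mat. It was warm."), 1))
```

Higher scores mean easier reading; short sentences of short words score well, which is why the example scores near the top of the scale.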
Related Questions
-
Has anyone tested or knows whether it makes a difference to upload a disavow file to both www. and non-www. versions of your site in GWMT?
Although Google treats both as separate sites, I always assumed that uploading the disavow file to the canonical version of your site would solve the problem. Is this the case, or has anyone seen better results uploading to both versions?
Technical SEO | CustardOnlineMarketing
-
Google Tag Manager - Debug Mode
Hi, I am in the process of implementing Tag Manager across one of the sites that I manage. It is a very large site, so we have taken the decision to implement it on a number of important pages first and then update the rest of the site incrementally. The pages we have chosen to implement GTM on are confirmation pages, as we use DoubleClick Floodlight tags to track a lot of our conversions. However, when I go into debug mode in Tag Manager, I cannot seem to get the screen that lets you know whether the tag is firing or not. Has anyone else had this problem? Could this be because I don't have the GTM code on the homepage of the site? When I implemented Tag Manager on other sites that I managed, I had no such problems. Any help would be greatly appreciated. Thanks
Technical SEO | cbarron
-
Blocking Test Pages Enmasse on Sub-domain
Hello, We have thousands of test pages on a sub-domain of our site. Unfortunately, at some point these pages were visible to search engines and got indexed. Subsequently, we made a change to the robots.txt file for the test sub-domain. Gradually, over a period of a few weeks, the impressions and clicks as reported by Google Webmaster Tools fell off for the test sub-domain. We are not able to implement the noindex tag in the head section of the pages, given the limitations of our CMS. Would blocking Googlebot via the firewall en masse for all the test pages have any negative consequences for the main domain that houses the real live content for our sites (which we would, of course, like to remain in the Google index)? Many thanks
Technical SEO | CeeC-Blogger
-
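For reference, blocking crawling and removing pages from the index are separate mechanisms, and a firewall block behaves much like a robots.txt block: Google can't re-crawl the pages, but already-indexed URLs may linger. Where a CMS can't edit page heads, the `X-Robots-Tag` HTTP response header is another way to serve `noindex` without touching HTML (sketched here for Apache with mod_headers, as a hypothetical setup; adjust for your server). Note that Googlebot has to be allowed to crawl the pages to see the header, so you would lift the robots.txt block while it takes effect:

```apache
# On the test sub-domain only (hypothetical setup):
# serve a noindex directive on every response without touching page HTML.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

You can check the header is being sent with `curl -I http://test.example.com/somepage` and looking for the `X-Robots-Tag` line in the response.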
HELP! Google Business listing removed - check out my SERP free fall!
Hey Moz Community, I need your help. I have (had) a Google local business listing. I made some changes last week to clean it up. Now it's gone and I've had a massive fall in my rankings (see images). Changes I made: I changed the business name from a spammy one ("BYOL Graphic Design & Web Design Courses") to "Bring Your Own Laptop PTY LTD". I added local business schema to my website. I added my telephone, address, etc.; previously it was all an image. I tinkered with a few other things while I was there too, but I can't remember them all. Other notes: Initially I blamed the dramatic drop on Hummingbird, but after doing some research I'm not so sure; I think it's because I was messing around with Google Local. My listing is/was here: https://plus.google.com/111689574762050943425 (though it doesn't work any more). My website is www.bringyourownlaptop.com.au Questions: Do you think the loss of my Google local listing could be the cause of such a big drop? If it is my Google local listing, is there a way to find out what I did wrong? Is there a place to ask for it to be re-checked? Is it better to just start again with a new listing? Do you think it's tied together with a Hummingbird change somehow? What else can I check? What would you do? Thanks for your time. Dan
Technical SEO | danlovesadobe
_The nervous business owner._
-
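For anyone checking their own markup after a change like this, a minimal LocalBusiness JSON-LD block looks like the following. The name and URL are taken from the question above; the telephone and address values are placeholders, not Dan's actual details. It goes inside a `<script type="application/ld+json">` tag in the page head:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Bring Your Own Laptop PTY LTD",
  "url": "http://www.bringyourownlaptop.com.au",
  "telephone": "+61-2-0000-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example St",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "postalCode": "2000",
    "addressCountry": "AU"
  }
}
```

Pasting the page URL into Google's structured data testing tool is a quick way to confirm the markup parses and that the name matches the one on the Google local listing.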
Am I doing SEO test properly?
Hello, I just created a page for researching the impact of social signals on Google rankings (in Italy). The page was not optimized (one internal backlink, no other external/internal links, the keyword repeated 4 or 5 times plus h1 and h2, no alt tags), and only social signals are being stimulated (through votes). The domain is 2 months old and already ranks for a few relevant keywords, but from page 2 down. My question is: am I doing this right? Is this a good way to proceed? And if not, what should I do instead? Thank you for any advice. Eugenio
Technical SEO | socialengaged
-
How to test a geo tagged homepage?
The e-commerce system we have has a geo-targeted homepage feature, so you can set up different homepages based on the user's country IP. But I want to test what the default homepage is if the system cannot get the user's IP; does anyone know of a way to do this? Also, does Google's bot not give an IP for this, or is it always an American IP (even if your site is set to a different country)? Thanks
Technical SEO | PaddyDisplays
-
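Server-side geo-targeting generally follows the pattern sketched below (the function names, IP addresses, and country mappings here are all hypothetical). The interesting part is the fallback branch: whatever a visitor with an unresolvable IP receives is also what a crawler coming from an unmapped location would receive:

```python
# Sketch of typical server-side geo-targeting (hypothetical data and names).
GEO_HOMEPAGES = {"US": "/us/", "AU": "/au/", "GB": "/uk/"}
DEFAULT_HOMEPAGE = "/"

def lookup_country(ip):
    # Stand-in for a real GeoIP lookup; returns None when the IP is unknown.
    known = {"203.0.113.5": "AU", "198.51.100.7": "US"}
    return known.get(ip)

def homepage_for(ip):
    country = lookup_country(ip)
    # Unknown IPs fall through to the default homepage.
    return GEO_HOMEPAGES.get(country, DEFAULT_HOMEPAGE)

print(homepage_for("203.0.113.5"))  # mapped to the AU homepage
print(homepage_for("192.0.2.1"))    # unknown IP falls back to the default
```

To probe a live site, you could also request it with a Googlebot user-agent string (e.g. `curl -A "Googlebot/2.1" http://example.com/`); Googlebot generally crawls from US IP addresses, so the US or default homepage is the likely answer to the second question.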
Testing for duplicate content and title tags
Hi there, I have been getting both Duplicate Page Content and Duplicate Page Title warnings on my crawl diagnostics report for one of my campaigns. I did my research and implemented the preferred domain setting in Webmaster Tools. This did not resolve the crawl diagnostics warnings, and upon further research I discovered the preferred domain would only be noted by Google and not by other bots like Roger. My only issue was that when I ran an SEOmoz crawl test on the same domain, I saw none of the duplicate content or title warnings, yet they still appear on my crawl diagnostics report. I have now implemented a fix in my .htaccess file to 301 redirect to the www domain. I want to check whether it has worked, but since the crawl test did not show the issue last time, I don't think I can rely on that. Can you help please? Thanks, Claire
Technical SEO | SEOvet
-
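For reference, a common non-www to www redirect in .htaccess looks like the following (assuming Apache with mod_rewrite; `example.com` is a placeholder for your own domain):

```apache
RewriteEngine On
# Send any request for the bare domain to the www host with a permanent 301.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

You can verify it without waiting for a crawl: `curl -I http://example.com/somepage` should return a `301 Moved Permanently` status with a `Location:` header pointing at the www URL.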
False Negative Warnings with Crawl Diagnostic Test
Ok... I will try to explain as clearly as possible. This issue concerns close to 5,000 'Warnings' from our most recent SEOmoz Pro crawl diagnostic test. The top three warnings have about 6,000 instances among them: 1. Duplicate Page Title, 2. Duplicate Page Content, 3. 302 (Temporary Redirect).

We understand that duplicate titles and content are "no-nos" and have made it a top priority to avoid duplication on any level. Here is where the issue lies: we are using the Volusion eCommerce solution, and it has a variety of value-add shopping features such as "Email A Friend" and "Email Me When Back In Stock" on each product page. If one of these options is clicked, you are directed to the appropriate page. Each such page has a different URL, with the sole variable being the individual product code. But because this is part of Volusion's built-in functionality, the META title is the same for each page; it is taken from the title of our store homepage. Example below:

Online Beauty Supply Store | Hair Care Products | Nail Care | Flat Irons
http://www.beautystoponline.com/Email_Me_When_Back_In_Stock.asp?ProductCode=AN1PRO7130
Online Beauty Supply Store | Hair Care Products | Nail Care | Flat Irons
http://www.beautystoponline.com/Email_Me_When_Back_In_Stock.asp?ProductCode=BI8BIOSI34

The same goes for the duplicate content warnings: if you click on one of these features, it directs you to a page with pretty much the same content except for a different product. Basically, each page has both a duplicate title and duplicate content. SEOmoz's descriptions are: Duplicate Page Title: "You should use unique titles for your different pages to ensure that they describe each page uniquely and don't compete with each other for keyword relevance." Duplicate Page Content: "Content that is identical (or nearly identical) to content on other pages of your site forces your pages to unnecessarily compete with each other for rankings."

Because I know SEO is not an exact science, the question here is: does Google recognize that, although these pages are duplicates, they are generated by a feature that makes us even more of a legitimate eCommerce site? Or, going by SEOmoz's description, if duplication is bad only because you do not want your pages competing with each other, should I not worry, since I could not care less whether these pages get traffic? Or does it affect my domain authority as a whole?

Then as for a solution: I am still trying to work out with Volusion how we can change the META title of these pages. It's highly unlikely, but we'll see. As for the duplicate content, there is no way to change one of these pages; it's hard-coded. So if the duplication is bad (even though it shouldn't be here), would it be worth disabling these features? I hope not. Wouldn't that defeat the purpose of Google trying to provide the most legitimate, value-add sites to searchers?

As for the 302 (Temporary Redirect) warning: this only appears on our shopping cart pages. As with the "Email A Friend" feature, there is a page for every product. For example:

http://www.beautystoponline.com/ShoppingCart.asp?ProductCode=AN1HOM8040
http://www.beautystoponline.com/ShoppingCart.asp?ProductCode=AN1HOM8050

The description SEOmoz provides is: 302 (Temporary Redirect): "Using a 302 redirect will cause search engine crawlers to treat the redirect as temporary and not pass any link juice (ranking power). We highly recommend that you replace 302 redirects with 301 redirects." So the probable solution: I do have the ability to change to a 301 redirect, but do I want to do this for my shopping cart? Does Google realize the dead end is legitimate? Or does it matter whether link juice is passed through my shopping cart? And again, does it impact my site as a whole?

It would be greatly appreciated if anyone could help me out with this stuff 🙂 Thank you
Technical SEO | anthonyjamesent
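As a quick sanity check outside any crawler, you can group URLs by their titles yourself and see exactly which pages share one. A minimal sketch (the sample data mirrors the duplicate-title URLs in the question above):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by their <title>; keep only titles shared by 2+ URLs."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Sample data mirroring the duplicate-title URLs above.
pages = [
    ("/Email_Me_When_Back_In_Stock.asp?ProductCode=AN1PRO7130",
     "Online Beauty Supply Store | Hair Care Products | Nail Care | Flat Irons"),
    ("/Email_Me_When_Back_In_Stock.asp?ProductCode=BI8BIOSI34",
     "Online Beauty Supply Store | Hair Care Products | Nail Care | Flat Irons"),
    ("/flat-irons", "Flat Irons - Beauty Stop"),
]
for title, urls in find_duplicate_titles(pages).items():
    print(f"{len(urls)} pages share the title: {title}")
```

Feeding this the URL/title pairs from a crawl export reproduces the duplicate-title warning list, which makes it easy to confirm whether a fix (such as per-page titles or a 301) has actually removed the duplicates.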