Using a Lot of "Read More" Hidden Text
-
My site has a LOT of "read more" links, and when a user clicks one they see a lot of text. The "read more" link is dark blue, bold, and clear to the user. It is perfect for the user experience, since right below I have pictures and videos, which is what most users want.
Question: I expect few users will click "Read more" (though some users will appreciate the chance to read and learn more), and I wonder: may search engines think I am hiding text, making this a risky approach, or will they simply discount the text as having zero value from an SEO perspective?
Or, equally important: if the text were NOT hidden behind a "Read more," would it actually carry more SEO value than if it is hidden, even though users will NOT read it either way? If yes, the reason may be this: when the text is not hidden, search engines cannot see that users are not reading it, so it carries more weight from an SEO perspective than on pages where the text sits under a "Read more" that users rarely click.
-
Hi khi5,
I analyzed your page and you are doing just fine: you are using CSS display: none, and you are not doing any cloaking.
You are doing the right thing:
1. not fooling Google
2. not fooling the user
3. giving the user a better user experience
Don't worry, you are not applying any "black hat" technique. You will not get penalized.
-
Thanks, Anirban. I am not a programmer, so would you be able to tell me if this approach seems right: http://www.honoluluhi5.com/oahu/honolulu-condos/ - I don't know whether it uses CSS display: none.
I can't think of a better layout for that page, and hiding text the way I have done it is ideal for users. If I showed more text, surely the bounce rate would go up!
-
A technique used by many huge sites is to pre-load code, navigation, or content in the background so that it can be dynamically displayed as needed. The most common way of accomplishing this is the CSS display: none property.
Unfortunately, display: none can also be used simply to hide text, and this is where the perceived problem comes in. People worry that using display: none to hide content (and reveal it when the user asks for it), or for content that is really meant for screen readers, can lead them into trouble. The legitimate use of this technique is so prevalent that I would rarely expect search engines to penalize a site for using the display: none property. It is simply very difficult to implement an algorithm that could reliably ferret out whether a particular use of display: none is meant to deceive the search engines or not.
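As an aside, the screen-reader case is usually handled with a "visually hidden" CSS pattern rather than display: none itself, because display: none also removes content from most screen readers. A common sketch of such a class (the class name is illustrative, not taken from any particular site):

```css
/* Keeps text available to screen readers while hiding it visually.
   display: none would remove it from the accessibility tree as well. */
.visually-hidden {
  position: absolute;
  width: 1px;
  height: 1px;
  margin: -1px;
  padding: 0;
  overflow: hidden;
  clip: rect(0, 0, 0, 0);
  white-space: nowrap;
  border: 0;
}
```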
I usually use this tactic to make the page more user friendly, and it is useful for the user too. Users don't get bombarded by a large piece of content, I am not fooling the user or Google, and I am giving users the option to read more if they want to.
"display: none"
What it does :- the functionality is same - when user clicks "read more" it opens and when user click "less" it closes.
How it defeats the "cloaking" idea:- When google crawls your page where the full content is there (text based browser, not java enabled) and when user sees the page there is a "read more" link and by clicking it it shows the full content. So you are not showing two different things to google & user. it solves the problem.there shouldn't be a a cloaking problem. Its tested.
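To make this concrete, here is a minimal sketch of the kind of "read more" toggle described above, in plain HTML, CSS, and JavaScript. The IDs and wording are hypothetical, not taken from the site under discussion:

```html
<!-- The full text is always present in the HTML that both crawlers
     and users download; the click only changes its visual state. -->
<p>Short intro paragraph that every visitor sees...</p>

<div id="more-text" style="display: none;">
  <p>The longer, detailed text that opens on demand...</p>
</div>

<a href="#" id="read-more-link">Read more</a>

<script>
  var link = document.getElementById("read-more-link");
  var more = document.getElementById("more-text");

  link.addEventListener("click", function (e) {
    e.preventDefault();
    var hidden = more.style.display === "none";
    // Toggle between the collapsed and expanded states.
    more.style.display = hidden ? "block" : "none";
    link.textContent = hidden ? "Less" : "Read more";
  });
</script>
```

Because the same markup is served to everyone, a text-based or non-JavaScript crawler simply sees the full content, exactly as described above.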
Hope this helps...
Also refer to: http://moz.com/community/q/would-using-display-none-to-hide-a-section-of-text-effect-seo-negatively
-
Wow -- thanks for the links! I learn something new every day.
I'll defer to others on your specific question since I haven't ever worked with sites that specifically do what you do. I hope someone will give you a good answer!
-
http://searchengineland.com/googles-matt-cutts-on-hidden-text-using-expandable-sections-youll-be-in-good-shape-167753 - this is another, more relevant Matt Cutts piece, which again says it is OK to use these "read more" sections.
Again, my bigger concern is: even if it is OK, am I safer off showing all the text if possible?
-
Thanks, Sam. Here is a video from Matt Cutts: https://www.youtube.com/watch?v=UpK1VGJN4XY - it appears Google is OK with hidden text that makes sense for the user.
For my site I have a lot of "read more" sections, like here:
http://www.honoluluhi5.com/oahu-condos/
http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/
As you can see from those two links, I have created them with only the user in mind and nothing else. To play it safe, maybe I should just show all the text somehow, even though it compromises the user experience.
-
The answer to your question lies in another question: do search engines see one thing while users see another? If the answer is "yes," then you are using "cloaking," a very bad black-hat SEO technique that can get you penalized and possibly banned.
Users don't see the text unless they click "read more," but search engines see it either way? That's cloaking. I'd stop doing this right away.
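For context, the classic form of cloaking is server-side: the page returns different markup depending on who requests it, typically by sniffing the user agent. A hypothetical sketch of that anti-pattern (Node.js/Express, purely illustrative of what not to do, not any real site's code):

```javascript
// WARNING: deliberate anti-pattern, shown only to illustrate cloaking.
// Serving crawlers different content than users can get a site
// penalized or banned.
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  const ua = (req.get('User-Agent') || '').toLowerCase();
  if (ua.includes('googlebot')) {
    // The crawler receives keyword-stuffed text no human visitor sees.
    res.send('<p>Keyword-rich text served only to the bot...</p>');
  } else {
    // Human visitors receive an entirely different page.
    res.send('<p>A different page for humans.</p>');
  }
});

app.listen(3000);
```

A display: none "read more," by contrast, sends identical HTML to every visitor and only changes what is visually expanded, which is the distinction the earlier answers in this thread draw.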