Hiding Text in an SEO friendly way - is it possible?
-
Hello,
I have a client who has very little (practically no) text content on his ecommerce website, on the home page and category/sub-category pages. We have drafted some text for him, but the designer has pushed back as he feels it will break the design.
Our proposed solution is to keep some of the text visible, with the rest hidden until the user clicks "Read More".
We are planning to follow these recommendations: http://www.shimonsandler.com/collapsible-div-seo-friendly/
We are not hiding text for the sake of it, but to improve the UX. We of course want the text to remain accessible, i.e. readable by screen readers.
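For illustration, here is a rough sketch of the kind of pattern we have in mind (my own mock-up, not code from the linked article; the IDs and copy are placeholders). The full text is served in the initial HTML, and the button only toggles its visibility, so crawlers and screen readers can always reach it:

```html
<p>Short visible intro about the category.</p>

<!-- The "hidden" copy is still in the DOM; only its display is toggled. -->
<div id="more-copy" hidden>
  <p>The rest of the category text, present in the HTML on page load.</p>
</div>

<button type="button" id="more-toggle"
        aria-expanded="false" aria-controls="more-copy">Read More</button>

<script>
  var btn = document.getElementById('more-toggle');
  btn.addEventListener('click', function () {
    var open = btn.getAttribute('aria-expanded') === 'true';
    document.getElementById('more-copy').hidden = open; // re-hide if open
    btn.setAttribute('aria-expanded', String(!open));
    btn.textContent = open ? 'Read More' : 'Read Less';
  });
</script>
```

Because the text is delivered with the page rather than fetched on click, it is there for indexing, and the aria-expanded/aria-controls attributes expose the collapsed state to assistive technology.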
Does anyone have any experience or opinions on taking this course of action, and is there anything we should make sure we do (or avoid) to stay on the right side of the big G?
Kind Regs,
Rich
-
This is gold dust - THANK YOU!!
-
Thanks Michael.
I agree completely. We are just trying to find a way to tick both boxes, UX and SEO, which are of course intricately connected. An SEO-friendly text-reveal function therefore seems like a good strategy all round. We are certainly not trying to hide text from users and include it solely for search engines; I am just keen that we do it in a way that is accessible and not in breach of Google's guidelines.
I usually push my opinion through and make sure there is text on the page, even if it looks ugly in a designer's opinion. Ultimately, a site without traffic is not worth a whole lot, even if it looks amazing!
RB
-
Michael,
These are very clear steps that could be applied by many people in various situations.
You are a great leader!
Nice work!
E
-
Hi Rich,
You are not really hiding anything here. Hidden text in the problematic sense is something else entirely, such as matching the color of the text to that of the background. You are just trying to create a better UX by having a Read More button that reveals the content. The content is very much there on the same page, and your intention is clear. Believe me, my friend, Google has mastered the art of working out a webmaster's intentions by looking at the page, so you should not have to worry about anything in this case.
Regards,
Devanur.
-
Clear and direct: the solution is to change the designer.
-
Just to add to this.
A designer's job is to design around the content, and to make what they are creating successful.
I would start with informing the designer of the intended goals of the site. Then have a discussion around how they feel the current design they have created is accomplishing that.
If the design falls short of those goals, then a discussion can take place on how strategy, content, and design can come together.
The key is to help your designer understand this and lead the team to success.
If none of that works, talk to the owner and pull rank on the designer. Clients speak and think in terms of results - so make your case.
All you can do is provide thought leadership, fight for what you believe in, and not get pushed around or marginalized over common-sense recommendations.
If no one wants to listen, you've just found a client not worth working for.
(But remember, it is your thought leadership and sensitivity to everyone's role that makes or breaks it, whether it be the owner, designer, developer, etc.)
Good luck!
-
:):) well said.
-
We have drafted some text for him - but the designer has fought back against this as he feels it will break the design.
I would not be able to have this person as a designer for one of my sites.
This person is not "on board" and I don't have time to pull his teeth.
Nuf said.
-
Hi Rich,
I think this one act of hiding text on its own will not get you in trouble; however, if you combine it with other black-hat techniques, or your site exhibits spammy behavior, then you're definitely in trouble. If all the content can be accessed in a text-only browser, you should be OK. I would still try to educate the client on having a small block of introductory text on the product and category pages, which would also help with conversions.
Here's the official link on hidden text from Google.
Jill Whalen's forum addresses this question here
Here's another link on this topic
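As a quick sanity check (my own rough sketch, not an official tool), you can approximate the text-only-browser test by stripping tags from the served HTML and confirming the collapsed copy is still present as plain text; the markup below is a hypothetical page fragment:

```javascript
// Crude approximation of what a text-only browser sees:
// drop script/style blocks, then strip all remaining tags.
function visibleText(html) {
  return html
    .replace(/<(script|style)\b[\s\S]*?<\/\1>/gi, " ")
    .replace(/<[^>]+>/g, " ");
}

// Hypothetical fragment with a collapsed "Read More" section.
const page = `
<p>Short visible intro.</p>
<div id="more-copy" hidden><p>The longer hidden copy.</p></div>
<button aria-expanded="false">Read More</button>
`;

// The "hidden" copy is still reachable as plain text.
console.log(visibleText(page).includes("The longer hidden copy.")); // true
```

If the reveal instead fetched the text via Ajax on click, this check would fail, which is exactly the situation you want to avoid.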