Hiding content until user scrolls - Will Google penalize me?
-
I've used "opacity: 0;" to hide sections of my content, which are then revealed (using JavaScript) once the user scrolls to those sections.
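For reference, here is a minimal sketch of that pattern. The ".reveal" class name and the fade logic are illustrative, not from any particular library:

```javascript
// Minimal sketch of the pattern described: sections start hidden via CSS,
// e.g.  .reveal { opacity: 0; transition: opacity 0.5s; }
// and a scroll handler fades them in once they enter the viewport.

// Pure helper so the reveal decision is easy to test: an element should be
// revealed once its top edge has scrolled into view.
function shouldReveal(elementTop, scrollY, viewportHeight) {
  return elementTop < scrollY + viewportHeight;
}

// Browser wiring (guarded so the helper above also runs outside a browser):
if (typeof document !== "undefined") {
  window.addEventListener("scroll", () => {
    document.querySelectorAll(".reveal").forEach((el) => {
      const top = el.getBoundingClientRect().top + window.scrollY;
      if (shouldReveal(top, window.scrollY, window.innerHeight)) {
        el.style.opacity = "1";
      }
    });
  });
}
```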
I remember reading a while back that Google essentially ignores content which is hidden from your page (it mentioned they don't index it, so it's close to impossible to rank for it).
Is this still the case?
Thanks,
Sam
-
Hi,
An alternative approach would be to use the AOS library (http://michalsnik.github.io/aos/). It doesn't set visibility: hidden or remove the content; instead, it applies an animation once the element enters the viewport. Do test AOS carefully, though, because it does set opacity to 0, so try it in a development environment and use Fetch as Google in Webmaster Tools to check what Googlebot sees.
If you don't want to use AOS, you can write your own JavaScript to detect when an element is within the viewport and add a CSS class from the animate.css library (https://daneden.github.io/animate.css/) as needed.
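A rough sketch of that DIY approach, assuming the v3-era animate.css convention of pairing "animated" with an effect class; the ".js-animate" selector and "fadeIn" class are illustrative:

```javascript
// DIY viewport detection: when an element scrolls into view, add an
// animate.css class to it instead of relying on a third-party library.

// Pure check so it can be unit-tested: a rect (shaped like the object
// returned by getBoundingClientRect) overlaps the viewport if its top is
// above the viewport's bottom edge and its bottom is below the top edge.
function isInViewport(rect, viewportHeight) {
  return rect.top < viewportHeight && rect.bottom > 0;
}

// Browser wiring (guarded so the helper above also runs outside a browser):
if (typeof document !== "undefined") {
  window.addEventListener("scroll", () => {
    document.querySelectorAll(".js-animate").forEach((el) => {
      const rect = el.getBoundingClientRect();
      if (isInViewport(rect, window.innerHeight)) {
        // animate.css (v3-era) expects both "animated" and an effect class
        el.classList.add("animated", "fadeIn");
      }
    });
  });
}
```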
-
Interesting, fair enough I suppose. It would certainly hold me back from making webpages a lot more visually appealing.
-
Thanks Kane,
Yes, this is a visual feature to appear as the user scrolls.
Would love to hear if there is a better way.
Sam
-
Hey Sam.
Is this for a visual feature, like making the content "appear" as the user scrolls? While Google is doing a great job of reading JS, my concern would be that this looks like cloaking or hidden text if the purpose is misinterpreted.
There may be safer ways to do this depending on what your goal is. Let me know and I can go from there.
-
John Mueller addressed a similar question in a recent Google Webmaster Central office-hours hangout, and he was pretty definitive. The question was about text hidden behind tabs. He says Google sees the hidden content but won't give it as much weight.
Here's the link - https://www.youtube.com/watch?v=zZAY-BwL6rU. The question starts at 6:45.
Google does read JavaScript and CSS, and that's why they send warnings to webmasters if such files are blocked from googlebot.
-
True, but that won't easily tell me whether it's being given less weight.
-
Grab a few unique phrases from the content that isn't shown immediately to the visitor, then search for them in quotes.
Should answer the question fast.
-
Is Google really clever enough to look into my scripts folder and see that the content is actually shown on scroll? Probably not, so I'm guessing, as you've both suggested, it may not be worth it.
I wonder if there's a better way of doing this other than using opacity.
-
This is my understanding too, Laura. It has proven frustratingly difficult to find a definitive answer to this question!
-
Google will probably index it, but it won't be given the same weight as content that's immediately visible.