Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Hiding content until user scrolls - Will Google penalize me?
-
I've used "opacity:0;" to hide sections of my content, which are then triggered to show (using JavaScript) once the user scrolls to those sections.
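Roughly, the setup looks something like this (a simplified sketch with placeholder class names):

```html
<!-- Simplified sketch of the setup described above; class names are placeholders -->
<style>
  .reveal { opacity: 0; transition: opacity 0.6s ease; }
  .reveal.is-visible { opacity: 1; }
</style>

<section class="reveal">This content stays invisible until the user scrolls to it.</section>

<script>
  // On every scroll, show any .reveal section that has entered the viewport
  var sections = document.querySelectorAll('.reveal');
  window.addEventListener('scroll', function () {
    sections.forEach(function (el) {
      if (el.getBoundingClientRect().top < window.innerHeight) {
        el.classList.add('is-visible');
      }
    });
  });
</script>
```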
I remember reading a while back that Google essentially ignores content that is hidden on your page (the article said they don't index it, so it's close to impossible to rank for it).
Is this still the case?
Thanks,
Sam
-
Hi,
An alternative approach would be to use the AOS library (http://michalsnik.github.io/aos/). Rather than relying on visibility: hidden, it waits until an element enters the viewport and then applies the animation. Do test it carefully, though, because it still sets opacity to 0 until that point, so try it in a development environment and use Fetch as Google in Webmaster Tools to check what Googlebot sees.
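A minimal sketch of how the AOS wiring typically looks (the file paths below are placeholders, so check the AOS docs for the exact includes):

```html
<!-- Rough sketch of the AOS approach; file paths are placeholders -->
<link rel="stylesheet" href="aos.css">

<section data-aos="fade-up">This block animates in when it enters the viewport.</section>

<script src="aos.js"></script>
<script>
  // AOS watches elements with a data-aos attribute and toggles its
  // animation classes as they scroll into view
  AOS.init();
</script>
```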
If you don't want to use the AOS library, you can write your own JavaScript (JS) to detect when an element is within the viewport and add the appropriate CSS class from the animate.css library (https://daneden.github.io/animate.css/) as needed.
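If you go the hand-rolled route, something along these lines should work as a starting point (a rough sketch using IntersectionObserver; note that newer animate.css releases prefix the class names, e.g. animate__animated animate__fadeInUp):

```html
<!-- Rough sketch of a hand-rolled version using IntersectionObserver + animate.css.
     "animated" / "fadeInUp" are the class names from older animate.css releases. -->
<script>
  var observer = new IntersectionObserver(function (entries, obs) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        entry.target.classList.add('animated', 'fadeInUp');
        obs.unobserve(entry.target); // animate each element only once
      }
    });
  });

  document.querySelectorAll('.animate-on-scroll').forEach(function (el) {
    observer.observe(el);
  });
</script>
```

Because the element isn't hidden up front, the content is there in the page by default and the animation is purely an enhancement.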
-
Interesting, fair enough I suppose. It would certainly hold me back, though, since avoiding this would make my webpages a lot less visually appealing.
-
Thanks Kane,
Yes, this is a visual feature to make content appear as the user scrolls.
Would love to hear if there is a better way.
Sam
-
Hey Sam.
Is this for a visual feature, like making the content "appear" as the user scrolls? While Google is doing a great job of reading JS, my concern would be that this looks like cloaking or hidden text if the purpose is misinterpreted.
There may be safer ways to do this depending on what your goal is. Let me know and I can go from there.
-
John Mueller addressed a similar question in a recent Google Webmaster Central office-hours hangout, and he was pretty definitive. The question was about text that's hidden behind tabs. He states that they see the hidden content but won't give it as much weight.
Here's the link - https://www.youtube.com/watch?v=zZAY-BwL6rU. The question starts at 6:45.
Google does read JavaScript and CSS, and that's why they send warnings to webmasters if such files are blocked from Googlebot.
-
True, but that won't easily tell me whether it's being given less weight.
-
Grab a few unique phrases from the content that isn't shown to the visitor immediately, then search for them in quotes.
That should answer the question quickly.
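For example, take a sentence that only appears after scrolling and search Google for "that exact sentence" site:yourdomain.com (both the phrase and the domain being placeholders); if the page comes back, the text is indexed.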
-
Is Google really clever enough to look into my scripts folder and see that the content is actually shown on scroll? Probably not, so I'm guessing, as you've both suggested, it may not be worth the risk.
I wonder if there's a better way of doing this other than using opacity.
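One rough idea (just an untested sketch, not something anyone has confirmed here): keep the content visible by default and only hide it when JavaScript is actually running, so anyone (or any crawler) without JS still sees everything:

```html
<!-- Rough sketch: sections are only ever hidden on pages where the script below has run -->
<script>
  // Run as early as possible (e.g. in <head>)
  document.documentElement.classList.add('js');
</script>
<style>
  /* Without the .js class (no JavaScript), the content stays fully visible */
  .js .reveal { opacity: 0; transition: opacity 0.6s ease; }
  .js .reveal.is-visible { opacity: 1; }
</style>
```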
-
This is my understanding too, Laura. It has proven frustratingly difficult to find a definitive answer to this question!
-
Google will probably index it, but it won't be given the same weight as content that's immediately visible.
Related Questions
-
Can I safely assume that links between subsites on a subdirectories-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based in subdirectories (of the mainsite.com/site1 kind) where the main site is like a company site, and subsites are focused on brands or projects of that company. There will be links back and forth from the main site and the subsites, as if subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all). Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but as the URL structure is the same as for a single site, you should assume that for SEO purposes the network will be treated as one site." This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because then, if Google establishes that my multisite structure is actually a collection of different sites, links between subsites and mainsite would be considered backlinks between my own sites, which could therefore be considered a link wheel, that is, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site? P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that if I use the subdirectories-based multisite feature I will be able to map a TLD domain to any of my brands (subsites) whenever I'd choose to give that brand a more distinct profile, as if it really was a different website.
Web Design | PabloCulebras
-
Do things like using labels on an element that is not a form input affect how Google sees us in regards to accessibility?
Do things like using labels on an element that is not a form input affect how Google sees us? It's an accessibility error that our devs have made: using a label element because it looks good, not because it's an actual label on a form field. Just wondering how that affects accessibility in Google's eyes.
Web Design | GregLB
-
I am using <noscript> on every webpage and Google does not crawl my site automatically. Any solution?
Web Design | ahtisham2018
The <noscript> block on each page contains: <noscript><meta http-equiv="refresh" content="0;url=errorPages/content-blocked.jsp?reason=js"></noscript> Please tell me whether this affects SEO or not.
-
Is it against Google's guidelines to use third-party review sites as well as have reviews on my site marked up with schema?
So, I look after a site for my family business. We have teamed up with the third-party site TrustPilot because we like the way it enables us to send out reviews to our customers directly from our system. It's been going great and some of the reviews have been brilliant. I have used a couple of these reviews on our site and marked them up with: REVIEW CONTENT We work in the service industry, and one of the problems we have found is getting our customers to actually go online and leave a review. They normally just leave their comments on a job sheet that the workers have signed when they leave. So I have created a page on our site where we post some of the reviews the guys receive too. I have used the following: REVIEW TITLE REVIEW Written by: CUSTOMER NAME Type of Service: House Removal Date published: DATE PUBLISHED 10 / 10 stars I was just wondering: I was told that this could be against Google's guidelines, and as I've seen a bit of a drop in our rankings in the last week or so, I'm a little concerned. Is this getting me penalised? Should I not use my reviews referencing the ones on TrustPilot, and should I not have my own reviews page with rich snippets?
Web Design | BearPaw88
-
Will SASS ruin my SEO?
Hello, I am thinking about using SASS for my website, stripping the current CSS style sheets and translating it all to SASS. Will this hurt my SEO?
Web Design | DanielBernhardt
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | danatanseo
-
Is it cloaking/hiding text if textual content is no longer accessible for mobile visitors on responsive webpages?
My company is implementing a responsive design for our website to better serve our mobile customers. However, when I reviewed the wireframes of the work our development company is doing, it became clear to me that, for many of our pages, large parts of the textual content on the page, and most of our sidebar links, would no longer be accessible to a visitor using a mobile device. The content will still be indexable, but hidden from users using media queries. There would be no access point for a user to view much of the content on the page that's making it rank. This is not my understanding of best practices around responsive design. My interpretation of Google's guidelines on responsive design is that all of the content is served to both users and search engines, but displayed in a more accessible way to a user depending on their mobile device. For example, Wikipedia pages have introductory content, but hide most of the detailed info in tabs. All of the information is still there and accessible to a user...but you don't have to scroll through as much to get to what you want. To me, what our development company is proposing fits the definition of cloaking and/or hiding text and links - we'd be making available different content to search engines than users, and it seems to me that there's considerable risk to their interpretation of responsive design. I'm wondering what other people in the Moz community think about this - and whether anyone out there has any experience to share about inaccessible content on responsive webpages, and the SEO impact of this. Thank you!
Web Design | mmewdell
-
Infinite Scrolling vs. Pagination on an eCommerce Site
My company is looking at replacing our ecommerce site's paginated browsing with a Javascript infinite scroll function for when customers view internal search results--and possibly when they browse product categories also. Because our internal linking structure isn't very robust, I'm concerned that removing the pagination will make it harder to get the individual product pages to rank in the SERPs. We have over 5,000 products, and most of them are internally linked to from the browsing results pages in the category structure: e.g. Blue Widgets, Widgets Under $250, etc. I'm not too worried about removing pagination from the internal search results pages, but I'm concerned that doing the same for these category pages will result in de-linking the thousands of product pages that show up later in the browsing results and therefore won't be crawlable as internal links by the Googlebot. Does anyone have any ideas on what to do here? I'm already arguing against the infinite scroll, but we're a fairly design-driven company and any ammunition or alternatives would really help. For example, would serving a different page to the Googlebot in this case be a dangerous form of cloaking? (If the only difference is the presence of the pagination links.) Or is there any way to make rel=next and rel=prev tags work with infinite scrolling?
Web Design | DownPour