Hiding content until user scrolls - Will Google penalize me?
-
I've used: "opacity:0;" to hide sections of my content, which are triggered to show (using Javascript) once the user scrolls over these sections.
I remember reading a while back that Google essentially ignores content which is hidden from your page (it mentioned they don't index it, so it's close to impossible to rank for it).
Is this still the case?
Thanks,
Sam
-
Hi,
An alternative approach would be to use the AOS library (http://michalsnik.github.io/aos/). It doesn't set visibility: hidden or remove the content; instead, it applies the animation once the element comes into the viewport. Do test AOS carefully, though, because it does set the opacity to 0 initially, so try it in a development environment and run Fetch as Google in Webmaster Tools.
If you don't want to use the AOS library, you can write your own JavaScript to detect when an element is within the viewport and add the appropriate CSS class from the animate.css library (https://daneden.github.io/animate.css/) as needed.
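If you do roll your own, here's a rough sketch of the idea (assuming animate.css is already loaded; the selector and animation classes are just examples, and the exact class names depend on your animate.css version):

```js
// Add the animate.css classes only when an element first enters the viewport
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      entry.target.classList.add('animated', 'fadeInUp'); // animate.css classes
      observer.unobserve(entry.target); // animate each element only once
    }
  });
}, { threshold: 0.1 });

document.querySelectorAll('.js-animate-on-scroll').forEach(function (el) {
  observer.observe(el);
});
```

The upside of this pattern is that the content stays in the normal document flow and the animation classes are purely additive, so nothing is hidden before the script runs.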
-
Interesting, fair enough I suppose. It would certainly hold me back from using the effect, which makes for far less visually appealing webpages.
-
Thanks Kane,
Yes, this is a visual feature to appear as the user scrolls.
Would love to hear if there is a better way.
Sam
-
Hey Sam.
Is this for a visual feature, like making the content "appear" as the user scrolls? While Google is doing a great job of reading JS, my concern would be that this looks like cloaking or hidden text if the purpose is misinterpreted.
There may be safer ways to do this depending on what your goal is. Let me know and I can go from there.
-
John Mueller addressed a similar question in a recent Google Webmaster Central office-hours hangout, and he was pretty definitive. The question was about text that's hidden behind tabs. He said that they see the hidden content but won't give it as much weight.
Here's the link - https://www.youtube.com/watch?v=zZAY-BwL6rU. The question starts at 6:45.
Google does read JavaScript and CSS, which is why they send warnings to webmasters when such files are blocked from Googlebot.
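For example, robots.txt rules like these (the paths are hypothetical) would block Googlebot from the files it needs to render the page the way a visitor sees it, and are exactly the sort of thing that triggers those warnings:

```
User-agent: Googlebot
Disallow: /scripts/
Disallow: /css/
```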
-
True, but that won't easily tell me whether it's being given less weight.
-
Grab a few unique phrases from the content that isn't shown immediately to the visitor, then search for them in quotes.
That should answer the question fast.
-
Is Google really clever enough to look into my scripts folder and see that the content is actually shown on scroll? Probably not, so I'm guessing, as you've both suggested, it may not be worth it.
I wonder if there's a better way of doing this other than using opacity.
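One idea I'm toying with: leave the content fully visible in the HTML/CSS and only add the hiding class from JavaScript just before wiring up the scroll reveal, so anything without JS (crawlers included) simply gets the normal page. Roughly (class names are placeholders again):

```js
// Content is visible by default; hide it only once JS is actually running,
// then reveal it again on scroll (same approach as the earlier sketch).
document.querySelectorAll('.reveal-section').forEach(function (el) {
  el.classList.add('is-hidden'); // .is-hidden { opacity: 0; } lives in the CSS but is only ever applied by JS
});
```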
-
This is my understanding too, Laura. It has proven frustratingly difficult to find a definitive answer to this question!
-
Google will probably index it, but it won't be given the same weight as content that's immediately visible.
Related Questions
-
No meta description pulling through in SERP with React website - Requesting indexing & submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render and index for "this url and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are: 1) If Google didn't reindex anything when I submitted the sitemap, what might be wrong with my sitemap? 2) Is submitting each URL manually bad, and if so, why? 3) Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs? 4) Any other suggestions?
Web Design | | DigitalMarketingSEO1 -
Duplicate Content Issue: Mobile vs. Desktop View
Setting aside my personal issue with Google's favoritism for Responsive websites, which I believe doesn't always provide the best user experience, I have a question regarding duplicate content... I created a section of a Wordpress web page (using Visual Composer) that shows differently on mobile than it does on desktop view. This section has the same content for both views, but is formatted differently to give a better user experience on mobile devices. I did this by creating two different text elements, formatted differently, but containing the same content. The problem is that both sections appear in the source code of the page. According to Google, does that mean I have duplicate content on this page?
Web Design | | Dino640 -
Duplicate content on websites for multiple countries
I have a client who has a website for their U.S. based customers. They are currently adding a Canadian dealer and would like a second website with much of the same info as their current website, but with Canadian contact info etc. What is the best way to do this without creating duplicate content that will get us penalized? If we create a website at ABCcompany.com and ABCCompany.ca or something like that, will that get us around the duplicate content penalty?
Web Design | | InvoqMarketing0 -
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused, but perhaps not duplicates? It also has a 5,000 page limit and our site is 8,000+ pages...so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | | danatanseo0 -
Accordion Fold Ups Bad For Google
http://fandicoach.com/products - Right now I have these accordion things on the website. Are they bad for Google in terms of SEO best practice? I want to avoid doing anything black hat. Thanks!
Web Design | | OOMDODigital0 -
Does Google penalize duplicate website design?
Hello, We are very close to launching five new websites, all in the same business sector. Because we would like to keep our brand intact, we are looking to use the same design on all five websites. My question is, will Google penalize the sites if they have the same design? Thank you! Best regards, Tiberiu
Web Design | | Tiberiu0 -
Infinite Scrolling vs. Pagination on an eCommerce Site
My company is looking at replacing our ecommerce site's paginated browsing with a Javascript infinite scroll function for when customers view internal search results--and possibly when they browse product categories also. Because our internal linking structure isn't very robust, I'm concerned that removing the pagination will make it harder to get the individual product pages to rank in the SERPs. We have over 5,000 products, and most of them are internally linked to from the browsing results pages in the category structure: e.g. Blue Widgets, Widgets Under $250, etc. I'm not too worried about removing pagination from the internal search results pages, but I'm concerned that doing the same for these category pages will result in de-linking the thousands of product pages that show up later in the browsing results and therefore won't be crawlable as internal links by the Googlebot. Does anyone have any ideas on what to do here? I'm already arguing against the infinite scroll, but we're a fairly design-driven company and any ammunition or alternatives would really help. For example, would serving a different page to the Googlebot in this case be a dangerous form of cloaking? (If the only difference is the presence of the pagination links.) Or is there any way to make rel=next and rel=prev tags work with infinite scrolling?
Web Design | | DownPour0 -
IP block in Google
Our office has a number of people performing analysis and research on keyword positions, volume, competition, etc. We have one external static IP address. We installed the static IP so we can filter out our visits in Google Analytics. However, by 10 AM we get impossible CAPTCHAs or even get blocked in Google. Do you have any experience with such an issue? Any solutions you can recommend? Any help would be appreciated!
Web Design | | Partouter0