Hiding content until user scrolls - Will Google penalize me?
-
I've used "opacity: 0;" to hide sections of my content, which are then shown (using JavaScript) once the user scrolls to those sections.
I remember reading a while back that Google essentially ignores content that's hidden on your page (the article said they don't index it, so it's close to impossible to rank for it).
Is this still the case?
Thanks,
Sam
-
Hi,
An alternative approach would be to use the AOS library (http://michalsnik.github.io/aos/). Rather than setting visibility: hidden to hide the content, it applies an animation once the element enters the viewport. Test AOS carefully, though, because it does still set the opacity to 0; try it in a development environment and run Fetch as Google in Webmaster Tools.
If you don't want to use the AOS library, you can write your own JavaScript (JS) to detect when an element is within the viewport and add a CSS class from the animate.css library (https://daneden.github.io/animate.css/) as needed.
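The viewport check itself is only a few lines. This is a rough sketch, not production code; the .js-animate hook class and the animate.css class names (animated, fadeInUp) are just illustrative choices:

```javascript
// Pure helper: is a bounding box at least partly inside a viewport of the
// given height? Kept separate from the DOM wiring so it is easy to test.
function isInViewport(rect, viewportHeight) {
  return rect.top < viewportHeight && rect.bottom > 0;
}

// Browser wiring (runs only where window/document exist):
if (typeof window !== 'undefined') {
  window.addEventListener('scroll', function () {
    document.querySelectorAll('.js-animate').forEach(function (el) {
      if (isInViewport(el.getBoundingClientRect(), window.innerHeight)) {
        // animate.css convention: adding its classes triggers the effect
        el.classList.add('animated', 'fadeInUp');
      }
    });
  });
}
```

A real version would also debounce the scroll handler and stop re-checking elements that have already animated.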
-
Interesting, fair enough I suppose. It would certainly hold me back, though, and make my webpages a lot less visually appealing.
-
Thanks Kane,
Yes, this is a visual feature to appear as the user scrolls.
Would love to hear if there is a better way.
Sam
-
Hey Sam.
Is this for a visual feature, like making the content "appear" as the user scrolls? While Google is doing a great job of reading JS, my concern would be that this looks like cloaking or hidden text if the purpose is misinterpreted.
There may be safer ways to do this depending on what your goal is. Let me know and I can go from there.
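One commonly suggested safer pattern is to leave the content visible by default and only hide it once JavaScript is actually running, so crawlers and no-JS visitors always get the visible version. A minimal sketch, with hypothetical class names (fade-section, pre-reveal, reveal) that would need matching CSS rules:

```javascript
// Hypothetical CSS this sketch assumes:
//   .pre-reveal { opacity: 0; }
//   .reveal     { opacity: 1; transition: opacity 0.4s; }
// Class-list helpers are kept pure so the approach is easy to test.
function hideForAnimation(classes) {
  // Called only after JS has loaded, so no-JS visitors never see it hidden.
  return classes.concat('pre-reveal');
}
function revealOnScroll(classes) {
  return classes
    .filter(function (c) { return c !== 'pre-reveal'; })
    .concat('reveal');
}

// Browser wiring (sketch): hide at load, then reveal per element on scroll.
if (typeof document !== 'undefined') {
  document.querySelectorAll('.fade-section').forEach(function (el) {
    el.className = hideForAnimation(el.className.split(' ')).join(' ');
  });
  // ...and in a scroll handler, swap classes back with revealOnScroll().
}
```

The key point is the ordering: without JavaScript, nothing is ever hidden, so what the crawler sees matches what a worst-case user sees.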
-
John Mueller addressed a similar question in a recent Google Webmaster Central office-hours hangout, and he was pretty definitive. The question was about text that's hidden behind tabs. He states that they see the hidden content but won't give it as much weight.
Here's the link - https://www.youtube.com/watch?v=zZAY-BwL6rU. The question starts at 6:45.
Google does read JavaScript and CSS, and that's why they send warnings to webmasters if such files are blocked from Googlebot.
-
True, but that won't easily tell me whether it's being given less weight.
-
Grab a few unique phrases from the content that isn't shown immediately to the visitor, then search for them in quotes.
That should answer the question fast.
-
Is Google really clever enough to look into my scripts folder and see that the content is actually shown on scroll? Probably not, so I'm guessing, as you've both suggested, it may not be worth the risk.
I wonder if there's a better way of doing this other than using opacity.
-
This is my understanding too, Laura. It has proven frustratingly difficult to find a definitive answer to this question!
-
Google will probably index it, but it won't be given the same weight as content that's immediately visible.