Does content revealed by a 'show more' button get crawled by Google?
-
I have a div on my website containing around 500 words of unique content. When the page is first visited, the div has a fixed height of 100px, showing a couple of hundred words and fading out to white, with a 'show more' button which, when clicked, increases the height to reveal the full content.
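Roughly, the markup and script work like this (a simplified sketch; the class and id names here are made up):

<style>
  .teaser { max-height: 100px; overflow: hidden; position: relative; }
  /* fade the clipped text out to white */
  .teaser::after {
    content: ""; position: absolute; left: 0; right: 0; bottom: 0;
    height: 40px; background: linear-gradient(rgba(255, 255, 255, 0), #fff);
  }
  .teaser.expanded { max-height: none; }
  .teaser.expanded::after { display: none; }
</style>

<div class="teaser" id="unique-copy">
  ... around 500 words of unique content, all present in the HTML source ...
</div>
<button onclick="document.getElementById('unique-copy').classList.add('expanded')">Show more</button>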
My question is: does Google crawl the content in that div when it renders the page, or disregard it? It's all in the source code.
Or worse, do they consider this cloaking or hidden content?
It is only there to make the site more usable for customers, so I don't want to get penalised for it.
Cheers
-
Neil, the others are right: you should show the full content rather than hiding part of it the way you're doing now. Depending on how much content there is, though, think about why you're hiding it in the first place; you may need to create more pages on your site for that content. Adding the content as new pages could be good for your users, and it will certainly fix your problem.
As for crawling and indexing: if the content is in the page's source code, it will be indexed. Google does know it's hidden, though, because Googlebot, Google's crawler, renders pages with what is essentially a version of Google Chrome.
-
This is one of the few areas where Google has made a pretty clear statement:
"If you think a content is relevant to your users you should always make it clearly visible"
If you think about it, that makes complete sense: when someone searches for something and clicks a result, they expect to see that text. If it's hidden away, they won't consider the result relevant to their search, and that's exactly what Google doesn't want to happen.
That said, I agree that the 500 words of content can still work for long-tail searches. So I would say: keep your important content at the top of the page and reference supplementary content at the bottom or the side, but always try to keep it visible.
You can see Google's stance in Barry Schwartz's latest article on Google discounting tabbed content.
One additional note: it's safe to hide some content in the mobile view of a responsive website to improve the user experience, as long as that content is clearly shown in your desktop version.
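For example, something along these lines (again just a sketch; the class name is hypothetical):

<style>
  /* hidden on small screens only; fully visible on desktop */
  @media (max-width: 767px) {
    .supplementary-copy { display: none; }
  }
</style>
<div class="supplementary-copy">Long-form supporting content ...</div>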
Hope this helps!
-
Google will crawl that content, but it will be heavily devalued. Any content you consider valuable to your visitors should be readily available, preferably above the fold (which, of course, is not always viable). Amazon and REI both handle this well: the content appears to be tabbed (reviews, descriptions, Q&A, etc.), but clicking a "tab" takes you further down the page via anchor links.
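A rough sketch of that pattern (ids and labels made up): the "tabs" are ordinary anchor links pointing further down the same page, so every section stays visible and crawlable.

<nav>
  <a href="#description">Description</a>
  <a href="#reviews">Reviews</a>
  <a href="#qa">Q&amp;A</a>
</nav>
<section id="description">... full product description ...</section>
<section id="reviews">... customer reviews ...</section>
<section id="qa">... questions and answers ...</section>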
Related Questions
-
Google Indexed Site A's Content On Site B, Site C etc
Hi All, I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com), and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't set the default SSL domain on the server, so you could access Site B over https and it would serve any page from Site A. I have since fixed that SSL issue and am now doing a 301 redirect from Sites B, C and D to Site A for anything https, since Sites B, C and D are not using an SSL cert. My question is: how can I get Google to recrawl all of the sites and remove the wrong listings from the index? I have a screenshot attached so you can see the issue more clearly. I have resubmitted my sitemap but I'm not seeing much change in the index for my site. Any help on what I could do would be great. Thanks
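For context, the redirect I put in place looks roughly like this (an Apache sketch, repeated for Sites C and D; directives simplified):

<VirtualHost *:443>
  ServerName www.fastphonerepair.com
  # any request that reaches Site B over https is permanently
  # redirected to the same path on Site A
  Redirect permanent / https://www.ericreynolds.photography/
</VirtualHost>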
Eric
-
Rotating content = Google Penalty?
Hi all. We have an ecommerce site featuring various product sections. A section might have 60 products, each displayed neatly in pages of 10. We recently added functionality so that if a product is out of stock, it automatically drops to the back of the list and an in-stock product moves forward. We're just worried that Google will see the same information repeatedly rotating through the first page of 10 products (the page that ranks) and think we're somehow trying to trick it into believing the content is fresh. Does anyone have thoughts on this? Is it likely to get us penalised? Thank you! Ben
-
What can you do when Google can't decide which of two pages is the better search result?
For one of our primary keywords, Google keeps swapping (about every other week) between returning our home page, which is more transactional, and a deeper, more information-based page. If you look at the analysis in Moz, you get an almost double-helix-like graph of those two pages repeatedly swapping places. So there seems to be some cannibalisation happening that I don't know how to correct. I think part of the problem is that the deeper page should ideally rank for longer-tail searches that contain the single-word keyword as part of a longer phrase. What can be done to prevent this from happening? Can internal links help? I tried adding a link on that term from the deeper page to our home page, but in a knee-jerk reaction I was asked to pull that link before there was really any evidence that it had a positive or negative effect. There are some crazy theories floating around at the moment, but I'm curious what others think: could adding a link from an informational page to a transactional page actually have a negative effect, and what else could be done to help clarify the difference between the two pages for the search engines?
-
URL Parameter Being Improperly Crawled & Indexed by Google
Hi All, We just discovered that Google is indexing a subset of our URLs with our analytics tracking parameter embedded. For the search "dresses" we appear in position 11 (page 2, rank 1) with the following URL: www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop You'll note that "cm_mmc=Email" is appended, which causes our analytics (CoreMetrics) to misattribute this traffic and revenue to Email rather than SEO. A few questions: 1) Why is this happening? This parameter is from an email sent in June 2012, and we don't have an email-specific landing page embedded with it, yet somehow Google found and indexed the page with these tracking parameters. Has anyone else seen something similar?
2) What is the recommended method of "politely" telling Google to index the version without the tracking parameters? Some thoughts on this:
a. Implement a self-referencing canonical on the page.
- This is done, but we have some technical issues with the canonical due to our ecommerce platform (ATG): even though the page source code looks correct, Googlebot is seeing the canonical with a JSession ID appended (see the sketch at the end of this question).
b. Resubmit both URLs via the Fetch feature in WMT, hoping that Google recognizes the canonical.
- We did this, but given the canonical issue it won't be effective until we can fix it.
c. Change the URL parameter handling in WMT.
- We made this change, but it didn't seem to fix the problem.
d. 301 or noindex the version with the email tracking parameters.
- This seems drastic, and I'm concerned we'd lose ranking on this very strategic keyword.
Thoughts? Thanks in advance, Kevin
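For reference, here is roughly what I'd expect the rendered canonical from option (a) to look like once the JSession ID issue is fixed (a sketch only; the session ID has to be stripped in the ATG/JSP layer):

<!-- clean URL: no ;jsessionid=... segment and no cm_mmc tracking parameter -->
<link rel="canonical" href="http://www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp" />
-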
Google Crawl Rate and Cached version - not updated yet :(
Hi, I've noticed that Google is not recognizing/crawling the latest changes to pages on my site: the last update shown on the cached version in Google's results is over 2 months old. So, should I use Fetch as Googlebot to force an update, or remove the page's cached version via the Remove URLs tool in GWT? Thanks, B
-
When Google's WMT shows thousands of links from a single domain... Should they be removed?
Hi, Looking at "links to your site" in Google's WMT, it shows a few sites with thousands of links pointing to mine. In reality there are only 1-2 links pointing to me from a site that Google shows as having 2,000.
I assume that it is simply because they don't have canonical tags. Should I ask for the 2 links to be removed? Thanks
-
Google algo update for over-SEO'd sites: Is this a game changer?
This must be on the forum somewhere already, but I can't find it. Google are updating their algo to penalise over-SEO'd sites. Is this a game changer? http://www.pcpro.co.uk/news/373630/google-to-demote-seo-heavy-sites Cheers
-
NOINDEX content still showing in SERPs after 2 months
I have a website that was likely hit by Panda or some other algorithm change; the hit finally occurred in September of 2011. In December my developer set the following meta tag on all pages that do not have unique content:
<meta name="robots" content="NOINDEX" />
It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com. I am looking for a quicker solution. Adding this many pages to robots.txt does not seem like a sound option. The pages have been removed from the sitemap (for about a month now). I am trying to determine the best of the following options, or find better ones:
1. 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 or so pages all 301ing to one or two URLs. However, I'd get some link juice to that page, right?
2. Issue an HTTP 404 code on all the pages I want out of the index. The 404 seems like the safest bet, but I wonder whether Google suddenly seeing 10,000+ 404 errors would have a negative impact on my site.
3. Issue an HTTP 410 code on all pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, eventually I will bring a small percentage back online as I add fresh new content. This one scares me the most, but I'm interested to hear if anyone has ever used a 410 (a rough sketch of what I mean is below).
Please advise, and thanks for reading.
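For option 3, a sketch of what I'm considering (Apache mod_alias; the /locations/ and /products/ path patterns are made up for illustration):

# return 410 Gone for the retired page types; a rule would be removed
# (or specific paths carved out) when bringing pages back online
RedirectMatch gone ^/locations/
RedirectMatch gone ^/products/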