Hiding body copy with a 'read more' drop-down option
-
Hi
I just want to confirm: how potentially damaging is using JavaScript to hide lots of on-page body copy behind a 'read more' button?
As per other Moz Q&A threads, I was told it's best not to use JavaScript to do this and instead that "if you accomplish this with CSS and collapsible/expandable <DIV> tags it's totally fine", so that's what I advised my client's dev.
However, I recently noticed a big drop in rankings approximately one week after the dev changed the body copy format (hiding a lot of it behind a 'read more' button), so I asked them to confirm how they implemented it, and they said: "done in javascript but on page load the text is defaulting to show" (which is contrary to my instructions).
So how likely is it that this is causing problems, given that it coincides with the ranking drop? Or, if the text defaults to show, should it be OK and not cause problems?
And should I request that they redo it as originally instructed (CSS and collapsible divs) ASAP?
All Best
Dan
-
-
Hey Mick, it makes good sense to do it that way, so yes, crazy if that has changed!
My client's scenario is different in that the client wanted three quarters of an entire page of body copy (all well written and good quality) hidden behind a 'read more' button. Whilst I'm sure this will always be seen/crawled and indexed (although possibly not, given some of the recent comments), I think given Mueller's hangout response there's a very good chance the hidden text will be seriously devalued.
Do you think it's advisable for me to recommend the client re-show all the body copy? I'm thinking so.
All Best
Dan
-
I've just had fresh content crawled and indexed that is in this scenario. Basically we are saying to the visitor "if you really want to know some more boring technical information then expand this, but we don't want to spoil your experience by vomiting all the data at you at once". Crazy if that is changed.
-
Agreed, very worrying indeed!
Let me know any findings after the next crawl here and I'll do the same.
-
This is pretty disturbing news actually, and it doesn't make any sense to me. If Google wants to promote pages with more and better-quality content above the fold, but also clean pages that users like, then 'read more' buttons were the only functionality that married both concepts.
At the moment all my pages are still fully indexed, but if I see this change come to life I will have to rethink the content and layout of many pages...
-
Hi
For your info and others on this thread, I have just seen this on Search Engine Roundtable: https://www.seroundtable.com/google-index-click-to-expand-19449.html
And in the comments I saw this hangout with John Mueller referenced, where he says they discount non-displayed text (approx. 11 mins in): https://plus.google.com/events/cjcubhctfdmckph433d00cro9as
Having said that, for the client I have been looking into this for, the non-displayed text is indexed, but the last cache date is 21st October, which some people in the thread say will change after the next crawl/cache.
Just wondering what your (or anyone else's) thoughts on this are?
All Best
Dan
-
Ah ha! I see, it included the full URL in the link code.
Thanks Rafa, yes I see similar flux with all my other clients now, and none have dodgy links, so I presume it's just algorithmic flux; will review in a week or two.
all best
dan
-
Hi Dan,
I have just clicked on the link you provided.
Since the new Penguin is still rolling out, and most ranking changes at the moment are down to this algo refresh, I would suggest looking at your link profile for a start, and if there is nothing wrong there, simply wait a couple of weeks until the refresh has officially finished and take it from there...
-
Great, thanks for the reassurance Mick!
-
Yep, sounds good.
I was working on a site last year and they switched a DNN module, as in your scenario, without letting me know, having already tested the existing module. The first I saw of it was when rankings and traffic wobbled. In this case the text was lost in the JavaScript and accounted for about 25-30% of the content on all their main pages. Nightmare!
-
Great, thanks Mick.
I have done this now and all the normally hidden body copy/content now shows, so I presume that means G can see it and I no longer need to worry about this.
-
You want Settings >> Show Advanced Settings >> (Privacy) Content Settings >> (JavaScript) Do not allow any site to run JavaScript >> Finished.
Reload the site and check what you can see, or open up.
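The same check can be scripted: a crawler that doesn't execute JavaScript only sees the raw HTML, so searching the fetched source for the hidden copy tells you whether it reaches Google at all. A minimal Python sketch (the URL and text snippet are placeholders, not from this thread):

```python
import urllib.request

def visible_without_js(html: str, snippet: str) -> bool:
    """True if the snippet appears in the raw HTML, i.e. a crawler
    that does not execute JavaScript can still see it."""
    return snippet in html

# Offline check with a stub page: CSS-hidden text is still in the source.
stub = '<div style="display:none">text behind the read-more button</div>'
print(visible_without_js(stub, "text behind the read-more button"))  # True

# Against a live page (hypothetical URL -- swap in the client's page
# and the first sentence of the copy hidden behind 'read more'):
# html = urllib.request.urlopen("https://www.example.com/").read().decode("utf-8")
# print(visible_without_js(html, "first sentence of the hidden copy"))
```

If the snippet is missing from the raw source, that content only exists after script execution, which is exactly the risky case discussed above.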
-
OK, have done this now and all the normally hidden body copy/content now shows, so I presume that means G can see it.
-
Yes, if the date of the cache is prior. So I would suggest disabling JavaScript in the browser, reloading the page, and seeing if the expected text is displayed. If not, that's what Google misses.
...and yes, Google should show all the text in the cached version (text-only) if the cached version is subsequent to your amendment.
-
Sorry, just to confirm...
If the body copy displayed in GWT under "This is how Googlebot fetched the page:" does NOT show the text that's revealed after clicking the 'read more' button, is that OK, since if it were a problem it would be listed as one (such as JavaScript blocked, etc.)?
OR
Is it a problem, since Google's not seeing the rest of the body copy?
thanks
dan
-
OK, thanks Rafa, that's good news.
The rankings must just be fluctuation or the impact of recent G algo updates, since there have been no other changes to the site apart from the addition of some exact-match anchor text links to product pages and more copy in product descriptions.
Will see how the next ranking report performs and look into it further then if there are more drops or no bounce back.
Re: the 404, you're correct; how did you know without the domain part of the URL? Thanks, I'll tell the dev.
Really appreciate all your help Rafa! Thanks again!
All Best
Dan
-
Partial doesn't necessarily mean there is a problem. Check this article by Google: https://support.google.com/webmasters/answer/6066472?hl=en
If that font is the only thing not loading then it's not a problem for crawlers and it wouldn't have affected your rankings.
Btw, that link to the font returns a 404 error? Why are you loading fonts from a different website in the first place? Have them loaded from your own site or from Google.
-
Thanks Rafa. OK, done that, and the only listed issue is:
Googlebot couldn't get all resources for this page. Here's a list:
/fonts/glyphicons-halflings-regular.woff
So I'm not sure if that refers to actual body copy or just some font style or similar?
As I mentioned before, the status of the fetch is 'partial' rather than 'complete', so I presume that means an issue, or does that just relate to 'G couldn't get all resources'?
Thanks, Dan
-
The cached version might still be of the page before they made changes to it, Mick.
-
Thanks Mick. I searched cache:www.yoursite (client's homepage URL) and it's showing as it usually does, with just the first couple of paragraphs and then the read more button/link.
Are you saying that when doing the above (searching the cache etc.) it should show all the content as if I had clicked the 'read more' button, and if it doesn't then there is an issue?
cheers
dan
-
Click on it and look at the list of issues. Are there any JavaScripts blocked, unreachable, etc.? Is the preview complete, or are elements missing? Is the render of this particular page (the one that lost rankings) different to other pages on your website? Talk to your web developers about this and get them to fix any issues there. If there are no issues, then the reason for your loss of rankings is somewhere else.
-
Either switch javascript off in the browser or search cache:www.yoursite and see if you spot any content missing.
-
OK, I've done that, but the status is saying 'partial' not 'complete', so I take it that means there is an issue?
-
thanks Rafal will do that now
-
Collapsible divs use jQuery, which is JavaScript. I don't think the rankings drop has anything to do with it, unless there is an error which prevents crawlers from accessing the text content. Fetch and render the page in WMT to see if there are problems.
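For what it's worth, the implementation detail matters to crawlers: with a CSS/collapsible-div toggle the full copy ships in the initial HTML and only its visibility changes, whereas copy that JavaScript inserts after a click is absent from the source a non-rendering crawler fetches. A small illustrative sketch (the markup is hypothetical and trimmed to the essentials):

```python
# Two ways to implement a 'read more', from a crawler's point of view.

# 1. Collapsible div: the full copy is in the initial HTML and is
#    merely hidden by CSS until the button toggles it visible.
collapsible = (
    '<p>Intro paragraph.</p>'
    '<div class="more" style="display:none"><p>The remaining body copy.</p></div>'
    '<button>Read more</button>'
)

# 2. JS-injected: the HTML holds only a placeholder; the copy is
#    inserted by script after a click, so it never appears in the source.
js_injected = (
    '<p>Intro paragraph.</p>'
    '<div id="more"></div>'
    '<button>Read more</button>'
)

snippet = 'The remaining body copy.'
print(snippet in collapsible)  # True: a non-rendering crawler sees it
print(snippet in js_injected)  # False: the crawler misses it entirely
```

The dev's "done in javascript but on page load the text is defaulting to show" suggests the first pattern, which is why the text should still be crawlable, but fetch-and-render is the way to confirm.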