Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Does collapsing content impact Google SEO signals?
-
Recently I have been promoting custom long form content development for major brand clients. For UX reasons we collapse the content so only 2-3 sentences of the first paragraph are visible. However there is a "read more" link that expands the entire content piece.
I have believed that the searchbots would have no problem crawling, indexing and applying a positive SEO signal for this content. However, I'm starting to wonder. Is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content? -
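Before worrying about ranking signals, it's worth verifying that a crawler which doesn't execute JavaScript can see the collapsed text at all: the block can only earn any signal if the full copy is present in the raw HTML response, rather than injected after the "read more" click. A rough Python sketch of that check (the function names here are illustrative, not from any particular tool):

```python
import re
import urllib.request

def content_in_source(html: str, snippet: str) -> bool:
    """Return True if `snippet` appears in the page's text content.

    Tags are stripped and whitespace is collapsed on both sides, so
    inline markup (links, bold) and line wrapping in the HTML don't
    cause false negatives.
    """
    def normalize(s: str) -> str:
        s = re.sub(r"<[^>]+>", " ", s)         # drop tags
        return re.sub(r"\s+", " ", s).strip()  # collapse whitespace
    return normalize(snippet) in normalize(html)

def page_contains(url: str, snippet: str) -> bool:
    """Fetch a page the way a non-JavaScript crawler would and test
    whether the collapsed copy is in the initial response."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return content_in_source(html, snippet)
```

If `page_contains` comes back False for your collapsed paragraphs, the question of whether Google discounts hidden content is moot: it never saw the content in the first place.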
Thanks EGOL. Still looking for additional evidence about this.
-
Well... yup. I know many SEOs who think that the collapsible area just isn't important enough for Google to consider.

good luck
-
If I see a study, I'll post a link here.
-
Yep, I completely agree with your response. Unfortunately I'm in a position where I manage major enterprise accounts with multiple stakeholders (including some people who are not educated in SEO). Every major change we propose needs to be documented, cited and reviewed. When making an argument for content expansion I would need to use a thorough research example (a Moz study, documentation on Search Engine Land, etc.).
Anyway, thanks for taking the time to share your feedback and advice on this thread. Although this is not the answer I wanted to hear (i.e., Google doesn't respect collapsed content), it's very likely accurate. This is a serious SEO issue that needs to be addressed.
-
Are there any case studies about this issue?
Just the one that I published above. The conclusion is... be prepared to sacrifice 80% of your traffic if you hide your valuable content behind a preview.
I would be asking the UX people to furnish studies that hiding content produces better sales.
We have lots of people raving about the abundance of content on our site, the detailed product descriptions, how much help we give them to decide what to purchase. All of this content is why we dominate the SERPs in our niche and that, in many people's eyes, is a sign of credibility. Lots of people say... "we bought from you because your website is so helpful". However, if we didn't have all of this content in the open these same people would have never even found us.
Nobody has to read this stuff. I would rather land on a website and see my options than land on a website and assume that there was no information because I didn't notice that the links to open it were in faded microfont because the UX guys wanted things to be tidy. I believe that it is a bigger sin to hide fantastic content behind a clickthrough than it is to put valuable information in the open and allow people the opportunity to read it.
Putting our content out in the open is what makes our reputation.
I sure am glad that I am the boss here. I can make the decisions and be paid on the basis of my performance.

-
We are applying 500 to 800+ word custom content blocks to our clients' local landing pages; these show a preview of the first paragraph and a "read more" expansion link. We know that most visitors to these particular landing pages only care about the location info. We also know that our clients' UX teams would certainly not approve a fully visible content block on these pages.
Are there any case studies about this issue? I'm trying to find a bona fide research project to help back up our argument. -
It was similar to a Q&A. There was a single sentence question and a paragraph of hidden answer. This page had a LOT of questions and a tremendous amount of keywords in the hidden content. Thousands of words.
The long tail traffic tanked. Then, when we opened the content again the traffic took months to start coming back. The main keywords held in the SERPs. The longtail accounted for the 80% loss.
-
How collapsed was your content? Did you hide the entire block? Only show a few sentences? I'm trying to find a research article about this. This is a MAJOR issue to consider for our SEO campaigns.
-
Yes, that is a very legitimate concern of mine. We have invested significant resources into custom long-form content for our clients, and we are very concerned this was all for nothing... or possibly worse (the content being discounted).
-
Recently I had a related issue with a top-ranking website for very competitive queries.
Unfortunately, the product department made some changes to the content (UI only) without consulting the SEO department. The only change worth mentioning was moving the first two paragraphs into a collapsible DIV showing only the first 3 lines plus a "read more" button. The text in the collapsible div was still crawlable and visible to search engines.
But the site lost its major keyword positions 2-3 days later. Of course we reverted the changes, but even two months later the keywords are only very slowly moving back to their "original" positions.
For years I believed what Google stated: that you can use collapsible content as long as you are not trying to inject keywords or inflate the amount of content. Not anymore.
I believe that by placing content under a collapsible div element, we are actually signaling to Google that this piece of content is not that important (that's why it is hidden, right? Otherwise it would be in plain sight). So why should we expect Google to treat this content as a major part of our content's ranking weight?
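For incidents like this one, it can help to quantify how much of a page's copy sits inside hidden containers before and after a UI change. A rough audit is possible with Python's standard-library HTML parser; the `display:none` and `hidden`-attribute checks are generic, but the "collapsed" class name is just a guess, so swap in whatever your own markup uses:

```python
import re
from html.parser import HTMLParser

# Tags that never get a closing tag, so they must not affect the stack.
VOID_TAGS = {"br", "img", "hr", "input", "meta", "link", "area", "source"}

class HiddenTextAudit(HTMLParser):
    """Count words inside vs. outside initially-hidden containers."""

    def __init__(self):
        super().__init__()
        self.stack = []          # one bool per open element: is it hidden?
        self.visible_words = 0
        self.hidden_words = 0

    def handle_starttag(self, tag, attrs):
        if tag in VOID_TAGS:
            return
        attrs = dict(attrs)
        hidden_here = (
            "hidden" in attrs
            or re.search(r"display\s*:\s*none", attrs.get("style") or "")
            or "collapsed" in (attrs.get("class") or "").split()
        )
        parent_hidden = self.stack[-1] if self.stack else False
        self.stack.append(bool(hidden_here) or parent_hidden)

    def handle_endtag(self, tag):
        if tag not in VOID_TAGS and self.stack:
            self.stack.pop()

    def handle_data(self, data):
        words = len(data.split())
        if self.stack and self.stack[-1]:
            self.hidden_words += words
        else:
            self.visible_words += words
```

Feed it a page's HTML before and after collapsing and compare the two counters; in the incident above, the two moved paragraphs would have shifted from the visible count to the hidden one.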
-
About two years ago I had collapsed content on some important pages. Their long-tail traffic went into a steady slide, but the head traffic held. I took this as a sign that the collapsed content was discounted, removing it from, or lowering its ability to count toward, the rankings for long-tail queries.
I expanded the page, making all content visible. A few months later, longtail traffic started to slowly rise. It took many months to climb back to previous levels.
Since then, every word of my content has been out in the open.