Does collapsing content impact Google SEO signals?
-
Recently I have been promoting custom long-form content development for major brand clients. For UX reasons, we collapse the content so only 2-3 sentences of the first paragraph are visible. However, there is a "read more" link that expands the entire content piece.
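(For clarity, here is a minimal sketch of the pattern being described; the element IDs and class names are hypothetical. The key point is that the full text stays in the HTML source, so crawlers can read it, and the "read more" link only toggles visibility client-side.)

```html
<!-- Full content is present in the HTML source, so it is crawlable;
     only its visibility is toggled on the client. -->
<div id="long-form-content" class="collapsed">
  <p>First paragraph preview text…</p>
  <p>Remaining paragraphs, hidden by CSS until expanded…</p>
</div>
<a href="#" id="read-more">Read more</a>

<style>
  /* While collapsed, hide everything after the first paragraph */
  #long-form-content.collapsed p:not(:first-child) { display: none; }
</style>

<script>
  document.getElementById('read-more').addEventListener('click', function (e) {
    e.preventDefault();
    document.getElementById('long-form-content').classList.remove('collapsed');
    this.style.display = 'none';
  });
</script>
```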
I have always believed that searchbots would have no problem crawling, indexing, and applying a positive SEO signal for this content. However, I'm starting to wonder. Is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content?
-
Thanks EGOL. Still looking for additional evidence about this.
-
Well... yup. I know many SEOs who think that the collapsible area is just not important enough for Google to consider it.
Good luck
-
If I see a study, I'll post a link here.
-
Yep, I completely agree with your response. Unfortunately I'm in a position where I manage major enterprise accounts with multiple stakeholders (including some people who are not educated in SEO). Every major change we propose needs to be documented, cited and reviewed. When making an argument for content expansion I would need to cite thorough research (a Moz study, documentation on Search Engine Land, etc.).
Anyway, thanks for taking the time to share your feedback and advice on this thread. Although this is not the answer I wanted to hear (i.e., that Google doesn't respect collapsed content), it's very likely accurate. This is a serious SEO issue that needs to be addressed.
-
Are there any case studies about this issue?
Just the one that I published above. The conclusion is... be prepared to sacrifice 80% of your traffic if you hide your valuable content behind a preview.
I would be asking the UX people to furnish studies showing that hiding content produces better sales.
We have lots of people raving about the abundance of content on our site, the detailed product descriptions, how much help we give them to decide what to purchase. All of this content is why we dominate the SERPs in our niche and that, in many people's eyes, is a sign of credibility. Lots of people say... "we bought from you because your website is so helpful". However, if we didn't have all of this content in the open these same people would have never even found us.
Nobody has to read this stuff. I would rather land on a website and see my options than land on a website and assume that there was no information because I didn't notice that the links to open it were in faded microfont because the UX guys wanted things to be tidy. I believe that it is a bigger sin to have fantastic content behind a click-through than it is to put valuable information in the open and allow people the opportunity to read it.
Putting our content out in the open is what makes our reputation.
I sure am glad that I am the boss here. I can make the decisions and be paid on the basis of my performance.
-
We are applying 500 to 800+ word custom content blocks to our client landing pages (local landing pages) that show a preview of the first paragraph and a "read more" expansion link. We know that most website visitors only care about the location info on these particular landing pages. We also know that our client UX teams would certainly not approve an entirely visible content block on these pages.
Are there any case studies about this issue? I'm trying to find bona fide research to help back up our argument.
-
It was similar to a Q&A. There was a single sentence question and a paragraph of hidden answer. This page had a LOT of questions and a tremendous amount of keywords in the hidden content. Thousands of words.
The long tail traffic tanked. Then, when we opened the content again the traffic took months to start coming back. The main keywords held in the SERPs. The longtail accounted for the 80% loss.
-
How collapsed was your content? Did you hide the entire block? Only show a few sentences? I'm trying to find a research article about this. This is a MAJOR issue to consider for our SEO campaigns.
-
Yes, that is a very legitimate concern of mine. We have invested significant resources into custom long-form content for our clients and we are very concerned this was all for nothing...or possibly worse (the content being discounted).
-
Recently I had a related issue with a top-ranking website for very competitive queries.
Unfortunately the product department made some changes to the content (UI only) without consulting the SEO department. The only change worth mentioning was moving the first two paragraphs into a collapsible DIV showing only the first 3 lines + a "read more" button. The text in the collapsible div was crawlable and visible to search engines. (It's also worth mentioning that these paragraphs...)
But the site lost its major keyword positions 2-3 days later. Of course we reverted the changes, but even now, two months later, the keywords are only very slowly moving back to their "original" positions.
For years I believed what Google stated: that you can use collapsible content as long as you are not trying to inject keywords or inflate the amount of content, etc. Not anymore.
I believe that by placing content under a collapsible div element, we are actually signaling to Google that this piece of content is not that important (that's why it is hidden, right? Otherwise it would be in plain sight). So why should we expect Google to give this content full weight as a ranking factor?
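(By contrast, here is a sketch of the pattern that is definitively invisible to crawlers that don't execute scripts: the expanded text is fetched only on click and never appears in the initial HTML. The endpoint and element names are hypothetical.)

```html
<!-- Anti-pattern for SEO: the expanded text is NOT in the initial HTML.
     A crawler that never runs this click handler never sees it. -->
<div id="long-form-content"><p>Preview paragraph…</p></div>
<a href="#" id="read-more">Read more</a>

<script>
  document.getElementById('read-more').addEventListener('click', async (e) => {
    e.preventDefault();
    const res = await fetch('/api/content/expanded'); // hypothetical endpoint
    document.getElementById('long-form-content')
            .insertAdjacentHTML('beforeend', await res.text());
  });
</script>
```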
-
About two years ago I had collapsed content on some important pages. Their long-tail traffic went into a steady slide, but the head traffic held. I take this as a sign that the collapsed content was discounted, removing or lowering its ability to count in the rankings for long-tail queries.
I expanded the pages, making all content visible. A few months later, long-tail traffic started to slowly rise. It took many months to climb back to previous levels.
Since then, every word of my content has been in the open.