Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Putting content behind 'view more' buttons
-
Hi
I can't find an up-to-date answer to this, so I was wondering what people's thoughts are.
Does putting content behind 'view more' CSS buttons affect how Google sees and ranks the data?
The content isn't put behind 'view more' to trick Google. In fact, if you view the source, the data is all there together; it's hidden so that products appear higher up the page.
Does anyone have insight into this?
Thanks in advance
-
This technique is commonly known as the toggle effect. According to Matt Cutts (in this video: https://www.youtube.com/watch?v=EsW8E4dOtRY), it's pretty common on the web for people to want to be able to say "okay, click here" and then show manufacturer details, show specifications, show reviews. That's a pretty normal idiom at this point; it's not deceptive and nobody's trying to be manipulative. It's easy to see that this is text intended for users, and as long as you're doing that it should not be an issue. But if you were using a tiny link that users can barely see, with six pages of keyword-stuffed text behind it that isn't intended for users, then that is something Google could consider hidden text. If you're just doing it for users, you're in pretty good shape. Here is the reference link: http://searchengineland.com/googles-matt-cutts-on-hidden-text-using-expandable-sections-youll-be-in-good-shape-167753 Hope this will help!
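For context, the pattern being discussed usually looks something like the sketch below: the full text is present in the initial HTML source (so crawlers can read it), and a button simply toggles its visibility. This is an illustrative example, not code from the thread; the element ids and class names are hypothetical.

```html
<!-- Product description: the full text ships in the source; only its visibility changes. -->
<div id="product-details">
  <p>Short summary shown by default.</p>
  <div id="more-details" hidden>
    <p>Manufacturer details, specifications, reviews...</p>
  </div>
  <button type="button"
          onclick="document.getElementById('more-details').toggleAttribute('hidden')">
    View more
  </button>
</div>

<!-- Alternatively, the native <details> element achieves the same toggle
     with no JavaScript at all: -->
<details>
  <summary>View more</summary>
  <p>Manufacturer details, specifications, reviews...</p>
</details>
```

Because the hidden text is delivered in the page source rather than injected on click, it matches the "intended for users" case described in the video, as opposed to invisible keyword-stuffed text.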
-
Yes indeed. If it's genuinely a usability thing and you're really helping people with good content, with extra specific content under a 'read more', that's perfectly fine. But do be aware of the amount: if you only put a single short line in the preview and hide pages and pages under a 'read more', that wouldn't be advised. Generally, though, no worries!
-
I had worried about that too, but Matt Cutts says it's fine, within reason. If it is a clear "read more" with a paragraph or two dropping down, that's normal use. If you have several pages of text dropping down on an otherwise minimalist page, that's probably bad.
It is an old video (2011) but I haven't heard anything more recently and Google hasn't taken it down...