Using "Read More" buttons as a tool to cram in Content
-
Hi Mozzers!
Let's say our website is clean, professional, and minimalistic.
Can we use a "read more" button that expands the text on the page, increasing the amount of content without impacting the appearance unless it's clicked?
I want to make sure I am not violating Google's Webmaster Guidelines on "hidden text".
Thanks!
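For context, the usual way this is implemented keeps the full text in the page source and only collapses it visually until the visitor clicks the button. A minimal sketch (the element ID, wording, and toggle approach are placeholder assumptions, not anything Google prescribes):

```html
<!-- The extra copy is present in the HTML and crawlable; it is only
     visually collapsed until the visitor chooses to expand it. -->
<div id="extra-copy" hidden>
  <p>Additional descriptive content that would otherwise clutter the design...</p>
</div>
<button type="button"
        onclick="document.getElementById('extra-copy').hidden = false; this.remove();">
  Read more
</button>
```

Whether that counts as user-friendly progressive disclosure or as hidden text is exactly what this question is asking, so treat the snippet as mechanics only.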
-
I was literally about to post the same question. I've seen a fair few competitor sites doing this: not wanting to taint the design of the page, they just add a keyword-stuffed block behind a <!--more--> tag in the footer.
Black, grey, or white hat on this one? I have a client who has insisted on a design-block-heavy site, which makes it very difficult to optimize. Adding something like this could be very useful.
Related Questions
-
Google Webmaster Tools -> Sitemap sudden "indexed" drop
Hello Moz, We had a massive SEO drop in June due to unknown reasons and we have been trying to recover since then. I've just noticed this yesterday and I'm worried, see: http://imgur.com/xv2QgCQ. Could anyone help by explaining what would cause this sudden drop and what exactly it translates to? What is strange is that our index status is still strong at 310 pages, no drop there: http://imgur.com/a1sRAKo. And when I do a Google search for site:globecar.com everything seems normal, see: http://imgur.com/O7vPkqu. Thanks,
Intermediate & Advanced SEO | GlobeCar
-
Rel="self" and what to do with it?
Hey there Mozzers, Another question about a forum issue I encountered. When a forum thread has more than just one page as we all know the best course of action is to use rel="next" rel="prev" or rel="previous" But my forum automatically creates another line in the header called Rel="self" What that does is simple. If i have 3 pages http://www.example.com/article?story=abc1
Intermediate & Advanced SEO | | Angelos_Savvaidis
http://www.example.com/article?story=abc2
http://www.example.com/article?story=abc3 **instead of this ** On the first page, http://www.example.com/article?story=abc1 On the second page, http://www.example.com/article?story=abc2 On the third page, http://www.example.com/article?story=abc3: it creates this On the first page, http://www.example.com/article?story=abc1 So as you can see it creates a url by adding the ?page=1 and names it rel=self which actually gives back a duplicate page because now instead of just http://www.example.com/article?story=abc1 I also have the same page at http://www.example.com/article?story=abc1?page=1 Do i even need rel="self"? I thought that rel="next" and rel="prev" was enough? Should I change that?0 -
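One common way to keep the pagination annotations while neutralising the auto-generated ?page=1 duplicate is a canonical pointing at the clean URL. A rough sketch of the <head> of the middle page, reusing the example URLs from the question (whether the forum software allows these tags to be overridden is an assumption):

```html
<!-- Hypothetical head markup for page 2 of the 3-page thread. The canonical
     points at the clean URL, so a stray ?page=1 / ?page=2 variant is
     consolidated rather than indexed as a duplicate. -->
<link rel="canonical" href="http://www.example.com/article?story=abc2">
<link rel="prev" href="http://www.example.com/article?story=abc1">
<link rel="next" href="http://www.example.com/article?story=abc3">
```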
Using rel="nofollow" when link has an exact match anchor but the link does add value for the user
Hi all, I am wondering what peoples thoughts are on using rel="nofollow" for a link on a page like this http://askgramps.org/9203/a-bushel-of-wheat-great-value-than-bushel-of-goldThe anchor text is "Brigham Young" and the page it's pointing to's title is Brigham Young and it goes into more detail on who he is. So it is exact match. And as we know if this page has too much exact match anchor text it is likely to be considered "over-optimized". I guess one of my questions is how much is too much exact match or partial match anchor text? I have heard ratios tossed around like for every 10 links; 7 of them should not be targeted at all while 3 out of the 10 would be okay. I know it's all about being natural and creating value but using exact match or partial match anchors can definitely create value as they are almost always highly relevant. One reason that prompted my question is I have heard that this is something Penguin 3.0 is really going look at.On the example URL I gave I want to keep that particular link as is because I think it does add value to the user experience but then I used rel="nofollow" so it doesn't pass PageRank. Anyone see a problem with doing this and/or have a different idea? An important detail is that both sites are owned by the same organization. Thanks
Intermediate & Advanced SEO | | ThridHour0 -
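For what it's worth, the markup being described would look roughly like this (the target URL is a made-up placeholder, since the question doesn't give it):

```html
<!-- Exact-match anchor kept for readers, marked nofollow so the link is not
     treated as an endorsement between the two commonly owned sites. -->
<a href="https://example.org/brigham-young" rel="nofollow">Brigham Young</a>
```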
Duplicate content within sections of a page but not full-page duplicate content
Hi, I am working on a website redesign. The client offers several services, and some elements of those services cross over with one another. For example, they offer a service called Modelling, and when you click onto that page several elements that make up that service are featured, in this case 'mentoring'. Mentoring is common to other services and will therefore feature on other service pages. The page will feature a mixture of content unique to that service and small sections of duplicate content, and I'm not sure how to treat this. One thing we have come up with is to take the user through to a unique page that hosts all the content; however, some features do not warrant having a page created for them. Another idea is to have the feature pop up with inline content. Any thoughts/experience on this would be much appreciated.
Intermediate & Advanced SEO | J_Sinclair
-
Canonical use when dynamically placing items on an "all products" page
Hi all, We're trying to get our canonical situation straightened out. We have a section of our site with 100 product pages in it (in our case a city with hotels that we've reviewed), and we have a single page where we list them all out: an "all products" page called "all.html". However, because 100 is a lot for a user to see at once, we plan to first show only 50 on "all.html". When the user scrolls down to the bottom, we use AJAX to place another 50 on the page (these come from another page called "more.html" and are placed onto "all.html"). So, as you scroll down from the front end, you see "all.html" with 100 listings. We have other listings pages that are sorted and filtered subsets of this list with little or no unique content, so we want to place a canonical on those pages. Should the canonical point to "all.html"? Would spiders get confused because they see that "all.html" only contains half the listings? Is it dangerous to dynamically place content on a page that's used as a canonical target? Is this a non-issue? Thanks, Tom
Intermediate & Advanced SEO | TomNYC
-
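As a sketch of the setup being described (the domain and query parameter are invented for illustration), each sorted or filtered subset page would carry a canonical pointing at the full listing:

```html
<!-- On a sorted/filtered subset page, e.g. /boston/all.html?sort=price
     (URL pattern is hypothetical), point the canonical at the full
     listing page: -->
<link rel="canonical" href="https://www.example.com/boston/all.html">
```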
Best strategy for "product blocks" linking to a sister site? Penguin penalty?
Here is the scenario: we own several different tennis-based websites and want to maximize traffic between them. Ideally we would have them all in one site/domain, but two of the three are a partnership we own 50% of, which is why they sit on a separate domain. The big question is how do we link the "products" from the two different websites without looking spammy? Here is the breakdown of sites:
Site1: tennis retail website, about 1,200 tennis products.
Site2: tennis team and league management site, about 60k unique visitors/month.
Site3: tennis coaching tip website, about 10k unique visitors/month.
The interesting thing was that right after we launched the retail store website (Site1), Google was cranking up and sending upwards of 25k search impressions/day within the first 45 days. Orders kept trickling in and it was doing well overall for a first launch. Google impressions peaked at about 60 days post launch and then started trickling down farther and farther, and are now at about 3k-5k impressions/day. Many keyword phrases that were originally on page 1 (positions 6-10) are now on pages 3-8 instead. The next step was to start putting "product links" (3 products per page) on Site2 and Site3: about 10k pages in total with about 6 links per page off to the product pages (1 per product and 1 per category). We actually divided up about 100 different products to be displayed, so this would mean about 2k links per product depending on the page. FYI, those original 10k pages from Site2 and Site3 already rank very well in Google and have been indexed for the past 2+ years. The most popular word on the sites is tennis, so they are very related. Our rationale was "all the websites are tennis related", and we figured links to the latest and greatest products would be good for our audience. Pre-Penguin, we also figured this strategy would help us rank for these products when users search for them. Since traffic has gone down and down from the peak 45 days ago, we are thinking Penguin doesn't like all these links. So what to do now? How do we fix it and make Penguin happy? Here are a couple of my thoughts on fixing it:
1. Remove the "category link" in our "product grouping", which would cut the links down by a third.
2. Place a "nofollow" on all the other "product links". This would still let us get user clicks from these links while the user is on that page.
3. On our homepages (Site2 & Site3), place 3 core products that change frequently (weekly) and showcase the latest and greatest products/deals. The thought is NOT to use "nofollow" on these links since it is the homepage and only about 5 links overall.
Heck, part of me debated taking our top 1,000 pages (of the 10k) and putting the links ONLY on those, distributing about 500 products across them, so this would mean only 2 links per product; it would still mean about 4k links going there, though. Still thinking #2 above could be better? Any other thoughts would be great! Thanks, Jeremy
Intermediate & Advanced SEO | jab1000
-
Using the right Schema.org type - and is there a penalty for using the wrong one?
Hi, We have a set of reviewed products (in this case restaurants) that total an average rating of 4.0/5.0 from 800-odd reviews. We know to use schema.org/Restaurant for the individual restaurants we promote, but what about for a list by city, say restaurants in Boston for example? For the page containing all of the Boston restaurants, should we use schema.org/Restaurant (but it's not one physical restaurant) or schema.org/Product plus an aggregate review score? What do you do for your product listing pages? If we get it wrong, is there a penalty? Or is this simply up to us?
Intermediate & Advanced SEO | xoffie
-
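One possible shape for the city listing page, sketched with schema.org's ItemList wrapping Restaurant items (all names and numbers are placeholders, and this is just one option rather than guidance from the thread):

```html
<!-- Hypothetical markup for a "Restaurants in Boston" listing page: an ItemList
     of Restaurant entries rather than a single Restaurant, since the page is
     not one physical restaurant. Values are illustrative only. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "name": "Restaurants in Boston",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": {
        "@type": "Restaurant",
        "name": "Example Bistro",
        "aggregateRating": {
          "@type": "AggregateRating",
          "ratingValue": "4.0",
          "reviewCount": "800"
        }
      }
    }
  ]
}
</script>
```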
Fetch as Googlebot "Unreachable Page"
Hi, We are suddenly getting an "Unreachable Page" error when any page of our site is accessed as Googlebot from Webmaster Tools. There are no DNS errors shown in "Crawl Errors". We have two web servers, named web1 and web2, which sit behind a software load balancer, HAProxy. The same network configuration has been working for over a year now and we never had any Googlebot errors before the 21st of this month. We checked whether there could be an error in the sitemap, .htaccess or robots.txt by taking the load balancer out of the path and pointing DNS to web1 and web2 directly; Googlebot was then able to access the pages properly and there was no error. But when the load balancer was made active again by pointing the DNS back to it, the "Unreachable Page" error started appearing again. The website is perfectly accessible from a browser and there are no DNS errors either, as shown by "Crawl Errors". Can you guide me on how to diagnose the issue? I've tried all sorts of combinations, even removed the firewall, but no success. Is there any way to get more details about the error instead of just "Unreachable Page"? Regards, shaz
Intermediate & Advanced SEO | shaz_lhr