Does the use of sliders for on-page text affect SEO in any way?
-
The concept of using text sliders on an e-commerce site as a solution for placing SEO text above or between products, high on the page, seems too good to be true... or is it?
How would a text slider for an FAQ or other on-page text, done with sliding paragraphs (similar to, but not exactly, this code: http://demo.tutorialzine.com/2010/08/dynamic-faq-jquery-yql-google-docs/faq.html), affect on-page SEO? Does Google consider it hidden text?
Would there be any other concerns or best practices with this design concept?
-
Fredrik,
This is very helpful and gives me a clearer understanding of how to make this work properly. The example was just that, and meant to explain basic functionality. We'll make sure we end up using an indexable, HTML-based version.
Many thanks for your advice.
ron
-
Hi Ron
As Paul stated, there are many ways of doing sliders. Most of the new sliders out there do work with JavaScript, but they often use already-loaded DOM elements for the slides. That means the actual content is in the HTML and the JavaScript is only used to animate or style it. This content would then be indexed just as a normal div would.
You can also use http://www.seobrowser.com/ (the simple option is free) to see the page as Google sees it. If you can read your content there, it should be possible to index it.
One thing to keep in mind is that sliders, as the name implies, often contain more than one slide. If the slider has headings in it, it may be a good idea to make the first slide's heading an H1 and the headings of secondary slides H2s. That way you can place your most important content in the first slide.
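As a minimal sketch of that idea (class names here are just placeholders, not from any particular slider library), the markup for a two-slide slider might look like:

```html
<!-- Slide content lives in the HTML itself, so crawlers can read it.
     The first slide carries the H1 and the most important copy. -->
<div class="slider">
  <div class="slide">
    <h1>Most important heading</h1>
    <p>Primary SEO copy goes in the first slide.</p>
  </div>
  <div class="slide">
    <h2>Secondary heading</h2>
    <p>Supporting copy for later slides.</p>
  </div>
</div>
```

The JavaScript then only animates between these divs; it never generates the text.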
Not sure if you use jQuery, but if you do, http://jquerytools.org/ offers great power and flexibility. Please note that I am NOT connected to them, nor do I work for them; we have just used their scripts on various projects of ours.
I had a quick look at your example and, unfortunately, it would have a very hard time getting indexed, since the content is in the JavaScript. I would consider putting all the content in the HTML and then just hiding and showing sections with jQuery instead.
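A rough sketch of that hide-and-show approach (the class names and FAQ text are hypothetical, just to illustrate the pattern):

```html
<!-- All text is present in the markup; jQuery only toggles visibility. -->
<div class="faq">
  <h3 class="question">How long does shipping take?</h3>
  <p class="answer">Usually 3-5 business days.</p>
  <h3 class="question">What is your return policy?</h3>
  <p class="answer">Returns are accepted within 30 days.</p>
</div>

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
  // Hide the answers initially, then slide each one open when its
  // question is clicked. A crawler reading the raw source still sees
  // every question and answer, because none of the text is built in JS.
  $(function () {
    $('.answer').hide();
    $('.question').click(function () {
      $(this).next('.answer').slideToggle();
    });
  });
</script>
```

Contrast that with a setup where the FAQ text only exists inside a script or is fetched at runtime: a view-source on such a page shows an empty container, which is exactly the situation Paul describes below.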
Have a great day and good luck
Fredrik
-
Hi Paul,
Thank you so much for the detailed answer. Deep down I worried this might be the case.
The truth is that the text in question is there pretty much for SEO reasons only. Do you know of a better way, or another kind of script, that would get the text indexed?
Ron
-
The answer is that it actually depends very much on exactly what kind of coding is used to accomplish the effect, Ron.
In most cases, this kind of slider effect is accomplished using some variation of JavaScript. While Google has said it is "trying" to have its crawlers recognize text from scripts, it almost never works that way.
So it won't be flagged as "hidden" text, because in fact Google won't even consider it to exist on the page.
An easy way to test is to view the source of the page in question - you'll likely see that none of the text actually exists on the page in any form, even in the code.
For the ultimate test, go into Google Webmaster Tools and use the Fetch as Googlebot tool to fetch the page. Then you'll see exactly the content Googlebot sees. It won't see the text, therefore it can't index or rank it. Ergo, no SEO benefit at all.
Where you could get into trouble is if you had text on the page designed to make Googlebot think the page is about one thing, while using this kind of scripted text to show the visitor something completely different and unrelated. Google could then suspect you of cloaking and penalize you accordingly. (Cloaking is intentionally showing Googlebot one thing and the user something different for nefarious purposes.)
But if you're adding the text as a usability enhancement for your visitors in a way that googlebot doesn't happen to understand, you won't get any SEO benefit from it, but you also shouldn't be penalized for it.
Paul