How valuable is content "hidden" behind a JavaScript dropdown really?
-
I've come across a method used by some SEO agencies to fill up pages with somewhat relevant text and hide it behind a JavaScript dropdown. Does Google fall for such cheap tricks?
You can see this method used on these pages, for example (just scroll down to the bottom). It's all in German, but you get the idea:
http://www.insider-boersenbrief.de/
http://www.deko-und-kerzenshop.de/
What is your experience with this way of adding content to a site? Do you think it adds value, or will it get penalised?
-
Hey guys -
Good question here. You are right, JFKORN, that the scenario I described in my post was one where content that should be accessible to Google was hidden behind JavaScript. Of course, Google now indexes JavaScript and can parse it quite well, so I'm not sure that still holds true, but to be safe I still recommend not serving content via JavaScript.
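To be clear about what I mean, the risky setup looks roughly like this (the endpoint and element IDs are just placeholders): the descriptive copy is not in the served HTML at all until a script fetches and injects it.

```html
<!-- Rough sketch of the risky pattern (endpoint and IDs are placeholders):
     the descriptive text is not part of the served HTML; it only appears
     after JavaScript fetches and injects it. -->
<div id="details"></div>
<button type="button" onclick="loadDetails()">Show details</button>

<script>
  // Until this runs, the page source contains no descriptive copy,
  // so the content depends entirely on script execution.
  function loadDetails() {
    fetch('/fragments/product-details.html') // hypothetical endpoint
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.getElementById('details').innerHTML = html;
      });
  }
</script>
```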
It seems to me, though, that you are asking about the opposite scenario, and what they are doing here looks legitimate to me. In my mind, it is no different from simply using a collapsible DIV to put tabs onto a page, like on this page: http://www.rei.com/product/812097/black-diamond-posiwire-quickpack-quickdraw-set-package-of-6. I would say it's fine to do this. But be careful with the content, because you do not want to get into "stuffing" the pages with keywords, which can hurt your rankings even without an official penalty. I've seen this act more as an assumed algorithmic penalty that then went away when the text was removed.
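The tab/collapsible setup, by contrast, keeps everything in the markup. Here is a rough sketch of what I mean (the class names and toggle function are made up): the copy sits in the initial HTML and is only hidden visually, not loaded after the click.

```html
<!-- Rough sketch of a collapsible tab panel (class names and the toggle
     function are made up): the copy sits in the initial HTML and is only
     hidden visually, not loaded on demand. -->
<div class="product-tabs">
  <button type="button" onclick="togglePanel('details-panel')">Product details</button>
  <div id="details-panel" class="tab-panel" hidden>
    <p>All of the descriptive copy lives here in the served HTML,
       rather than being fetched and injected after the click.</p>
  </div>
</div>

<script>
  // Flip the hidden attribute on the chosen panel.
  function togglePanel(id) {
    var panel = document.getElementById(id);
    panel.hidden = !panel.hidden;
  }
</script>
```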
So be careful, but I don't think you'd be doing anything grey-hat here.
-
Thank you for the reply. I checked the link you posted; there's good information there. The only thing I was thinking about: the scenario John described wasn't necessarily content hidden behind an accessible dropdown. I'm still wondering whether this makes any difference to Google. Hiding content from users completely and giving them the choice to display it by clicking a dropdown button seem like two different things to me. One could also do this with CSS alone, just like with CSS dropdown navigation; there wouldn't even have to be any JavaScript involved. It all seems pretty grey-hat to me, though.
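Something like this "checkbox hack" sketch is what I have in mind (class names are made up): no JavaScript at all, the text sits in the HTML source, and CSS only controls whether it is displayed.

```html
<!-- Rough sketch of the CSS-only version (the "checkbox hack"; class names
     are made up). No JavaScript is involved: the text is in the HTML source
     and CSS only controls whether it is shown. -->
<input type="checkbox" id="show-more" class="show-more-toggle" hidden>
<label for="show-more">Show more information</label>

<div class="extra-copy">
  <p>The additional descriptive text lives here in the markup,
     whether or not a visitor ever clicks the label.</p>
</div>

<style>
  .extra-copy { display: none; }                              /* collapsed by default */
  .show-more-toggle:checked ~ .extra-copy { display: block; } /* expanded after click */
</style>
```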
-
UNANIMOUS: don't do it. We had several sites we were working on, from an acquisition, that had content hidden this way. We did some extensive research last month and got consistent feedback that it will be picked up by Google.
The guy's name is John Doherty. He is an active contributor to SEOmoz, and I have read some great SEO articles from him. In this one he gives an example of an SEO audit and what to make sure you look for:
http://www.johnfdoherty.com/seo-facepalms-dont-hide-content-behind-javascript/
He is completely clear: don't do it. We got the same feedback from several other folks in SEO at the agency level.
Good luck.