How valuable is content "hidden" behind a JavaScript dropdown really?
-
I've come across a method used by some SEO agencies to fill up pages with somewhat relevant text and hide it behind a JavaScript dropdown. Does Google fall for such cheap tricks?
You can see this method used on these pages for example (just scroll down to the bottom) - it's all in German, but you get the idea I guess:
http://www.insider-boersenbrief.de/
http://www.deko-und-kerzenshop.de/
What is your experience with this way of adding content to a site? Do you think it is valuable, or will it get penalised?
-
Hey guys -
Good question here. You are right, JFKORN, that the scenario I described in my post was one where content that should be accessible to Google was hidden behind JavaScript. Of course, Google is now indexing JavaScript and can parse it quite well, so I'm not sure that still holds true, but to be safe I still recommend not serving content using JavaScript.
It seems to me, though, that you are asking about the opposite scenario: content that is in the page's HTML but hidden from users until they expand it. What these sites are doing looks legitimate to me. In my mind, it is no different from using a collapsible DIV to put tabs onto a page, like on this page: http://www.rei.com/product/812097/black-diamond-posiwire-quickpack-quickdraw-set-package-of-6. I would say it's fine to do this. But be careful with the content, because you do not want to get into "stuffing" the pages with keywords, which can hurt your rankings even without an official penalty. I've seen this show up as what appeared to be an algorithmic penalty that then went away when the text was removed.
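As an illustration, here is a minimal sketch of the kind of collapsible section being described - the text sits in the HTML source (so crawlers can read it) and is only shown once the visitor clicks a toggle. The IDs, wording and markup are made up, not taken from the sites linked above.

```html
<!-- The descriptive text is in the markup on page load; it is only hidden visually -->
<button type="button"
        onclick="var el = document.getElementById('more-info'); el.hidden = !el.hidden;">
  More information
</button>

<div id="more-info" hidden>
  <p>Descriptive category text that search engines can read in the source,
     but that visitors only see after expanding the section.</p>
</div>
```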
So be careful, but I don't think you'd be doing anything greyhat here.
-
Thank you for the reply. I checked the link you posted; there's good information there. The only thing I was thinking about: the scenario John described wasn't necessarily content hidden behind an accessible dropdown, and I'm still wondering whether that makes any difference to Google. Hiding content from users completely versus giving them the choice to display it by clicking a dropdown button seems different to me. One could also do this using CSS alone, just like with CSS dropdown navigation; there wouldn't even have to be any JavaScript involved. It all seems pretty grey-hat to me, though.
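To illustrate the CSS-only variant mentioned above, here is a hypothetical sketch using a checkbox toggle - no JavaScript involved, and the content is still present in the markup. The class names and text are invented for the example.

```html
<!-- CSS-only collapsible section: content is in the HTML, hidden until the toggle is checked -->
<style>
  .extra-text { display: none; }
  #toggle-extra:checked ~ .extra-text { display: block; }
</style>

<input type="checkbox" id="toggle-extra">
<label for="toggle-extra">Show more</label>

<div class="extra-text">
  <p>Additional descriptive text that is in the markup but only displayed
     once the visitor ticks the toggle.</p>
</div>
```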
-
UNANIMOUS. Don't do it. We had several sites we were working on, from an acquisition, that had content hidden this way. We did some extensive research last month and got consistent feedback that it will be picked up by Google.
This guy's name is John Doherty. He is an active contributor to SEOmoz, and I have read some great SEO articles from him. In this one he gives an example of an SEO audit and what to make sure you look for:
http://www.johnfdoherty.com/seo-facepalms-dont-hide-content-behind-javascript/
He tells you in no uncertain terms not to do it. We got the same feedback from several other folks in SEO at the agency level.
Good luck.
Related Questions
-
Page with "random" content
Hi, I'm creating 300+ pages in the near future, on which the content will basically be as unique as it can be. However, on every refresh - including visits coming from a search engine referrer - I want the actual content, such as a listing of 12 businesses, to be displayed in random order on every hit. So basically we have 300+ nearby pages with unique content, and the overview of those "listings", as I might call them, is displayed randomly. I've built an extensive script and disabled caching for the PHP files of these specific pages, and it works. But what about Google? The content of the pages will still be what it is; it's more that the listings are shuffled randomly to give every business listing a fair shot at a click, and so on. Does anyone have experience with this? I've tried a few things in the past, like a "Last update PHP Month" in the title, which sometimes isn't picked up very well.
Technical SEO | Vanderlindemedia
-
How do I "undo" or remove a Google Search Console change of address?
I have a client that set a change of address in Google Search Console, where they informed Google that their preferred domain was a subdomain, and now they want Google to also consider their base domain (without the change of address). How do I get the change of address in Google Search Console removed?
Technical SEO | KatherineWatierOng
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. The initial report for our web site shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not. Each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. Primarily lots for sale – and it looks like lazy agents have 4 or 5 lots and input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, it shows 40 of them are duplicates.
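One approach often discussed for near-duplicate listing pages (not something suggested anywhere in this thread, and only workable if the IDX system lets you edit the page head) is a canonical tag pointing at a preferred version of the listing. The URL below is purely hypothetical.

```html
<!-- Hypothetical: in the <head> of a near-duplicate listing page, pointing to the preferred listing URL -->
<link rel="canonical" href="https://www.example.com/listings/lot-12-main-street/">
```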
Technical SEO | TIM_DOTCOM
-
"Search Box Optimization"
A client of ours recently received an email from a random SEO "company" claiming they could increase website traffic using a technique known as "search box optimization". Essentially, they are claiming they can insert a company name into the autocomplete results on Google. Clearly, this isn't a legitimate service - however, is it a well-known technique? Despite our recommendation not to move forward with it, the client is still very intrigued. Here is a video of a similar service:
https://www.youtube.com/watch?v=zW2Fz6dy1_A
Technical SEO | McFaddenGavender
-
New "Static" Site with 302s
Hey all, I came across a bit of an interesting challenge recently, one that I was hoping some of you might have had experience with! We're currently in the process of a website rebuild, for which I'm really excited. The new site is using Markdown to create an entirely static site. Load times are fantastic, and the code is clean. Life is good, apart from the 302s. One of the weird quirks I've realized with old-school, non-server-generated page content is that every page of the site is an index.html file in a directory. The result is that www.website.com/page-title will 302 to www.website.com/page-title/. My solution off the bat has been to be super diligent, try to stay on top of the link profile, and send lots of helpful emails to the staff reminding them how to build links, but I know that even the best-laid plans often fail. Has anyone had a similar challenge with a static site and found a way to overcome it?
Technical SEO | danny.wood
-
Duplicate Content
We have a ton of duplicate content/title errors on our reports, many of them showing errors of: http://www.mysite.com/(page title) and http://mysite.com/(page title) Our site has been set up so that mysite.com 301 redirects to www.mysite.com (we did this a couple years ago). Is it possible that I set up my campaign the wrong way in SEOMoz? I'm thinking it must be a user error when I set up the campaign since we already have the 301 Redirect. Any advice is appreciated!
Technical SEO | Ditigal_Taylor
-
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers on the fact that the script generates a sitemap which indicates that every URL in the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemap generators I use for other client sites can be set to use the server's last-modified response or not. What is the best way to generate the sitemap: last mod from the actual time modified, or all set to one date and time?
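For reference, this is roughly what the lastmod field looks like in a sitemap entry; the URLs and timestamps below are invented. The concern described above is that a generator script writes the identical timestamp into every entry rather than each page's real modification time.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-one/</loc>
    <lastmod>2014-03-12T09:30:00+00:00</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/page-two/</loc>
    <!-- identical timestamp on every URL: the pattern in question -->
    <lastmod>2014-03-12T09:30:00+00:00</lastmod>
  </url>
</urlset>
```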
Technical SEO | ShaMenz
-
Which pages to "noindex"
I have read through the many articles regarding the use of Meta Noindex, but what I haven't been able to find is a clear explanation of when, why, or what to use it on. I'm thinking that it would be appropriate to use it on:
legal pages such as privacy policy and terms of use
search results pages
blog archive and category pages
Thanks for any insight on this.
Technical SEO | mmaes
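For reference, the meta robots tag under discussion looks like this; it goes in the head of each page you want kept out of the index (the "follow" value still lets crawlers follow the links on the page).

```html
<!-- In the <head> of a page such as a privacy policy or an internal search results page -->
<meta name="robots" content="noindex, follow">
```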