Dust.js Client-side JavaScript Templates & SEO
-
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js.
Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all of the content is served up with JavaScript.
Dust.js has the ability to render your client-side content on the server if it detects Googlebot or a browser with JavaScript turned off, but I’m not sold on this being “safe”.
Read about LinkedIn switching over to Dust.js
Explanation of this: “Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino.”
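For context, my understanding of how that works in practice: the same compiled template gets rendered on the server with Node.js instead of in the browser. A minimal sketch, assuming the dustjs-linkedin npm package (the template name and context data here are just illustrative, not our real code):

```js
// Server-side rendering of a Dust template with Node.js.
// Assumes: npm install dustjs-linkedin
var dust = require('dustjs-linkedin');

// Compile the template source once and register it under a name.
var compiled = dust.compile('Hello, {name}! You have {count} new messages.', 'inbox');
dust.loadSource(compiled);

// Render the registered template against a JSON context; the exact
// same template could instead be loaded and rendered in the browser.
dust.render('inbox', { name: 'Mara', count: 3 }, function (err, html) {
  if (err) throw err;
  console.log(html); // "Hello, Mara! You have 3 new messages."
});
```

The point being that one template definition produces identical HTML whether it runs on the server or the client.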
Basically, what would happen on the backend of our site is this: we would detect the user-agent of all traffic, and once we identified a search bot, we would serve our web pages server-side instead of client-side so the bots can index our site. The server-side and client-side versions will have identical content, and there will be NO black-hat cloaking going on.
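To make that flow concrete, here is a rough sketch of the kind of user-agent check I mean (hypothetical Express middleware; the bot pattern, route, and render helper are placeholders I made up for illustration, not our actual implementation):

```js
// Hypothetical Express route: serve fully rendered HTML to known
// crawlers, and the client-side-rendered shell to everyone else.
var express = require('express');
var app = express();

// Placeholder pattern -- a real list would cover more crawlers.
var BOT_PATTERN = /googlebot|bingbot|slurp/i;

app.get('/products/:id', function (req, res) {
  var userAgent = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(userAgent)) {
    // Crawler: render the same Dust template server-side and send
    // the finished HTML so the content is indexable without JS.
    // renderProductPageServerSide is a hypothetical helper.
    renderProductPageServerSide(req.params.id, function (err, html) {
      if (err) return res.status(500).send('Render error');
      res.send(html);
    });
  } else {
    // Regular browser: send the shell page; the identical Dust
    // template renders client-side from the same JSON data.
    res.sendFile(__dirname + '/public/shell.html');
  }
});
```

Either way, the HTML a bot receives and the HTML a browser ends up with after the client-side render would be the same.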
But this technique is cloaking, right?
From Wikipedia:
“Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable.”
Matt Cutts on Cloaking
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it’s the “present but not searchable” part that gets me.
If our content is the same, are we cloaking?
Should we be developing our site like this for ease of development and performance?
Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines?
Thank you in advance for ANY help with this!