Dust.js Client-side JavaScript Templates & SEO
-
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js.
Dust.js is a client-side JavaScript templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all of the content is served up with JavaScript.
Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe".
Read about LinkedIn switching over to Dust.js.
Explanation of this: “Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino.”
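To illustrate the "same template in both environments" idea, here is a minimal sketch. This is a hand-rolled stand-in, not the actual Dust.js API; it only shows that a single template string with `{key}` placeholders (the style Dust uses) can produce identical markup whether the function runs in the browser or in Node:

```javascript
// Minimal stand-in for a shared template engine (not the real Dust API).
// The same template source yields the same markup in either environment.
function render(template, data) {
  return template.replace(/\{(\w+)\}/g, (_, key) => (key in data ? data[key] : ''));
}

const productTemplate = '<h1>{name}</h1><p>{price}</p>';
const data = { name: 'Widget', price: '$9.99' };

// In the browser this string would be injected into the DOM;
// on the server (node.js or Rhino) it would be written into the
// HTTP response instead.
console.log(render(productTemplate, data)); // <h1>Widget</h1><p>$9.99</p>
```

Because the template source is identical on both sides, the markup a crawler receives from the server should match what a JavaScript-enabled browser builds on the client.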
Basically, what would happen on the backend of our site is this: we would detect the user-agent of all incoming traffic, and whenever we found a search bot, we would serve our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side will serve identical content, and there will be NO black-hat cloaking going on.
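The user-agent detection step described above could be sketched like this. The bot pattern and function names here are illustrative assumptions, not a complete or production-grade crawler list:

```javascript
// Hypothetical user-agent check (illustrative pattern, not exhaustive).
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// The routing decision: crawlers (and JS-disabled clients) get the
// server-rendered page; everyone else gets the client-rendered one.
// The content behind both modes is identical.
function chooseRenderMode(userAgent) {
  return isSearchBot(userAgent) ? 'server' : 'client';
}

console.log(chooseRenderMode(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // server
console.log(chooseRenderMode('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // client
```

In a real deployment this check would sit in server middleware inspecting the `User-Agent` HTTP header; the key point for the cloaking question is that both branches render the same template with the same data.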
But this technique is cloaking, right?
From Wikipedia:
“Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable.”
Matt Cutts on Cloaking
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me.
If our content is the same, are we cloaking?
Should we be developing our site like this for ease of development and performance?
Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines?
Thank you in advance for ANY help with this!