JavaScript late-loaded content not read by Googlebot
-
Hi,
We have a page with some good "keyword" content (a user-supplied comment widget), but a design choice was made previously to late load it via JavaScript. This was done to improve performance, and the overall functionality relies on JavaScript. Unfortunately, since the content is loaded via JS, it isn't read by Googlebot, so we get no SEO value from it.
I've read Google doesn't weigh <noscript> content as much as regular content. Is this true? One option is to load some of the content via <noscript> tags. I just want to make sure Google still reads this content.

Another option is to load some of the content as simple HTML when the page loads. If JavaScript is enabled, we'd hide this "read only" version via CSS and display the more dynamic, user-friendly version. Would changing the display based on whether JS is enabled be deemed cloaking? Non-JS users would see the same thing (and this provides a way for them to see some of the widget's functionality, so it's an overall net gain for those users too).

In the end, I want Google to read the content but am trying to figure out the best way to do so.

Thanks,

Nic
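The second option described above can be sketched in a few lines. This is a minimal illustration only, assuming hypothetical element ids ("comments-static", "comments-dynamic") that are not from the original post: the comments are rendered as plain HTML in the page source, and this script runs only in JS-enabled browsers, swapping the crawlable fallback for the dynamic widget.

```javascript
// Progressive-enhancement sketch: hide the server-rendered comment
// markup and reveal the JavaScript-driven widget. Crawlers and
// non-JS users still get the static HTML version in the source.
function enhanceComments(doc) {
  var staticBlock = doc.getElementById('comments-static');
  var dynamicBlock = doc.getElementById('comments-dynamic');
  if (staticBlock && dynamicBlock) {
    staticBlock.style.display = 'none';   // fallback stays in the HTML source
    dynamicBlock.style.display = 'block'; // shown only when JS actually ran
  }
  return doc;
}
```

Because the fallback is only hidden (not removed or served conditionally by user agent), both crawlers and users receive the same HTML, which is the usual argument that this is progressive enhancement rather than cloaking.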
-
If the content loads too late, you're right, Googlebot may not grab it. However, Google is getting better and better at indexing AJAX content that's loaded after the fact. On one of the sites I work on, we really didn't want to go through the whole process of serving up an HTML snapshot to Googlebot (outlined at http://code.google.com/web/ajaxcrawling/). About a month ago, I did a search in Google based on the AJAX content, and it returned the page, meaning Google is finding that AJAX content and indexing it! They're also indexing comments now (see http://www.searchenginejournal.com/google-indexing-facebook-comments/35594/), like Disqus and Facebook comments. What kind of comment widget are you loading that Google can't get at? Maybe they'll be able to index it soon?
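For reference, the HTML-snapshot scheme linked above works by having the crawler rewrite "hash-bang" URLs (`#!`) into an `_escaped_fragment_` query parameter, which the server answers with a pre-rendered snapshot. A rough sketch of just the URL mapping (the function name is mine, and serving the snapshot itself is a separate server-side step):

```javascript
// Sketch of the URL mapping in Google's AJAX crawling scheme:
// http://example.com/page#!comments  is requested by the crawler as
// http://example.com/page?_escaped_fragment_=comments
function escapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hash-bang: nothing to map
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```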
I would guess that Google devalues <noscript> text, as almost everyone has JavaScript enabled. Otherwise, everyone would be keyword stuffing their <noscript> tags.

The option you outlined sounds like it could work, if you're just taking the content from JavaScript and loading it in the HTML when the user doesn't have JavaScript enabled. Google's AJAX crawling guide actually suggests serving Googlebot a static page instead of the page with AJAX content, which seems much closer to cloaking than the option you're suggesting.