Can Googlebot crawl the content on this page?
-
Hi all,
I've read Google's documentation on Ajax and JavaScript (https://support.google.com/webmasters/answer/174992?hl=en) and also this post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really.
I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot, and it appears not to be. I perused the sitemap and don't see any of the ugly Ajax URLs included as Google suggests doing. Also, the page is definitely indexed, but the review content appears to be indexed only via its original sources (Yahoo!, Citysearch, Google+, etc.).
I understand why they are using this dynamic content: it looks nice to end users and requires little to no maintenance. But is it providing them any SEO benefit? It seems to me that it would be far better to take these reviews and simply build them into the HTML.
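For what it's worth, here's a minimal sketch of what "building them into HTML" could look like server-side. The review data below is invented for illustration; the point is just that the text ends up in the raw source that Googlebot fetches instead of being injected by a script.

```python
# Hypothetical sketch: render reviews into static HTML at build/serve time
# so the text is present in the page source. The review data is invented
# for illustration; a real site would pull it from wherever the widget
# currently gets it.
from html import escape

reviews = [
    {"author": "J. Smith", "source": "Citysearch", "text": "Great service, would recommend."},
    {"author": "A. Jones", "source": "Yahoo!", "text": "Quick turnaround on my Passat."},
]

def reviews_to_html(items):
    """Return an HTML fragment with each review as a blockquote."""
    parts = []
    for r in items:
        parts.append(
            "<blockquote><p>{}</p><footer>{} via {}</footer></blockquote>".format(
                escape(r["text"]), escape(r["author"]), escape(r["source"])
            )
        )
    return "\n".join(parts)

print(reviews_to_html(reviews))
```

Generated once (or on each server render), that fragment is plain crawlable text, and the widget could still be layered on top for styling.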
Thoughts?
-
Thanks so much, Bill and Brian. This is exactly what I was thinking. I did the same thing Bill suggested: I took a snippet from one of the reviews, did a verbatim search, and got nothing. That told me that yes, the page is indexed, but the content is not. The fact that the cached version renders the content from the JavaScript only shows that the script was executed, not that any of the content it injects was actually indexed.
From an SEO standpoint, I think this is valuable content that the dealer would very much want indexed. The service providing the JavaScript may be convenient, and most end users can consume the content, but the fact that it isn't searchable means, to me, that it's a lost opportunity.
Thanks again everyone.
-
Bill is right. The page is indexed and cached; however, Googlebot cannot read the reviews. If you view the cache (cache:http://www.vwarcher.com/CustomerReviews) and then click "Text-only version" in the upper right, you'll see that the reviews are not there (Google can't crawl them).
-
Dana,
Yes, Google has indexed the page. However, if you view the source code of the page, you won't see any of the customer review text. The cached version does display the reviews, but that text isn't necessarily on the page: view the source of the cached page and, again, the reviews are not in the source code. They only appear after the JavaScript runs.
Let's take this one step further. Search Google for one of the testimonials, or part of one, in quotes: "Marc Palermo is a great customer service". It is NOT indexed in Google.
Google sees the source code but doesn't appear to be indexing the content.
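To put that view-source test in code: a quick sketch of the difference between what a crawler fetches and what the browser shows after the script runs. Both markup strings are made up to mirror the situation described, not copied from the actual page.

```python
# Simulated check: is a review snippet present in the raw HTML a crawler
# fetches, or only in the DOM after JavaScript runs? Both markup strings
# below are invented to mirror the situation, not taken from the real page.
def contains_text(html, snippet):
    """Case-insensitive substring check on a chunk of markup."""
    return snippet.lower() in html.lower()

# What "view source" shows: an empty container plus the review widget script.
raw_source = '<div id="reviews"></div><script src="/reviews-widget.js"></script>'

# What the browser (and the rendered cache) shows after the script runs.
rendered_dom = '<div id="reviews"><p>Marc Palermo is a great customer service rep.</p></div>'

print(contains_text(raw_source, "Marc Palermo"))    # False: nothing for the crawler to index
print(contains_text(rendered_dom, "Marc Palermo"))  # True: text exists only after JS
```

If the snippet is only in the second string, you have exactly the situation in this thread: rendered but not crawlable.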
-
Hehe, no problem at all, Dana. Glad to have helped stand in for your coffee!
-Andy
-
Hah! Thanks, Andy. I must not have had enough coffee this morning. I didn't even think of looking at the cache... so obvious, lol! Thanks so much. You are spot on.
-
Google appears to have the page cached, so I would say there are no real issues.
Just do a cache:http://www.vwarcher.com/CustomerReviews and you can see what Google currently has.
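If you'd rather build the cache URLs directly (say, to check a batch of pages), something like this should work. The webcache.googleusercontent.com host and the strip=1 text-only parameter reflect how Google has served cached pages, but treat both as assumptions Google can change at any time.

```python
# Build URLs for Google's cached copy of a page. The host and the strip=1
# "Text-only version" parameter are assumptions based on how Google has
# served cached pages; Google can change either at any time.
from urllib.parse import quote

def cache_urls(page_url):
    """Return the rendered and text-only cache URLs for a page."""
    base = ("https://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe=""))
    return {"rendered": base, "text_only": base + "&strip=1"}

urls = cache_urls("http://www.vwarcher.com/CustomerReviews")
print(urls["text_only"])
```

The text-only version is the one that matters here, since it approximates what Google actually extracted from the page.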
-Andy