Can Googlebot crawl the content on this page?
-
Hi all,
I've read Google's posts about Ajax and JavaScript crawling (https://support.google.com/webmasters/answer/174992?hl=en) and also this Moz post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really.
I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot. It appears not to be. I perused the sitemap and don't see any of the ugly Ajax URLs included that Google suggests using. Also, the page itself is definitely indexed, but the review content appears to be indexed only via its original sources (Yahoo!, Citysearch, Google+, etc.).
I understand why they are using this dynamic content: it looks nice to an end user and requires little to no maintenance. But is it providing them any SEO benefit? It seems to me it would be far better to take these reviews and simply build them into the HTML.
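To make the distinction concrete, here is a minimal sketch (Python stdlib only; the markup and review text are made up) of what a crawler that does not execute JavaScript can and cannot see:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text while skipping <script> bodies,
    roughly what a non-JS crawler sees."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script:
            self.chunks.append(data.strip())

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(chunk for chunk in parser.chunks if chunk)

# Reviews injected client-side: the text lives only inside the script.
js_page = """<html><body><div id="reviews"></div>
<script>document.getElementById('reviews').innerHTML = 'Great service!';</script>
</body></html>"""

# The same review baked into the HTML server-side.
static_page = "<html><body><div id='reviews'>Great service!</div></body></html>"

print("Great service!" in visible_text(js_page))      # False
print("Great service!" in visible_text(static_page))  # True
```

Googlebot can execute some JavaScript, but as the verbatim-search test discussed in this thread shows, you cannot rely on the injected text making it into the index; reviews built into the HTML server-side do not have that problem.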
Thoughts?
-
Thanks so much Bill and Brian. This is exactly what I was thinking. I did the same thing Bill suggested and took a snippet from one of the reviews, ran a verbatim search, and got nothing. What that told me was that yes, the page is indexed, but not the content. The fact that the cached version renders the content from the JavaScript only shows that the script was executed, not that any of the content it injects was actually indexed.
From an SEO standpoint, I think this is valuable content that the dealer would very much want indexed. The service providing the JavaScript might be very convenient, and the majority of end users can consume the content, but the fact that it isn't searchable means, to me, an opportunity lost.
Thanks again everyone.
-
Bill is right. The page is indexed and cached; however, Googlebot cannot read the reviews. If you view the cache (cache:http://www.vwarcher.com/CustomerReviews) and then click "Text-only version" in the upper right, you'll see that those reviews are not there (Google can't crawl them).
-
Dana,
Yes, Google has indexed the page. However, if you view the page's source code, you won't see any of the customer review text. The cached version does display the reviews, but that only shows the browser executed the script; view the source of the cached page and the reviews still aren't in the HTML.
Let's take this one step further. Search Google, in quotes, for part of one of the testimonials: "Marc Palermo is a great customer service". It is NOT in Google's index.
Google sees the source code but doesn't appear to be indexing the content.
-
Hehe no problem at all Dana. Glad to have helped stand in for your coffee
-Andy
-
Hah! Thanks Andy. Must not have had enough coffee this morning. I didn't even think of looking at the cache...so obvious, lol! Thanks so much. You are spot on.
-
Google appears to have the page cached, so I would say there are no real issues.
Just do a cache:http://www.vwarcher.com/CustomerReviews and you can see what Google currently has.
-Andy
Related Questions
-
Could a dropdown list of products dilute the page content?
Hi all, On our site, because we only have some 120 or so products split across 5 categories, we have a dropdown menu that displays all of the products. Forgetting usability for a moment, my question is whether, by having links to all of the products on each and every page (because they are in the main menu), we are diluting the content on the page. For example, take a particular product: the main phrase I want that page to be discovered for is "perspex sheet". This phrase does appear in the H1, the H2 and the main description of the product, but, as mentioned, each of our pages has some 120+ internal links from the menu, containing all sorts of product names that aren't relevant to "perspex sheet". The Moz report does flag a Medium issue on every page because of the number of internal links. I don't know whether I'm making a fuss about nothing or whether this has serious side effects. It's an eCommerce site, so of course I'm nervous about making changes that could have an adverse effect on our rankings. I thought there used to be a tool on Moz that showed what phrases a page was optimised for, but I can no longer find it. Any help would be greatly appreciated. Regards,
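One way to quantify what Moz is flagging is simply to count the internal links per page. A rough sketch (Python stdlib only; the markup and host name are illustrative, and the commonly cited guideline of roughly 100 links per page is a rule of thumb, not a hard limit):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts <a href> links pointing at the given host (internal links).
    Relative URLs count as internal too."""
    def __init__(self, host):
        super().__init__()
        self.host = host
        self.internal = 0
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        link_host = urlparse(href).netloc
        if link_host in ("", self.host):
            self.internal += 1

sample_html = """<nav><a href="/perspex-sheet">Perspex Sheet</a>
<a href="/acrylic-rod">Acrylic Rod</a></nav>
<a href="http://example.com/about">About</a>
<a href="http://other.com/">External</a>"""

counter = LinkCounter("example.com")
counter.feed(sample_html)
print(counter.internal)  # 3
```

Running that against a real product page would tell you how far over the guideline the menu actually pushes you before deciding whether it is worth restructuring.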
-
Duplicate Content Showing up on Moz Crawl | www. vs. no-www.
Hello Moz Community! I am new to SEO and Moz, and this is my first question. My client is getting flagged for duplicate content: two domains serving the same content, i.e. www.mysite.com and mysite.com. I read into this and set up a 301 redirect through my hosting provider; I evaluated which version had the stronger Page Authority and redirected the weaker one to it. However, I am still getting hit for duplicate pages caused by www.mysite.com and mysite.com being duplicates. How should I go about resolving this? Is this a case for a canonical tag in the head of the HTML? Any direction is appreciated. Thank you. B/R Will H.
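The redirect itself lives in the server config (or the host's control panel), but the mapping it should implement is simple. A sketch in Python to make the rule concrete (example.com stands in for the client's domain; this assumes the bare domain is the preferred version):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, preferred_host="example.com"):
    """Map the www variant onto the preferred bare-domain host,
    preserving path, query and fragment: the mapping a 301 should apply."""
    parts = urlsplit(url)
    host = parts.netloc
    if host == "www." + preferred_host:
        host = preferred_host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_url("http://www.example.com/page?x=1"))
# http://example.com/page?x=1
```

A self-referencing rel=canonical tag on each page pointing at the preferred host is a reasonable belt-and-braces addition, but the 301 alone should clear the duplicate flag once the crawler revisits both versions.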
-
Can Google Crawl This Page?
I'm going to have to post the page in question, which I'd rather not do, but I have permission from the client. Question: a recruitment client of mine had their website built on a proprietary platform by a so-called recruitment specialist agency. Unfortunately the site is not performing well in the organic listings. I believe the culprit is this page and others like it: http://www.prospect-health.com/Jobs/?st=0&o3=973&s=1&o4=1215&sortdir=desc&displayinstance=Advanced Search_Site1&pagesize=50000&page=1&o1=255&sortby=CreationDate&o2=260&ij=0 Basically, as soon as you deviate from the top-level pages you land on pages that have database-query URLs like this one. My take is that Google cannot crawl these pages effectively and is therefore having trouble picking up all of the job listings. I have taken some measures to combat this, and obviously we have an XML sitemap in place, but it seems the pages Google finds via the XML feed are not performing because there is no obvious flow of 'link juice' to them. There are a number of latest jobs listed on top-level pages like this one: http://www.prospect-health.com/optometry-jobs, and when they are picked up they perform OK in the SERPs, which is the biggest clue to the problem outlined above. The agency in question has an SEO department who dispute the problem, and their proposed solution is to create more content and build more links (genius!). Just looking for some clarification from you guys if you don't mind.
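To illustrate why URLs like that are hard on crawlers: every combination of parameter order and value is a distinct URL to a crawler, so a handful of sort and display parameters explodes into many addresses for the same result set. A hedged sketch of normalizing such URLs (parameter names taken from the URL above; which of them actually select different content, as opposed to just re-sorting it, is a guess):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Parameters assumed to select different content; the rest
# (sort order, page size, display instance) only re-arrange it.
CONTENT_PARAMS = {"st", "o1", "o2", "o3", "o4", "ij"}

def normalize(url):
    """Drop presentation-only parameters and sort the rest, so all
    variants of the same result set collapse to one canonical URL."""
    parts = urlsplit(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query) if k in CONTENT_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

a = "http://www.prospect-health.com/Jobs/?st=0&o3=973&sortdir=desc&pagesize=50000&page=1"
b = "http://www.prospect-health.com/Jobs/?page=1&o3=973&st=0&sortby=CreationDate"
print(normalize(a) == normalize(b))  # True
```

A rel=canonical tag emitting the normalized URL on each variant (or, better, clean crawlable URLs for each job category) would concentrate the link equity that the XML sitemap alone cannot provide.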
-
Can I canonical the same page?
I have a site with 500+ listing pages and I would like to rel=canonical them to the master page. Example: http://www.example.com//articles?p=18 OR http://www.example.com/articles?p=65. I plan on adding this to the head section of the page template so it goes out on all pages, which means the canonical will also appear on the page it points to (a self-referencing canonical on the master page). Is this a bad thing? Or allowed?
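A sketch of the tag generation (URL pattern taken from the question; whether pointing every paginated page at page one is the right call is a separate question, see the caveat below):

```python
from urllib.parse import urlsplit

def canonical_tag(url):
    """For a paginated listing URL (?p=N), emit a rel=canonical tag
    pointing at the un-paginated master page."""
    parts = urlsplit(url)
    master = f"{parts.scheme}://{parts.netloc}{parts.path}"
    return f'<link rel="canonical" href="{master}" />'

print(canonical_tag("http://www.example.com/articles?p=18"))
# <link rel="canonical" href="http://www.example.com/articles" />
```

One caveat worth researching first: canonicalizing every paginated page to page one tells Google to ignore the deeper pages' own content, so items listed only on later pages may lose their discovery path; Google's guidance for paginated series has favored marking up the sequence rather than collapsing it.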
-
Can Google show the hReview-Aggregate microformat in the SERPs on a product page if the reviews themselves are on a separate page?
Hi, We recently changed our eCommerce site structure a bit and separated our product reviews onto a different page. There were a couple of reasons we did this: We used pagination on the product page, which meant we got duplicate content warnings. We didn't want to show all the reviews on the product page because this was bad for UX (and diluted our keywords). We thought having a single page was better than paginated content, or at least safer for indexing; we found that Googlebot quite often got stuck in loops, and we didn't want to bury the reviews way down in the site structure. We also wanted to reduce our bounce rate a little, and a separate reviews page could help with this. In the process we tidied up our microformats a bit too. The product page used to have three main microformats: hProduct, hReview-Aggregate and hReview. The product page now only has hProduct and hReview-Aggregate (which is now nested inside the hProduct), and the reviews page has hReview-Aggregate plus an hReview for each review. We've taken care to specify that it's a product review and the URL of that product. However, we've noticed over the past few weeks that Google has stopped feeding the reviews into the SERPs for product pages and is instead only feeding them in for the reviews pages. Is there any way to separate the reviews out and get Google to use the microformats for both pages? Would using microdata be a better way to implement this? Thanks,
-
Advice on display this content on my page for search engines
Hi, my website http://www.in2town.co.uk/Holiday-News is about bringing travel and holiday news to the readers of our lifestyle magazine, but I am having problems at the moment with the layout. What I mean is: I have written content on the page as an introduction so Google knows what this section of the site is about, but to be honest it looks rubbish having the introduction there, and I would like to know if I am doing the right thing by keeping that content there for Google. I have tried taking it away and noticed I dropped in the rankings, and when I put it back up I go up in the rankings. Can anyone please give me some advice on this issue?
-
Images on page appear as 404s to Googlebot
When I fetch my website as Googlebot it returns 404s for all the images on the page. This despite the fact that each image is hyperlinked! What could be causing this issue? Thanks!
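A true 404 usually means the image URLs themselves are wrong (broken relative paths resolving against the wrong directory) or the server is blocking Googlebot's user agent via hotlink protection. The cheapest thing to rule out first is a robots.txt rule hiding the image directory; a sketch with Python's stdlib (the paths here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a stray rule like this would hide every
# image from Googlebot even though browsers load them fine.
robots_txt = """User-agent: *
Disallow: /images/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/images/logo.jpg"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/page.html"))        # True
```

If robots.txt is clean, compare the exact image URLs in the rendered source against what the server actually serves, and check whether requests with Googlebot's user-agent string get a different response than a browser does.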
-
Forget Duplicate Content, What to do With Very Similar Content?
All, I operate a WordPress blog that focuses on one specific area of the law. Our contributors are attorneys from across the country who write about our niche topic. I've done away with syndicated posts, but we still have numerous articles addressing many of the same issues and topics. In some cases 15 posts might address the same issue. The content isn't duplicate, but it is very similar, outlining the same rules of law, etc. An SEO I trust has told me I should 301 some of the similar posts to one authoritative post on the subject. Is this a good idea? Would I be better served implementing canonical tags pointing to the "best of breed" post on each subject? Or would I be better off being grateful that I receive original content on my niche topic and doing nothing? Would really appreciate some feedback. John
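If it helps to decide which posts are similar enough to consolidate, here is a rough sketch (stdlib only; the post texts are made up and the 0.6 threshold is arbitrary) using lexical sequence similarity:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Ratio in [0, 1]: rough lexical overlap between two posts."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

post_a = "The statute of limitations for this claim is two years from the date of injury."
post_b = "The statute of limitations for such claims is two years from the injury date."
post_c = "Our firm attended the annual conference in Chicago last week."

print(similarity(post_a, post_b) > 0.6)  # True
print(similarity(post_a, post_c) > 0.6)  # False
```

Pairs above the threshold are candidates for either a 301 to the strongest post or a canonical pointing at it; pairs below it are probably fine left alone as genuinely distinct takes on the topic.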