User Reviews Question
-
On my e-commerce site, I have user reviews that cycle through the header section of my category pages. They appear via a snippet of code that the review provider gave me.
My question is: because the actual user-generated content isn't in the page source, does Googlebot not see it? Does Google not treat the page as having fresh content even though the reviews are new? Does the bot only see the snippet of code that pulls in the reviews?
Thanks in advance. Hopefully this question is clear enough.
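For context, a third-party review widget embed usually looks something like this in the page source: an empty container plus a script reference, with the review text itself injected by the browser after the page loads (the domain and file names below are made up):

```html
<!-- Hypothetical review-widget embed: the header markup contains only this -->
<div id="review-carousel"></div>
<script src="https://widgets.example-reviews.com/carousel.js" async></script>
<!-- The review text is fetched and injected by carousel.js at load time,
     so it may never appear in the raw HTML a crawler downloads. -->
```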
-
Sure thing Jeff.
They can help in a few ways:
1. They add content to your pages (without you having to pay anything)
2. As you suggested, they can also send you some long tail traffic
3. Google can display the star rating of whatever it is you're reviewing via rich snippets (example markup is sketched below):
http://www.dannyvince.net/wp-content/uploads/2012/03/engraved-gift-ideas-rich-snippets.jpg
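For reference, star ratings reach Google through schema.org review markup. A minimal hedged sketch using the AggregateRating vocabulary (the product name and figures here are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Engraved Gift Box",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "128"
  }
}
</script>
```

Microdata and RDFa versions of the same vocabulary work as well; the key point is that the rating values must be present in markup Google can parse.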
However, they're a double-edged sword. If you're careful about keyword density and other on-page factors (like I am), letting a bunch of other people write your site's content isn't the best idea.
I'm actually KILLING a competitor's site in the SERPs right now, a site that's mostly UGC/reviews. A lot of this is because my on-page is leaps and bounds above theirs.
-
Thanks for the help Brian and Irving.
In what ways do those reviews help with SEO? Do they help with long-tail keywords as well?
-
Run a spider simulator to see what Googlebot sees, or look at the source code to see if the content is there.
Even if the content being added is not significant, if the content on the pages changes often, that will keep the bots coming back and re-spidering more often.
If it's random reviews rotating in the header, though, it won't really help for SEO. You need real reviews on that page that stay there and accumulate in order to help your SEO.
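To make the source-code check concrete, here is a minimal sketch that fetches a page with a Googlebot-style User-Agent and tests whether a known review phrase appears in the raw HTML. The URL and phrase are placeholders; swap in one of your category pages and a line of review text you know should be cycling:

```python
import urllib.request

# Placeholders: use your own category page URL and a review phrase you expect to find
url = "https://www.example-store.com/category/widgets"
phrase = "Great product, fast shipping"

# Googlebot's published desktop User-Agent string
ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(url, headers={"User-Agent": ua})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# If the phrase is absent from the raw source, the reviews are almost
# certainly injected client-side and may be invisible to a plain crawl
print("found in source" if phrase in html else "NOT in raw source (likely injected by script)")
```

If the phrase shows up in the raw source, the crawler can read it; if not, only the embed code is visible at crawl time.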
-
It depends on the way they're displayed, Jeff.
If the reviews are plain HTML, you bet your butt Googlebot can see them. If they're injected via JavaScript or embedded in Flash... that's another story.
Either way, that's not really the fresh content Google's looking for on-page, in my opinion.
When it comes to freshness, Google wants either a) significant amounts of content added to a site (e.g., blog posts) or b) significant updates to existing content.
So these reviews aren't likely to make much of a difference in terms of your rankings.
-
Related Questions
-
False Soft 404s, Shadow Bans, and Old User Generated Content
What are the best ways to keep old user-generated content (UGC) pages from being falsely flagged by Google as soft 404s? I have tried HTML sitemaps to make sure no page is orphaned, but that has not solved the problem. Could "crawled, currently not indexed" be explained by a shadow ban from Google? I have had problems with Google removing pages from SERPs without telling me about it. It looks like a lot of content is not ranking due to its age. How can one go about refreshing UGC without changing the user's original work?
Technical SEO | STDCarriers
-
Review rich snippet isn't appearing anymore
Hi, we have recently lost the review rich snippet for a product that used to show it consistently for a long time. On top of that, we have also lost the breadcrumb. The markup hasn't changed on these pages, and the GWT structured data testing tool doesn't show any anomalies. A few weeks back we deployed a new property that lists reviews from the first site without actually being under the same domain. Could that be the issue? Could the reviews have been treated as duplicated content somehow? Is there a way for us to confirm this theory? What other factors may have led us to lose those rich snippets? Thanks
Technical SEO | mattam
-
Page that appears on SERPs is not the page that has been optimized for users
This may seem like a pretty newbie question, but I haven't been able to find any answers to it (I may not be looking correctly). My site used to rank decently for the KW "gold name necklace" with this page in the search results: http://www.mynamenecklace.co.uk/Products.aspx?p=302
This was the page that I was working on optimizing for user experience (load time, image quality, ease of use, etc.) since this page was where users were arriving via search. A couple of months ago the Google SERPs started showing this page for the same query instead (also ranked a little lower, but that's not important for this specific question): http://www.mynamenecklace.co.uk/Products.aspx?p=314
This is a white gold version of the necklaces, which is not what most users have in mind when searching for a gold name necklace, so it's much less effective and engaging. How do I tell Google to go back to the old page / give preference to the older page / tell them that we have a better version of the page / etc. without having to noindex any of the content? Both of these pages have value and are for different queries, so I can't canonical them to a single page. As far as external links go, more links are pointing to the yellow gold version than the white gold one. Any ideas on how to remedy this? Thanks.
Technical SEO | Don34
-
Question about construction of our sitemap URL in robots.txt file
Hi all, this is a Webmaster/SEO question. This is the sitemap URL currently in our robots.txt file: http://www.ccisolutions.com/sitemap.xml
As you can see, it leads to a sitemap index file listing two child sitemap URLs. Is this a problem? Wouldn't it be better to list both of those XML files as separate line items in the robots.txt file? Thanks! Dana
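For reference, a sitemap index file referenced once from robots.txt is valid under the sitemaps protocol, and the Sitemap: directive may also appear multiple times if you prefer listing the children directly. A sketch of the multi-line version; the child sitemap filenames are placeholders, not the site's actual files:

```
User-agent: *
Disallow:

Sitemap: http://www.ccisolutions.com/sitemap-products.xml
Sitemap: http://www.ccisolutions.com/sitemap-categories.xml
```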
Technical SEO | danatanseo
-
Google Change of Address with Questionable Backlink Profile
We have a .com domain where we are 301-ing the .co.uk site into it before shutting it down. The client no longer has an office in the UK and wants to focus on the .com. The .com is a nice domain with good trust indicators. I've just redesigned the site, added a wad of healthy structured markup, and had the duplicate content mostly rewritten (still finishing off this job, but I think we got most of it with Copyscape). The site has not so many backlinks, but we're working on this too, and the ones it does have are natural, varied, and from trustworthy sites. We also have a little feature on the redesign coming up in .Net magazine early next year, so that will help.
The .co.uk, on the other hand, has a fair few backlinks (1,489 showing in Open Site Explorer), and I spent a good amount of time matching the .co.uk pages to similar content on the .com so that the redirects would hopefully pass some PageRank. However, approximately a year later, we are struggling to grow organic traffic to the .com site. It feels like we are driving with the handbrake on. I did some research into the backlink profile of the .co.uk, and it is mostly made up of article submissions: a few on 'quality' (not in my opinion) article sites such as ezine, and the majority on godawful and broken spammy article sites and old blogs bought for SEO purposes.
So my question is, in light of the fact that the SEO company that 'built' these shoddy links will not reply to my questions as to whether they received a penalty notification or noticed a Penguin penalty, and the fact that they have also deleted the Google Analytics profiles for the site, how should I proceed? To my mind I have 3 options:
1. Ignore the bad majority in the .co.uk backlink profile, keep up the change of address and 301s, and hope that we can drown out the shoddy links by building new quality ones to the .com, so the crufty links fade into insignificance over time. I'm not too keen on this course of action.
2. Use the disavow tool for every suspect link pointing to the .co.uk site (there's no way I will be able to get the links removed manually). The advice I've seen also suggests submitting a reinclusion request afterwards, but that seems pointless considering we are just 301-ing to the new (.com) site.
3. Disassociate ourselves completely from the .co.uk site: forget about the few quality links to it and cut our losses. Remove the change of address request in GWT and possibly remove the site altogether and return 410 headers for it just to force the issue. Clean slate in the post.
What say you, mozzers? Please help; I'm working myself blue in the face to fix the organic traffic issues for this client and not getting very far as yet.
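On the disavow format mentioned in option 2: Google's disavow file is plain text, one entry per line, with domain: prefixes for whole domains and full URLs for individual pages; lines starting with # are comments. A sketch with placeholder entries only:

```
# Article-directory and bought-blog links pointing at the .co.uk
# (all entries below are placeholders)
domain:spammy-article-directory.example
domain:old-seo-blog.example
http://article-site.example/some-submitted-article.html
```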
Technical SEO | LukeHardiman
-
Can you use aggregate review rich snippets on non-product pages?
It seems like the intended purpose of the aggregate review rich snippet is for an individual product page, like a page for Madden 2013. However, what if you created a single page for all of the football video games you sell and put reviews on it for the different games in that category? Could you still use the aggregate review markup for this page?
Technical SEO | ProjectLabs
-
Site Architecture Question on Ties.com - Navigation
I'm looking at the navigation structure of Ties.com. They have various categories like color, pattern, length, brand, etc. Once you click one of the main categories you get the option to "Narrow Your Choices". The structure starts like this:
(URL 1) ties.com/black-ties
Then when you narrow your search you get this:
(URL 2) ties.com/animal-print+black-ties (notice the + sign)
My question: how does Google see URL 2? Is it just like any other link?
Technical SEO | ErikDster
-
Blogger Blog URL Structure Questions
I'm starting to use my blog more and wanted to ask about an issue I've read about on SEOmoz in the past. I use Blogger instead of WordPress. It's quick and simple, and I have no interest in switching to WordPress for this particular blog. My blog is currently set up as blog.site.com. Is it still important (for SEO reasons) to switch from blog.site.com to site.com/blog? If so, is there a way to do this in Blogger? And if I do this, will my past posts lose their authority when they're redirected to the new URL structure? Rand mentions in this article: http://www.seomoz.org/blog/11-best-practices-for-urls "never use multiple subdomains" - this is an old article, but I've seen this mentioned several times. Does this still hold true? Am I losing out on links to my blog? Thanks in advance.
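For the subdomain-to-subdirectory move itself, the usual pattern on a self-hosted setup is a wildcard 301 in Apache config. This is only a hedged sketch: hosted Blogger does not expose server configuration, so it assumes the blog has been moved to a server you control, and site.com stands in for the real domain:

```apache
RewriteEngine On
# Redirect every URL on blog.site.com to the matching path under site.com/blog/
RewriteCond %{HTTP_HOST} ^blog\.site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/blog/$1 [R=301,L]
```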
Technical SEO | ChaseH