User Reviews Question
-
On my e-commerce site, I have user reviews that cycle in the header section of my category pages. They appear/cycle via a snippet of code that the review program provided me with.
My question is: because the actual user-generated content isn't in the page source, does Googlebot not see this content? Does it not treat the page as having fresh content even though the reviews are new? Does the bot only see the snippet of code that pulls in the reviews?
Thanks in advance. Hopefully this question is clear enough.
-
Sure thing Jeff.
They can help in a few ways:
1. They add content to your pages (without you having to pay anything)
2. As you suggested, they can also send you some long tail traffic
3. Google can display the star rating of whatever it is you're reviewing via rich snippets:
http://www.dannyvince.net/wp-content/uploads/2012/03/engraved-gift-ideas-rich-snippets.jpg
However, they're a double-edged sword. If you're careful about keyword density and other on-page metrics (like I am), letting a bunch of people write the content of your site isn't the best idea.
I'm actually KILLING a competitor's site in the SERPs right now, one built mostly on UGC/reviews. A lot of that is because my on-page SEO is leaps and bounds above theirs.
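To make point 3 concrete: the stars come from structured data in the page source. Here's a minimal sketch of schema.org Product + AggregateRating markup built as JSON-LD with Python — the product name and numbers are made up for illustration, not taken from any real site:

```python
import json

def aggregate_rating_jsonld(name, rating_value, review_count):
    """Build a schema.org Product + AggregateRating block as JSON-LD.

    Googlebot reads this from a <script type="application/ld+json"> tag,
    so the markup needs to be in the served HTML, not injected client-side.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": str(rating_value),
            "reviewCount": str(review_count),
        },
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

# Hypothetical product and numbers, for illustration only
print(aggregate_rating_jsonld("Engraved Gift Ideas", 4.6, 27))
```

Drop the resulting tag into the page `<head>` or body; Google's rich snippet testing tool will tell you whether it parses.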
-
Thanks for the help Brian and Irving.
In what ways do those reviews help with SEO? Do they help with long-tail keywords as well?
-
Run a spider simulator to see what Googlebot sees, or look at the page source to check whether the review content is actually there.
Even if the content being added isn't significant, pages whose content changes often will keep the bots coming back and re-spidering more frequently.
If it's random reviews rotating in the header, though, it won't really help with SEO. You need real reviews that stay on the page and accumulate over time to help your rankings.
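You can run the source-code check described above yourself: fetch the raw HTML (what a simple, non-rendering spider receives) and search it for a phrase from one of the reviews. A minimal sketch in Python — the sample pages below are invented to show the two cases:

```python
import urllib.request

def fetch_source(url):
    """Fetch the raw page source, like a simple spider would (no JS execution)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def review_in_source(html, review_phrase):
    """Return True if the review text is present in the raw HTML source.

    A crawler's first pass sees only the served source; content injected
    later by a JavaScript review widget won't be in it.
    """
    return review_phrase.lower() in html.lower()

# Server-rendered review: the text ships in the source.
static_page = "<div class='reviews'><p>Great product, fast shipping!</p></div>"
# Widget-injected review: only an empty container ships in the source.
widget_page = "<div id='review-widget'></div><script src='reviews.js'></script>"

print(review_in_source(static_page, "fast shipping"))  # True
print(review_in_source(widget_page, "fast shipping"))  # False
```

If the phrase only shows up after the page loads in your browser but not in the fetched source, the reviews are being injected client-side.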
-
It depends on the way they're displayed, Jeff.
If the reviews are rendered in the HTML, you bet your butt Googlebot can see them. If they're injected with JavaScript or served in Flash... that's another story.
Either way, that's not really the fresh content Google's looking for on page in my opinion.
When it comes to freshness, Google wants either a) significant amounts of content added to a site (e.g., blog posts) or b) significant updates to existing content.
Either way, these reviews aren't likely to make much of a difference in terms of your rank.
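If you do want the reviews to count as on-page content, the usual fix is to render them into the HTML server-side (e.g., pulled from the review provider's API at build time) rather than injecting them with the provider's script. A framework-free sketch of the idea — the review data here is hypothetical:

```python
import html

def render_reviews(reviews):
    """Render review dicts into static HTML so the text ships in the page source."""
    items = []
    for r in reviews:
        # Escape user-generated text so quotes/ampersands can't break the markup
        items.append(
            "<li><strong>%s/5</strong> %s</li>"
            % (r["stars"], html.escape(r["text"]))
        )
    return "<ul class='reviews'>%s</ul>" % "".join(items)

# Hypothetical reviews, e.g. fetched from the review provider's API at build time
reviews = [
    {"stars": 5, "text": "Great product, fast shipping!"},
    {"stars": 4, "text": "Good value & solid quality."},
]
print(render_reviews(reviews))
```

Because the review text is now part of the served HTML, any crawler sees it without executing JavaScript.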