What is the Ideal Structure for User Generated Product Reviews on My Site?
-
I apologize for the lengthy post, but I need help!
Here is my current structure for product reviews:
My product pages display a set number of user product reviews before showing a link to "see all reviews". So:
The main product page has the product details, specs (usually generic from the manufacturer) and 5 user product reviews. If there are more than 5, there is a link to see all reviews,
where each paginated reviews page displays 10 user product reviews, paginating until all user reviews are displayed.
I am thinking about using the rel canonical tag on the paginated reviews pages to reference back to the main product page. So:
Each paginated reviews page would have the canonical URL of the main product page.
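To illustrate (the domain and paths below are just placeholders, not my real URLs), every paginated reviews page would carry something like this in its head, pointing at the main product page:
<!-- hypothetical head of /product-name/reviews?page=2 -->
<link rel="canonical" href="http://www.example.com/product-name.html" />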
Does this structure make sense? I'm unclear what strategy I should use, but currently the product review pages account for less than 2% of overall organic traffic.
Thanks ahead of time!
-
Thanks for the input, Marcus!
-
Hey Will,
If your current product page has variations that only differ in the reviews they show, then there is not really anything unique on those pages (bar the reviews), and the main content (the product details) is the same.
Maybe something like what Amazon does:
1. Main Product page with some reviews or snippets
2. Reviews page (dynamic) where the primary content is the reviews
Then, the different review pages can rank on their own merit and all that user generated content does not go to waste.
You could use a static URL for your product page and a dynamic URL for your reviews page so it would be something like this:
/category/product-name.html
/category/product-name/reviews/?page=1
/category/product-name/reviews/?page=2
/category/product-name/reviews/?page=3
etc.
Check out these Amazon links:
product:
http://www.amazon.co.uk/Girl-Dragon-Tattoo-Millennium-Trilogy/dp/1847242537
The Amazon links are a bit crazy, but it is a sound concept overall.
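As a rough sketch of that (example.com and the page title are made up), the head of each paginated reviews page could have its own title and a self-referencing canonical, plus rel prev/next to tie the series together:
<!-- hypothetical head of /category/product-name/reviews/?page=2 -->
<title>Product Name Reviews - Page 2</title>
<link rel="canonical" href="http://www.example.com/category/product-name/reviews/?page=2" />
<link rel="prev" href="http://www.example.com/category/product-name/reviews/?page=1" />
<link rel="next" href="http://www.example.com/category/product-name/reviews/?page=3" />
That way each reviews page can be indexed and rank on its own merit instead of being folded back into the product page.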
Hope it helps!
Marcus
-
Hi Marcus,
Perhaps I'll need a few glasses myself to decipher your message? I kid, I kid.
I believe the structure you are referring to is what I currently have: the main product page, plus additional pages with paginated user reviews. The only difference is that your example lists static URLs for the paginated reviews vs. using a page # parameter as I have.
And could you please clarify:
"if you use rel canonical back to your main product page you are losing the benefit of all of those additional reviews."
What would happen in a scenario such as:
- I'm a spider, crawling through your product review pages
- On the 2nd page, I find a very nice, useful, thorough product review
- That 2nd page rel canonicals back to the main product page
- There is a search engine query matching that 2nd-page product review exactly
Would the main product page be listed on the SERPs, or, since the rel canonical points to the main product page, does that 2nd page poof and disappear altogether?
-
Hmm, it's a tricky one, but surely, if you use rel canonical back to your main product page you are losing the benefit of all of those additional reviews.
Just spitballing here, but would it not be better to have the main product page with the first five or so reviews on it, and then create unique, paginated pages for the product reviews with a summary of the product details (so the reviews are the primary content)?
So, we would have:
product-name.html
product-name-reviews-page1.html
product-name-reviews-page2.html
This way, you get lots of nice long-tail potential from the additional review pages that summarise the product, show the unique reviews and make it VERY easy to link back to the main product page to buy?
Seems a shame to have loads of great user-generated reviews and then stop yourself ranking for them. Just make the purpose of the page clear (reviews of the product) and the path to the main page very clear, so user A with concern X can have his fears allayed and click through to buy.
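As a very rough skeleton (the names and domain are made up), one of those reviews pages might look something like this:
<!-- hypothetical product-name-reviews-page1.html -->
<html>
<head>
  <title>Product Name Customer Reviews - Page 1</title>
  <!-- self-referencing canonical so this page can rank in its own right -->
  <link rel="canonical" href="http://www.example.com/product-name-reviews-page1.html" />
</head>
<body>
  <h1>Customer reviews for Product Name</h1>
  <p>Short summary of the product details.</p>
  <!-- the unique reviews for this page go here as the primary content -->
  <p><a href="http://www.example.com/product-name.html">Back to Product Name - check price and buy</a></p>
</body>
</html>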
Had a few glasses of wine, so hope that makes sense.
Related Questions
-
Why do sites w/o structured data beat me for rich snippets?
I can't figure this out. For a number of search terms that I compete for, there are competitors that rank below me, but their pages are featured in a rich snippet. I wanted to see what kind of structured data these sites are providing, thinking maybe there's something I can learn. But when I run these URLs through Google's Structured Data Testing Tool, it tells me these pages contain no structured data! So how is it that Google thinks my page is more relevant (I rank higher) and I have structured data, but Google chooses to feature a different page? Does anyone have ideas on how I can snag these rich snippets for myself?
Intermediate & Advanced SEO | AlexLenhoff
-
.co.uk and .com: Independent sites, but owned by us, sharing some product information
We have two sites, .com and .co.uk. Both are selling sites; the .com sells in $ and the .co.uk in £s.
75% of the text is from the .co.uk site and is used on the .com site. Each site has 6000+ pages, and 4000+ contain product descriptions that are identical. We have looked at canonical and hreflang, but neither seems to fix the duplication issues. We can add rel alternate into the product detail master page, but this will not fix the other potential clashes on the other pages. Can anyone advise if we can add site-wide HTML to each site, or something else that will fix this? Many thanks
Intermediate & Advanced SEO | BruceA
-
Site re-design, full site domain A/B test, will we drop in rankings while leaking traffic
We are re-launching a client site that does very well in Google. The new site is on a www2 domain, which we are going to send a controlled amount of traffic to: 10%, 25%, 50%, 75%, then 100% over a 5-week period. This will lead to a reduction in traffic to the original domain. As I don't want to launch a competing domain, the www2 site will not be indexed until 100% is reached. If Google sees the traffic numbers reducing over this period, will we drop? This is the only part I am unsure of, as the URLs and site structure are the same apart from some new lower-level pages which we will introduce in a controlled manner later. Any thoughts or experience of this type of re-launch would be much appreciated. Thanks, Pete
Intermediate & Advanced SEO | leshonk
-
Regional and Global Site
We have numerous versions of what is basically the same site, targeting different countries such as the United States, the United Kingdom, and South Africa. These websites use TLDs to designate the region, for example .co.uk and .co.za. I believe this is sufficient (with a little help from Google Webmaster Tools) to convince the search engines which site is for which region. My question is: how do we tell the search engines to send traffic from other regions besides the above to our global site, which would have a .com TLD? For example, we don't have a Brazilian site, so how do we drive traffic from Brazil to our global .com site? Many thanks, Jason
Intermediate & Advanced SEO | Clickmetrics
-
Spellcheck necessary for user generated content?
We have a lot of user-generated reviews on our key landing pages. Matt Cutts recommended using correctly spelled content. Would you perform a spellcheck of all already-published user reviews, or would you leave already-published reviews intact and only spellcheck new reviews before they are published? Since the reviews have been marked up using schema.org, I am not sure whether editing lots of reviews after the fact may raise a flag with Google regarding manipulating reviews. Thanks.
Intermediate & Advanced SEO | lcourse
-
Why is Google Still Penalizing My Site?
We got hit pretty hard by Penguin. There were some bad link issues which we've cleared up, and we also had a pretty unique situation stemming from about a year ago, when we changed the name of the company and created a whole new site with similar content under a different URL. We used the same phone number and address, and left the old site up as it was still performing well. Google didn't care for that, so we eventually used 301 redirects to push the link juice from the old site to the new site. That's the background; here's the problem: we've partially recovered, but there are several keywords that haven't come back anywhere near where they were in Google. We have higher PageRank and more links than our competition and are performing in the top 5 for some of our keywords. For other, similar keywords, where we used to be in the top 5, we are now down on page 4 or 5. Our website is www.hudsoncabinetrydesign.com. We build custom cabinetry and furniture in Westchester County, NY, just north of NYC. Examples: for "custom built-ins new york" we are number 3 on Google, number 1 on Bing/Yahoo. For "custom kitchen cabinetry ny" we are number 3 on Bing/Yahoo, not in the top 50 on Google. For "custom radiator covers ny" we used to be #1 on Google, are currently #48, and are currently #2 on Bing/Yahoo. Obviously, we've done something to upset Google, but we've run out of ideas as to what it could be. Any ideas as to what is going on? Thanks so much for your feedback, Doug B.
Intermediate & Advanced SEO | doug_b
-
Optimize a Classifieds Site
Hi, I have a classifieds website and would like to optimize it. The issues/questions I have: a classifieds site has, say, 500 cities. Is it better to create a separate subdomain for each city (http://city_name.site.com) or a subdirectory (http://site.com/city_name)? In each city there will be, say, 50 categories, and these 50 categories are common across all the cities. Hence, the layout and content will be the same, with the only differences being the latest ads from each city, the name of the city, and the URLs pointing to each category in the relevant city. The site architecture of a classifieds site is highly prone to having a lot of content that is not really duplicate content. What is the best way to deal with this situation? I was hit by Panda in April 2011, with traffic going down 50%; the traffic since then has stayed at around the same level. How do I best handle the duplicate content penalty in the case of a site like a classifieds site? Cheers!
Intermediate & Advanced SEO | ketan9
-
Migrating a site with new URL structure
I recently redesigned a website that is now in WordPress. It was previously on some odd, custom platform that didn't work very well. The URLs for all the pages are now more search-engine friendly and more concise. The problem is that Google now has all of the old pages and all of the new pages in its index. This is a duplicate content problem, since the content is the same. I have set up a 301 redirect from every old URL to its new counterpart. I was going to do a remove-URL request in Webmaster Tools, but it seems I need a 404 code, not a 301, on those pages to do that. Which is better for getting the old URLs out of the index: 404 them and do a removal request, or 301 them to the new URL? How long will it take Google to find these 301 redirects and keep just the new pages in the index?
Intermediate & Advanced SEO | DanDeceuster