Ok to use rich snippets for same product on multiple pages?
-
I am developing a new set of pages for a series of products that currently exist on separate subdomains linked to the root domain. The product pages on the subdomains have rich snippets: review count, review score, etc. The new pages I'm building are for the same products, but on the root domain and with different content. I'm not comfortable marking those pages up with rich snippets too, given they will carry the same review counts, scores, etc., though I'd like to if it's viable. Any thoughts or opinions?
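(For context, the rich snippet data in question — review count and score — is typically expressed with schema.org Product/AggregateRating markup. A minimal JSON-LD sketch with placeholder values, not taken from the actual pages:)

```html
<!-- Hypothetical product markup; name and rating values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "87"
  }
}
</script>
```

The worry above is about this same block appearing, with identical values, on both the subdomain page and the new root-domain page.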
Thanks,
Andy
-
Why don't you move everything to the root domain?
You can keep all the subdomains in existence, put in 301 redirects from each subdomain to the root domain, and then ask for any inbound links to be updated.
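As an illustration of the kind of redirect rule meant here — a hypothetical Apache .htaccess on the subdomain's host, with placeholder domain names:

```apache
# Permanently (301) redirect every URL on the subdomain to the
# same path on the root domain. Domains are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^products\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

The 301 passes most of the link equity from the old subdomain URLs to the root-domain pages, which is why updating inbound links afterwards is a nice-to-have rather than a blocker.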
The site's search feature can then also be tweaked to look only at the root domain.
This would simplify the site a lot and make it much easier to manage in the long term.
Just my opinion, but hope it helps.
P.S. Spend the money wisely; we're in a recession, don't you know?
-
Hi, thanks for the reply.
The reason for the new pages for the old products is twofold. 1) The subdomains don't rank as well as they could, simply because they are subdomains: they get little link juice, externally or internally, and there is internal opposition (fear) to moving them to the root domain. The root domain is high authority and ranks well, so showcasing the products on the root domain is a great option. 2) People use the product but search with different terms than the subdomain pages currently target. For example, a London hotel's home page can be optimised for hotel/accommodation keywords, but it would also be useful to be found for 'short breaks in London', so you might choose to create a short-breaks page. Same product, two different approaches and two sets of content required. (Not an actual example.)
I'll take the 2p and spend it; I'm in the UK also.
-
My first question would be: why are you showcasing the same products on different pages? Do you have 'featured products', 'most popular products', or 'special offers', for example, or is it something else?
After considering that, and assuming there's a logical, user-driven reason for doing it, I can't see a major problem as long as the rich snippets are consistent for each product.
Of course, depending on whether this is manual or automatic/CMS-driven, it can become a headache to keep product snippets updated in multiple locations, so on that front I wouldn't encourage it. Strike a balance, though: if you can encourage click-through from the root domain to the subdomain, the snippets will be picked up anyway. Don't make running your site harder work than is strictly necessary.
Just my £0.02 (I'm in the UK, so I can't give you 2 cents ;))