Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Benefits of Rich Snippets for financial products
-
Does anyone have experience of using rich snippets for non-physical products?
Our website offers a credit card comparison service. Do you think that tagging each card's page with rich snippets such as the card image, name, description and category makes sense?
The idea is to make it stand out in the search results.
-
Thanks for your tips, Tom!
Really useful piece of advice.
We will definitely look into creating a customer reviews box and turning it into a rich snippet.
By any chance, do you have experience with Product markup?
There's an option there to add a product image. If we add it, will the image appear in the SERPs?
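(For illustration, Product markup with an image might look something like the snippet below, using schema.org microdata; the card name, description, category and image URL are placeholders rather than real data, and whether the image actually shows up in the SERPs is ultimately at Google's discretion.)

```html
<!-- Illustrative only: placeholder card details marked up as a schema.org Product -->
<div itemscope itemtype="https://schema.org/Product">
  <!-- itemprop="image" identifies the product image for structured data parsers -->
  <img itemprop="image" src="https://www.example.com/images/example-cashback-card.png"
       alt="Example Cashback Credit Card">
  <span itemprop="name">Example Cashback Credit Card</span>
  <span itemprop="description">0% on purchases for 12 months and 1% cashback on all spending.</span>
  <span itemprop="category">Cashback credit cards</span>
</div>
```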
-
I think there's potential for rich snippets to help drive click-throughs and perhaps make users more likely to convert for these products.
If your site is a credit card comparison site, it may be worth linking each review with a rel=author tag for the person who added the commentary and analysis, as well as giving the user the key details, such as rates, benefits, etc. I'm reminded of Martin Lewis of Money Saving Expert (moneysavingexpert.com) - I can imagine a credit card comparison and review from him being linked via rel=author so that his face appears in the SERPs. That would be an appropriate use that would drive click-throughs, while also giving the link a bit of authority and gravitas, as Martin Lewis is respected in his field here in the UK.
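(As a rough sketch of that authorship approach: the reviewer's byline on the card page links to their Google+ profile via rel=author, and the profile lists the site under "Contributor to". The profile URL and name below are placeholders.)

```html
<!-- Illustrative byline on the credit card review page; URL and name are placeholders -->
<p>
  Commentary and analysis by
  <a href="https://plus.google.com/112233445566778899000" rel="author">Jane Smith</a>
</p>
<!-- Authorship is only confirmed if the linked Google+ profile lists this site
     in its "Contributor to" section -->
```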
If you have individual product pages for each credit card, containing rates and an analysis, you could implement the star-rating rich snippet. Look at this Google search and scroll to the Aviva result to see what I mean.
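(A minimal sketch of that star-rating markup on an individual card page, assuming schema.org microdata and placeholder rating figures, could look like this:)

```html
<!-- Illustrative only: placeholder product name and rating figures -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Rewards Credit Card</span>
  <div itemprop="aggregateRating" itemscope itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.3</span> out of
    <span itemprop="bestRating">5</span> based on
    <span itemprop="reviewCount">127</span> customer reviews
  </div>
</div>
```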
Those are a couple of ways I can see rich snippets being used. So long as they don't appear to be 'forced' or manipulative, I'd say use as many as you can, as they can dramatically increase click-through rates. Hope this helps.
Related Questions
-
My product category pages are not being indexed on Google - can someone help?
My website has been indexed on Google and all of its pages can be found there except for the product category pages, which are where we want our traffic heading, so this is a big problem for us. Our website is www.skirtinguk.com and an example of a page that isn't being indexed is https://www.skirtinguk.com/product-category/mdf-skirting-board/
Intermediate & Advanced SEO | chelseaskirtinguk0 -
Inactive Products - Inactive URLs
Hi, On our website www.viatrading.com we have many products that might be in stock or not depending on availability. Until now, when a product was no longer available, we took its page down (and redirected it to its product category page), and only if the product became available again did we re-activate the URL - this might be days, months or even years later. To make this more SEO-friendly, we have now decided that while a product is not available, instead of deactivating/redirecting the page, we will leave it online and just add a message saying "This product is currently not available". If we do this, we will automatically re-activate about 500 product pages at once.
1. Just to make sure, is it harmful for SEO to keep activating/deactivating URLs this way?
2. Since most of these pages have been deindexed for a long time due to being redirected, have they lost all their SEO juice?
3. How can we best activate these old 500 pages - is it OK to activate them all at once?
Thank you,
Intermediate & Advanced SEO | viatrading11 -
The benefits of having a dedicated IP
Is this true? A claim by SiteGround: "Having a dedicated IP for each website is considered by some experts as an advantage for search engine optimization. There is a common belief that sites with dedicated IP addresses do better in the search engine results than those on shared IPs. Such sites do not share the risk of being banned for sharing the same IP in case another website hosted on the same server gets banned by a search engine."
Intermediate & Advanced SEO | JordanBrown0 -
When removing a product page from an ecommerce site?
What is the best practice for removing a product page from an e-commerce site, if a 301 is not available and the page has already been crawled by the search engine?
A. Block it in robots.txt
B. Let it 404
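(For what option A could look like in practice, a robots.txt rule along the lines below, with a placeholder path, blocks crawling of the retired page; note that blocking crawling does not by itself remove a URL that is already indexed.)

```
# Illustrative robots.txt entry - the path below is a placeholder
User-agent: *
Disallow: /product/discontinued-example-product/
```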
Intermediate & Advanced SEO | Bryan_Loconto0 -
Ecommerce Duplicate Product Descriptions across 3 websites
Hi, We are an e-commerce company that has our own domain but also sells the same products on eBay and Amazon. What is the feeling on the exact same descriptions being used on different platforms? Do they count as duplicate content? Will our domain be punished/penalised, as it does not have as much authority as eBay or Amazon? We have over 5,000 products with our own hand-written product descriptions. We want our website to be the main place / have priority over the above marketplaces. What's the best suggestion/solution? Thanks,
Intermediate & Advanced SEO | Roy19730 -
E-commerce site, one product multiple categories best practice
Hi there, We have an e-commerce shopping site with over 8,000 products and over 100 categories. Some sub-categories belong to multiple categories - for example, a Christmas tree can be under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees". The product itself (example: Scandinavian Xmas Tree) can naturally belong to both these categories as well. Naturally these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options:
1. Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path - a kind of cloaking.
2. Use the same URL and display only one "main" version of breadcrumbs and menus. Possibly add the other "not main" categories as links to the category / product page.
3. Use a different URL based on where we came from and do nothing (this will create essentially the same content on different URLs except breadcrumbs and menus - there's a possibility to change the category text and page title as well).
4. Use a different URL based on where we came from, with different menus and breadcrumbs, and use rel=canonical pointing to the "main" category / product pages.
This is a very interesting issue and I would love to hear what you guys think as we are finalizing plans for a new website and would like to get the most out of it. Thank you all!
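(As a sketch of option 4, the duplicate category path would carry a canonical tag pointing at the preferred URL; both paths below are placeholders.)

```html
<!-- On the secondary path, e.g. /gifts/holidays/christmas/trees/scandinavian-xmas-tree -->
<link rel="canonical"
      href="https://www.example.com/gardening/plants/trees/scandinavian-xmas-tree">
```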
Intermediate & Advanced SEO | arikbar0 -
Max # of Products / Links per Page on E-Commerce Site
We are getting ready to re-launch our e-commerce site and are trying to decide how many products to list per category page. Some of our category pages have upwards of 100 products. While I'd love to list ALL the products on the root category page (to reduce hassle for the customer, and to index more products on a higher-PR page), I'm a little worried about it being too long and containing too many on-page links. Would love some guidance on:
- Maximum number of internal links on a page
- Whether Google frowns on really long category pages
- Anything else I should be considering when making this decision
Thanks for your input!
Intermediate & Advanced SEO | AndrewY2 -
How to prevent Google from crawling our product filter?
Hi All, We have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter the available products. These filters are passed as HTTP GET arguments, and the number of possible filter URLs is virtually limitless. In order to prevent duplicate content, or an insane number of pages in the search indices, our software automatically adds noindex, nofollow and noarchive directives to these filter result pages. However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it. However, it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway. What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin
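(One commonly suggested direction, sketched below with placeholder parameter names, is to disallow the filter parameters in robots.txt. The trade-off is that Googlebot can no longer see the noindex directives on URLs it is blocked from crawling.)

```
# Illustrative robots.txt patterns - parameter names are placeholders
User-agent: *
Disallow: /*?size=
Disallow: /*&size=
Disallow: /*?colour=
Disallow: /*&colour=
```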
Intermediate & Advanced SEO | footsteps0