If my products aren't showing in rich snippets, is there still value in adding product schema?
-
I'm adding category pages for an online auction site and trying to determine if it's worth marking up the products listed on the page. All of the individual product pages have product schema, but I have never seen them show up in rich snippets, likely due to the absence of the price element and the unique nature of the items. Is there still value in adding the product schema even if the items won't show in rich snippets?
Also, is it possible the product schema will help optimize for commerce-related keywords such as [artist name] + "for sale"?
-
Yes. Google has said many times (too many instances to cite) that adding schema markup, whether via JSON-LD or the Data Highlighter, is one of the best ways you can spend your optimising time. You're making Google's life easier and telling it directly how it can enrich the experience for its searchers.
I have tested a number of different approaches. I have rich snippet plugins for review stars, and these work well: if I add the Google review widget to a page, the star rating for that product shows up almost immediately (we have around 230 Google reviews at 4.9).
Then I discovered the Data Highlighter and found that it works really well for individual products and for getting review text (the words, not just the stars) into the results, and it helps with all sorts of other things too. So each new page gets the full treatment from the Data Highlighter: local business, reviews, authors (if it's an article) and products. You need to include prices and availability. If these aren't on the page to highlight, use the 'add extra info' function in the Data Highlighter, state that the products are immediately available, and add a price. This can be a 'from' price or a range, but without a price you're unlikely to get anywhere, in my experience. Experiment with adding prices: if it's an auction, add the price as "zero" or "$0" and see whether you get featured.
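If you'd rather hand-code it than use the Data Highlighter, the Product markup looks roughly like this as JSON-LD. This is only a sketch, not a definitive template: the name, image, description and prices are all placeholders, and I'm assuming an auction lot where you can only give a 'from' price or a range (schema.org's AggregateOffer covers that case):

<!-- Illustrative Product markup with a price range; lowPrice "0" mirrors the "$0" experiment above. All values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Signed lithograph by Example Artist",
  "image": "https://www.example.com/lots/123/photo.jpg",
  "description": "Original signed lithograph, one of a kind, lot 123.",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "0",
    "highPrice": "500",
    "offerCount": "1",
    "availability": "https://schema.org/InStock"
  }
}
</script>

The key part is the offers block: that's where the price (even a zero or 'from' price) and the availability live, and that's exactly the information that seems to gate whether the rich result shows.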
It's a very responsive and fast system, so you don't have to wait long: add the markup, send in the spiders, and check back in a few hours. Lately Google has been taking longer to index, so it might take a day or so, but that's still a short enough timeframe to run tests. Also, if you're using Shopify or WordPress, you should be appearing with sitelinks, extra links in the SERP, and reviews specific to your product anyway, provided you're using H1s and H2s correctly and structuring the pages properly.
Then lastly there's adding the code directly to the page in the header or footer. I've found that this trumps everything. For example, if I state in the footer code that I've got 230 reviews, then even though my little widget (the one that refreshes my review count each day) shows more, the result stays stuck on 230. When I go in and update the code in the header, it updates within a day or so. So it seems that manually adding the code is the firmest and strongest (or most trustworthy) signal.
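For reference, the hand-coded rating markup I'm describing looks roughly like this (a sketch only; the business name is a placeholder, and the counts are the ones from my own site above). The drawback is exactly what I just described: ratingValue and reviewCount are hard-coded, so they only change when you edit them by hand:

<!-- Hard-coded rating markup: ratingValue and reviewCount must be edited manually to update. Business name is a placeholder. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business Ltd",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.9",
    "reviewCount": "230"
  }
}
</script>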
Also remember that Google has to trust your site before it will serve rich results, so newer sites, dodgy claims or anything other than whiter-than-white-hat tactics will leave you with nothing showing up. Google is always trying to verify markup, and if it catches you exaggerating or being inaccurate, it's going to cause problems. They've said this many times too; it's classed as 'markup spam'.
So get highlighting, or get yourself a code builder to generate some JSON-LD you can insert directly onto the product pages. I'm not a developer, but I do it with a tool called SEO Profiler, where you just type in the words and it turns them into JSON-LD for the site as if by magic, and you paste it in. There are free versions of this sort of tool linked from the schema.org site. Also check out Moz's guide to schema.
You will start showing up, just give it time. And mark up EVERYTHING you can. It's always worth it, but be comprehensive or it won't show, so add in a best-guess price and a best guess for anything else that's missing.