Product schema GSC Error 'offers, review, or aggregateRating should be specified'
-
I do not have a SKU, global identifier, rating, or offer for my product. Nonetheless, it is my product. The price is variable (it's insurance), so it would be inappropriate to provide a high or low price. Therefore, these items were not included in my product schema. The Structured Data Testing Tool showed two warnings, for the missing SKU and global identifier.
Google Search Console gave me an error today that said: 'offers, review, or aggregateRating should be specified'
I don't want to be dishonest by supplying any of these, but I also don't want my page demoted in the search results. BUT I DO want my item to show up as a product. Should I forget the product schema? Advice/suggestions?
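For reference, my markup currently looks roughly like this (a simplified sketch; the names and values here are placeholders, not my actual product):

```html
<!-- Simplified sketch: name, description and image are placeholder values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Insurance Product",
  "description": "A hypothetical insurance product with variable pricing.",
  "image": "https://www.example.com/images/example-product.png"
}
</script>
```

There's no sku, gtin, offers, review or aggregateRating anywhere, which is presumably what's triggering both the testing tool warnings and the new GSC error.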
Thanks in advance.
-
Really interested to see that others have been receiving this too; we have had this flagged on a couple of sites/accounts over the past month or two.
Basically, Google Search Console's schema error view is 'richer' than that of Google's stand-alone Structured Data Testing Tool, which has been left behind a bit in terms of changing standards. Quite often you can put the pages highlighted by GSC (Google Search Console) into the testing tool and they will show as having warnings only (no errors), yet GSC says there are errors (very confusing for a lot of people).
Let's look at an example:
- https://d.pr/i/xEqlJj.png (screenshot step 1)
- https://d.pr/i/tK9jVB.png (screenshot step 2)
- https://d.pr/i/dVriHh.png (screenshot step 3)
- https://d.pr/i/X60nRi.png (screenshot step 4)
... basically, the testing tool separates issues into two categories: errors and warnings.
But Google Search Console's view of schema errors is now richer and more advanced than that (so adhere to GSC's specs, not the testing tool's, if they ever contradict each other!)
What GSC is basically saying is this:
"Offers, review and aggregateRating are recommended only and usually cause a warning rather than an error if omitted. However, now we are taking a more complex view. If any one of these fields / properties is omitted, that's okay but one of the three MUST now be present - or it will change from an warning to an error. SO to be clear, if one or two of these is missing, it's not a big deal - but if all three are missing, to us at Google - the product no longer constitutes as a valid product"
So what are the implications of having schema which generates erroneous, invalid products in Google's eyes?
This was the key statement I found from Google:
Google have this document on the Merchant Center (all about Google Shopping paid activity): https://support.google.com/merchants/answer/6069143?hl=en-GB
They say: "Valid structured markup allows us to read your product data and enable two features: (1) Automatic item updates: Automatic item updates reduce the risk of account suspension and temporary item disapproval due to price and availability mismatches. (2) Google Sheets Merchant Center add-on: The Merchant Center add-on in Google Sheets can crawl your website and uses structured data to populate and update many attributes in your feed. Learn more about using Google sheets to submit your product data. Prevent temporary disapprovals due to mismatched price and availability information with automatic item updates. This tool allows Merchant Center to update your items based on the structured data on your website instead of using feed-based product data that may be out of date."
So basically, without 'valid' schema mark-up, your Google Shopping (paid) results are much more likely to be rejected, as Google's organic crawler passes data to Google Shopping through schema (and presumably will only do so if the schema is marked as non-erroneous). Since you haven't said anything about using Google Shopping (PLA - Product Listing Ads), this 'primary risk' is mostly mitigated.
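For anyone reading this who does run Shopping ads: the offers block is the part that automatic item updates read, and it looks roughly like this (a sketch only; the price, currency and availability values are placeholders):

```html
<!-- Sketch only: price, currency and availability are placeholder values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Physical Product",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```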
It's likely that without valid product schema, your products will not appear as 'product' results within Google's normal, organic results. As you know, product results occasionally make it into Google's normal results. I'm not sure whether this can be achieved without paying Google for a PLA (Product Listing Ad) for the product in question. If webmasters can occasionally achieve proper product listings in Google's SERPs without PLA, e.g. like this:
https://d.pr/i/XmXq6b.png (screenshot)
... then be assured that, if your products have schema errors, you're much less likely to get them listed in such a way for free. In the screenshot I just gave, they are clearly labelled as sponsored (meaning they were paid for), so I'm not sure how much of an issue this would be.
For product URLs which rank in Google's SERPs but do not render 'as' products:
https://d.pr/i/aW0sfD.png (screenshot)
... I don't think such results would be impacted as heavily. You'll see that even with plain-text/link results, you sometimes get schema embedded, like those aggregate product review ratings. Obviously, if the schema had errors, the richness of the SERP may be impacted (the little stars might disappear, for example).
Personally, I think this is going to be a tough one that we're all going to have to come together and solve collectively. Google are basically saying that if a product has no individual review they can read, no aggregate star rating from a collection of reviews, and no offer (a product must have at least one of these three things), then to Google it doesn't count as a product any more. That's how it is now; there's no arguing or getting away from it (though personally I think it's pretty steep, and they may even back-track on this one at some point, as it's relatively infeasible for most companies to adopt across thousands of products).
You could take the line of re-assigning all your products as services, but IMO that's a very bad idea. I think Google will cotton on to such 'clever' tricks pretty quickly and undo them all. A product is a product, and a service is a service (everyone knows that).
Plus, if your items are listed as services, they're no longer products and may not be eligible for some types of SERP deployment as a result.
The real question for me is, why is Google doing this?
I think it's because marketers and SEOs have known for a long time that any type of SERP injection (universal search results, e.g. video results, news results, product results injected into Google's 'normal' results) is more attractive to users, and because people 'just trust' Google, those results get a lot of clicks.
As such, PLA (Google Shopping) has been relatively saturated for some time now, and maybe Google feel that the quality of their product-based results has dropped in some way. It would make sense to pick two or three things that really define the contents of a trustworthy site that is being transparent with its user-base, and then to re-define 'what a product is' around those things.
In this way, Google will be able to reduce the number of PLA results, reduce the amount of 'noise' they are generating, and keep the extrusions (the nice product boxes in Google's SERPs) for the sites they feel really deserve them. You might say: if this could result in their PLA revenue decreasing, why do it? Seems crazy.
Not really, though, as Google make all their revenue from the ads they show. If it becomes widely known that Google's product-related search results suck, people will move away from Google (in fact, Google have often quoted Amazon as their leading competitor, not another search engine directly).
People don't want to search for website links any more. They want to search for 'things': bits of info that pop out (like how you can use Google as a calculator or dictionary now, if you type your queries correctly). They want to search for products, items, things that are useful to them.
IMO this is just another step towards that goal
Thank you for posting this question, as it's helped me get some of my own thoughts down on this matter.
-
I had a similar issue, as we offer SaaS solutions at various different prices.
I resolved the problem by changing the entity type from Product to Service; you then no longer need a SKU or other product-related properties.
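For illustration, the markup ends up looking something like this (a rough sketch; the service and provider names are placeholders, not our actual markup):

```html
<!-- Hypothetical sketch: serviceType and provider are placeholder values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "Example SaaS platform subscription",
  "provider": {
    "@type": "Organization",
    "name": "Example Company Ltd"
  }
}
</script>
```

Bear in mind the caveat above, though: Service markup isn't eligible for product rich results, so this only makes sense if what you sell genuinely is a service.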