Why Doesn't All Structured Data Show in Google Webmaster?
-
We have more than 80k products, each of them with data-vocabulary.org markup, but only 17k are being reported as having the markup in Google Webmaster (GW). If I run a page that GW isn't showing as having the structured data through the structured data testing tool (http://www.google.com/webmasters/tools/richsnippets), it passes. Any thoughts on why this would be happening? Is it because we should switch from data-vocabulary.org to schema.org?
- Example of page that GW is reporting that has structured data: https://www.etundra.com/restaurant-equipment/refrigeration/display-cases/coutnertop/vollrath-40862-36-inch-cubed-glass-refrigerated-display-cabinet/
- Example of page that isn't showing in GW as having structured data: https://www.etundra.com/kitchen-supplies/cutlery/sandwich-spreaders/mundial-w5688-4-and-half-4-and-half-sandwich-spreader/
-
I would strongly recommend using Schema Creator, Google's Data Highlighter, or microdatagenerator.com.
http://www.google.com/webmasters/tools/richsnippets?q=uploaded:8004fd1b864bd1586879eaf3856d1a1c
You need to get rid of the outdated data-vocabulary markup: http://data-vocabulary.org/product is outdated, while http://schema.org/product is up to date and works on everything. You want the markup to look like the example below. That format is understood by all the major search engines and can be generated with either schema-creator.org or the other tools listed below.
http://schema-creator.org/ and http://schema-creator.org/product.php work on Google, Bing & Yahoo
http://www.microdatagenerator.com/ works on Google, Bing & Yahoo
https://support.google.com/webmasters/answer/146750?hl=en (Data Highlighter) works on Google
http://googlewebmastercentral.blogspot.com/2012/12/introducing-data-highlighter-for-event.html
http://blog.raventools.com/an-seos-guide-to-schema-org/
http://www.productontology.org/
http://moz.com/blog/schema-examples
http://www.seerinteractive.com/blog/googles-data-highlighter-and-a-view-into-the-future-of-seo
http://moz.com/webinars/microformats-real-life-use-cases
http://searchengineland.com/see-entities-web-page-tools-help-194710
http://www.bing.com/webmaster/help/products-and-offers-markup-138d0977
Here is what the schema.org markup should look like for that product:

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">36" Cubed Glass Refrigerated Display Cabinet</span>
  <div itemprop="brand" itemscope itemtype="http://schema.org/Brand">
    Manufactured by: <span itemprop="name">Vollrath</span>
  </div>
  Model: <span itemprop="model">40862</span>
  <span itemprop="description">Product ID: sku:VOL40862</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">5</span> based on <span itemprop="reviewCount">5</span> reviews
  </div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">3,915.80</span> New
  </div>
</div>

The structured data testing tool then extracts these properties (type: http://schema.org/Product):
| name: | 36" Cubed Glass Refrigerated Display Cabinet |
| brand: | Vollrath |
| model: | 40862 |
| description: | Product ID: sku:VOL40862 |
| productid: | sku:VOL40862 |
| aggregaterating: | 5 based on 5 reviews |
| offers: | 3,915.80 (New) |
| image: | https://www.etundra.com/images/products/372x372/restaurant-equipment/refrigeration/display-cases/coutnertop/vollrath-40862-36-inch-cubed-glass-refrigerated-display-cabinet/vol40862-1.jpg |
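Webmaster Tools' reports can lag well behind the live markup, so it helps to spot-check pages yourself. Below is a minimal sketch (a hypothetical helper, not a Google tool) using Python's stdlib HTMLParser to pull itemprop values out of a product snippet so you can confirm the markup is actually present:

```python
# Sketch: extract itemprop name/value pairs from microdata markup.
# Assumes simple, non-nested itemprops; real pages need a full
# microdata parser, but this is enough for a quick sanity check.
from html.parser import HTMLParser

class ItempropParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self._current = None   # itemprop name we are currently inside
        self.props = {}        # itemprop -> text content

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemprop" in attrs:
            if "content" in attrs:
                # value carried in a content="" attribute (e.g. meta tags)
                self.props[attrs["itemprop"]] = attrs["content"]
            else:
                self._current = attrs["itemprop"]

    def handle_data(self, data):
        if self._current:
            self.props[self._current] = data.strip()
            self._current = None

doc = '''<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">36" Cubed Glass Refrigerated Display Cabinet</span>
  Model: <span itemprop="model">40862</span>
</div>'''

parser = ItempropParser()
parser.feed(doc)
print(parser.props)
```

Feeding it the fetched HTML of one of the 80k product pages would show immediately whether the properties the testing tool reports are present, independent of GW's delayed reporting.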
-
When was the last crawl report run? It could be that GWT has only crawled fewer than half of the pages since the sitemap was submitted, which would explain the result (80k is a lot to crawl). As for data-vocabulary vs. schema.org: schema.org is becoming the standard. Even though you are using data-vocabulary, the markup appears to be working according to the rich snippet testing tool. Schema.org also has more extensive vocabularies.
GWT is not perfect; there will be errors in the reporting. If you can see that it's working, more than likely it is.
Related Questions
-
Switching from HTTP to HTTPS and google webmaster
Hi, I've recently moved one of my sites, www.thegoldregister.co.uk, to HTTPS. I'm using WordPress and put a permanent 301 redirect in the .htaccess file to force HTTPS for all pages. I've updated the settings in Google Analytics to HTTPS for the original site. All seems to be working well. Regarding Google Webmaster Tools, what needs to be done? I'm very confused by the Google documentation on this subject around HTTPS. Does all my crawl data and indexing from the HTTP site still stand and get inherited by the HTTPS version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at the "change of address" option in the Webmaster settings, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all. I've also tried adding the HTTPS version to the console as well, but the HTTPS version is showing a severe warning: "is robots.txt blocking some important pages". I don't understand this error, as it's the same file as on the HTTP site, generated by All in One SEO Pack for WordPress (see below at the bottom). The warning is against line 5, saying it will ignore it. What I don't understand is that I don't get this error in the Webmaster console with the HTTP version, which is the same file. Any help and advice would be much appreciated. Kind regards, Steve User-agent: *
Technical SEO | | lqz
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
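Before assuming Google is misreading the file, you can check exactly what those quoted rules allow with Python's stdlib urllib.robotparser. A minimal sketch (it parses the rules shown above directly rather than fetching the live site):

```python
# Sketch: verify what the robots.txt rules above actually block.
import urllib.robotparser

rules = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The homepage is allowed; only the three listed paths are blocked.
print(rp.can_fetch("Googlebot", "https://www.thegoldregister.co.uk/"))           # True
print(rp.can_fetch("Googlebot", "https://www.thegoldregister.co.uk/wp-admin/"))  # False
```

If this parser says the homepage is allowed but Search Console still complains, the likelier culprits are a stale cached copy of robots.txt on Google's side or a different file being served on the HTTPS host.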
Why isn't our new site being indexed?
We built a new website for a client recently. Site: https://www.woofadvisor.com/ It's been live for three weeks. Robots.txt isn't blocking Googlebot or anything. Submitted a sitemap.xml through Webmasters but we still aren't being indexed. Anyone have any ideas?
Technical SEO | | RobbieD910 -
Sitemap errors have disappeared from my Google Webmaster tools
Hi all, A week ago I had 66 sitemap errors related to hreflang annotations in my GWT. Now all the errors are gone, and it shows no errors. We have not done any work to fix them. I wonder if anybody has experienced the same thing, of Google suddenly changing the criteria or the way they report errors in Google Webmaster Tools. I would appreciate any insights from the community! Best regards Peru
Technical SEO | | SMVSEO0 -
How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
Background: I recently launched a new site and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions. As suspected, sessions, users, and % new sessions are all down, which I'm okay with because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted. Lastly, the site was built using Squarespace and was launched in the middle of August. **Question:** When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages Submitted, but only 5 Indexed! The sitemap that's submitted seems to be all there. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas? Thanks!!
Technical SEO | | Nate_D0 -
Can't manage to get our site back in the rankings
My URL is: http://tinyurl.com/nslu78 Hi, I really hope someone can help, because my site seems to have been penalized since last year. We are doctors, not SEO experts, and wanted to do things in a white-hat way, so we handed our SEO strategy (on-site and off-site) to the best US SEO agencies, and now we are penalized. We were ranking on the 1st page for 15 keywords and now we aren't even in the first 10 pages. I know that our sector looks suspicious, but we are a real laboratory and our site is 100% transparent. I understand that a lot of people may think we are all the same, but this is not true. We are not a virtual company that doesn't even show its name or address; we show our name, address, phone number, fax, email, chat service, and VAT number, so please help us. We have spent 3 months analysing every paragraph of Google's guidelines to see if we were violating some rule such as hidden text, link schemes, redirections, keyword stuffing, malware, duplicate content, etc., and found nothing except little things, but maybe we are not good enough to find the problem. In 3 months we have gone from 85 toxic links to 24 and from 750 suspicious links to 300. We have emailed and called the webmasters of each site several times to try to get as many links as possible removed. We have sent Google a big Excel sheet with all our results and attempts to delete those bad links. We then sent a reconsideration request explaining everything we verified on-site and off-site, but it seems it didn't work, because we are still penalized. I really hope someone can see where the problem is.
Technical SEO | | andromedical
thank you0 -
Google insists robots.txt is blocking... but it isn't.
I recently launched a new website. During development, I'd enabled the option in WordPress to prevent search engines from indexing the site. When the site went public (over 24 hours ago), I cleared that option. At that point, I added a specific robots.txt file that only disallowed a couple directories of files. You can view the robots.txt at http://photogeardeals.com/robots.txt Google (via Webmaster tools) is insisting that my robots.txt file contains a "Disallow: /" on line 2 and that it's preventing Google from indexing the site and preventing me from submitting a sitemap. These errors are showing both in the sitemap section of Webmaster tools as well as the Blocked URLs section. Bing's webmaster tools are able to read the site and sitemap just fine. Any idea why Google insists I'm disallowing everything even after telling it to re-fetch?
Technical SEO | | ahockley0 -
Google Webmaster Site Performance
In Webmaster Tools, under Labs > Site Performance, Google provides your average page load time. When Google grades a page, does it use how long that specific page takes to load, or does it use the overall average page load time for the domain as provided in Labs > Site Performance?
Technical SEO | | Bucky0 -
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, looking clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double check that the links don't still exist, and they don't as I expected. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, why if the errors do not actually exist (and never actually have) do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The info is that if they keep getting 404 returns, it will automatically get removed. Well I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months?!! Thanks.
Technical SEO | | RiceMedia0
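If you want to confirm which of those "Not Found" URLs are genuinely dead before worrying about the report, a small script helps. A hedged sketch: the three-strikes threshold below is an assumption for illustration, not a documented Google rule, and `fetch_status` needs network access when you actually run it against your URLs.

```python
# Sketch: check URLs from a crawl-errors export and track consecutive
# 404 responses, so you can see which errors are genuinely dead links.
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code for url (requires network access)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def confirmed_dead(status_history, strikes=3):
    """True once a URL's last `strikes` checks were all 404s."""
    return len(status_history) >= strikes and all(
        s == 404 for s in status_history[-strikes:]
    )

# e.g. three checks in a row saw a 404 after an earlier 200:
print(confirmed_dead([200, 404, 404, 404]))  # True
print(confirmed_dead([404, 404]))            # False (not enough checks yet)
```

Running `fetch_status` periodically over the exported error URLs and feeding the results into `confirmed_dead` gives you your own evidence that the links are gone, independent of how long GWT takes to age the errors out.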