Schema Markup Errors - Priority or Not?
-
Greetings All...
I've been digging through Search Console on a few of my sites and have noticed quite a few structured data errors. Most of them relate to hCard, hEntry, and hAtom: the hEntry/hAtom errors report missing author and entry-title, while the hCard errors report a missing fn.
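For context, those field names come from the hAtom and hCard microformats that many blog/CMS themes output. A minimal sketch of a post template that includes the properties Search Console is flagging might look like this (all content below is placeholder, only the class names matter):

```html
<!-- hentry = the post container (hAtom); entry-title, updated, author are its required-ish fields.
     The author element is itself an hCard, which is where the fn (formatted name) lives. -->
<article class="hentry">
  <h1 class="entry-title">Example Post Title</h1>
  <time class="updated" datetime="2015-06-01">June 1, 2015</time>
  <address class="author vcard">
    <span class="fn">Jane Doe</span>
  </address>
  <div class="entry-content">
    <p>Post body goes here.</p>
  </div>
</article>
```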
I recently saw an article on SEL about Google's focus on spammy markup. The sites are built and managed by vendors, so I would need to impress upon them the impact of these errors and have them prioritize and fix them.
My question is whether this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have.
Any thoughts?
-
In that case I would say a phased approach would be fine. It is an issue that should be addressed, but I wouldn't classify it as "critical".
-
Thanks for the responses.
These changes will be managed by the developers, so they won't impact any other priorities that I have in the queue. My concern was whether this was pressing enough that I would have to push the developer toward a hard timeline vs. a more phased one. I took a look at other clients on this platform and saw the same errors, so this may be a platform-wide markup issue.
These are automotive clients, so their sites are OEM-mandated and I don't have any say in the types of markup they use. But my goal is to impress upon them that these errors could negatively impact my clients, if not now then in the future.
-
I think it depends on what else you have in the queue for your clients. Without knowing what other priorities you have, I can't say whether this is more or less important. However, as Patrick said, correct markup helps search engines understand more about the context of what's on the page. It probably won't make a big ranking impact, and I doubt you're going to get penalized for some markup errors as long as you're not trying to spam using markup.
I definitely prefer Schema.org markup over hCard, but rather than implementing it as inline microdata, you may want to consider JSON-LD: https://developers.google.com/schemas/formats/json-ld
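To illustrate the difference, here's a minimal sketch of the same post information expressed as Schema.org JSON-LD instead of microformat classes; the headline, date, and author are placeholders, and a real implementation would be populated from your CMS:

```html
<!-- JSON-LD lives in a script block, so it doesn't touch the visible HTML templates. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Example Post Title",
  "datePublished": "2015-06-01",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
</script>
```

One practical advantage of this approach is that it decouples the structured data from the page layout, which can make it easier for a vendor to roll out a fix platform-wide without reworking every theme template.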
-
Hi there
Yes, I would consider this a priority: you want the most up-to-date and relevant markup on your site so that crawlers can better understand your important information and attribute the right properties to your site/content.
I would also focus on Schema.org, as it's recognized by the major search engines (Google, Bing, Yahoo, and Yandex), so you are pleasing multiple crawlers at once. One study also found that pages with this markup tend to rank an average of four positions higher than pages without it (a correlation, not a guarantee), so it is definitely worth the investment of time.
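Since these are dealership sites, a rough sketch of what Schema.org markup for the business itself could look like in JSON-LD is below; the AutoDealer type is the relevant one, and every value shown is a made-up placeholder for illustration:

```html
<!-- Hypothetical dealership example; swap in the client's real NAP details. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "AutoDealer",
  "name": "Example Motors",
  "url": "http://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```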
Hope this helps! Good luck!