Schema Markup Errors - Priority or Not?
-
Greetings All...
I've been digging through Search Console on a few of my sites and I've been noticing quite a few structured data errors. Most of the errors relate to the hCard, hEntry, and hAtom microformats: the hEntry/hAtom items are missing author and entry-title, while the hCard items are missing fn.
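For reference, those properties belong to the hAtom/hCard microformats, so the flagged templates are presumably missing class names along these lines (a hypothetical minimal sketch, not our actual template markup):

```html
<!-- Hypothetical hAtom entry showing the properties Search Console flags:
     entry-title and author on the hentry, and fn inside the nested hCard.
     All content here is placeholder text. -->
<article class="hentry">
  <h1 class="entry-title">Example Post Title</h1>
  <time class="published" datetime="2016-05-01">May 1, 2016</time>
  <div class="author vcard">
    By <span class="fn">Jane Doe</span>
  </div>
  <div class="entry-content">
    <p>Post body goes here.</p>
  </div>
</article>
```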
I recently saw an article on SEL about Google's focus on spammy markup. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize and fix them.
My question is whether this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have.
Any thoughts?
-
In that case I would say a phased approach would be fine. It is an issue that should be addressed, but I wouldn't classify it as "critical".
-
Thanks for the responses.
These changes will be managed by the developers, so they won't impact any other priorities I have in the queue. My concern was whether this was pressing enough that I would have to push the developers toward a hard timeline versus a more phased one. I took a look at other clients on this platform and saw the same errors, so this may be a platform-wide markup error.
These are automotive clients, so their sites are OEM-mandated and I don't have any say in the types of markup they use. But my goal is to impress upon the vendors that these errors could negatively impact my clients (if not now, then in the future).
-
I think it depends on what else you have in the queue for your clients. Without knowing what other priorities you have, I can't say whether this is more or less important. However, as Patrick said, correct markup helps search engines understand more about the context of what's on the page. It probably won't make a big ranking impact, and I doubt you're going to get penalized for some markup errors as long as you're not trying to spam using markup.
I definitely prefer Schema.org markup over hCard, but rather than implementing it as inline markup per the Schema.org examples, you may want to consider JSON-LD: https://developers.google.com/schemas/formats/json-ld
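As a rough illustration (placeholder values only, and AutoDealer is just one schema.org type that might fit an automotive site), a JSON-LD block sits in a single script tag rather than being woven through the visible HTML:

```html
<!-- Hypothetical example: the type and all values are placeholders,
     not a recommendation for any specific page. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "AutoDealer",
  "name": "Example Motors",
  "url": "http://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```

Because it's self-contained, JSON-LD can usually be added or corrected without touching the page templates themselves, which may make it an easier request to put to the vendor.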
-
Hi there
Yes, I would consider this a priority, as you want to have the most up-to-date and relevant markup on your site so that crawlers can better understand your important information and attribute the right properties to your site and content.
I would also focus on Schema.org markup, as it's recognized by the major search engines (Google, Bing, Yahoo, and Yandex), so you are satisfying multiple crawlers at once. Studies have reported that pages with this markup tend to rank several positions higher, on average, than pages without it, so it is definitely worth the investment of time.
Hope this helps! Good luck!
Related Questions
-
Errors In Search Console
Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from mid-first page to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page). When I was going through everything I went to Search Console, and in crawl errors there are 2 errors that showed up as detected 3 days before the drop. These are: wp-admin/admin-ajax.php showing as response code 400, and xmlrpc.php showing as response code 405. The robots.txt is as follows:
user-agent: *
disallow: /wp-admin/
allow: /wp-admin/admin-ajax.php
Any help with what is wrong here and how to fix it would be greatly appreciated. Many Thanks
Technical SEO | DaleZon
-
Size of image for article Schema
Hi, I implemented schema markup for an article and it all tested fine, and I can see it being fired in preview mode of Google Tag Manager. But when I run the URL it's applied to through the Google Structured Data Testing Tool, it is not appearing. I have now read that the image needs to be a certain size. For AMP articles this appears to be 1200 pixels wide: http://www.thesempost.com/google-changes-image-size-requirements-amp-articles/ But what about non-AMP articles? Does the image need to be that big too?
Technical SEO | AL123al
-
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has issued a lot (several hundred) of duplicate content and 404 error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me. Additionally, a lot of them are static pages where we embed images of size charts that we use as popups on item pages. It says these issues are high priority, but how bad is this? Is this just an issue because, if pages have similar content, the search engine spider won't know which one to index? Also, what is the best way to handle the URLs returning 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | AliMac26
-
Exclude price in rich snippet markup
Our site has its prices hidden for non-logged-in users. It's a WooCommerce-built site, and the rich snippet markup is added by WooCommerce. I would like to remove the markup for the price because: 1) we would like our customers to register for prices, and 2) I don't want to get penalized for not showing the same thing to visitors as to Google. Any help or thoughts on this one? Thanks / Jonas
Technical SEO | knubbz
-
404 errors
Hi, I am getting these showing up in WMT crawl errors; any help would be very much appreciated:
| # | URL | Response code | Date |
| 1 | ?escaped_fragment=Meditation-find-peace-within/csso/55991bd90cf2efdf74ec3f60 | 404 | 12/5/15 |
| 2 | mobile/?escaped_fragment= | 404 | 10/26/15 |
| 3 | ?escaped_fragment=Tips-for-a-balanced-lifestyle/csso/1 | 404 | 12/1/15 |
| 4 | ?escaped_fragment=My-favorite-yoga-spot/csso/5598e2130cf2585ebcde3b9a | 404 | 12/1/15 |
| 5 | ?escaped_fragment=blog/c19s6 | 404 | 11/29/15 |
| 6 | ?escaped_fragment=blog/c19s6/Tag/yoga | 404 | 11/30/15 |
| 7 | ?escaped_fragment=Inhale-exhale-and-once-again/csso/2 | 404 | 11/27/15 |
| 8 | ?escaped_fragment=classes/covl | 404 | 10/29/15 |
| 9 | m/?escaped_fragment= | 404 | 10/26/15 |
| 10 | ?escaped_fragment=blog/c19s6/Page/1 | 404 | 11/30/15 |
Technical SEO | ReSEOlve
-
Expired domain 404 crawl error
I recently purchased an expired domain at auction, and after I started my new site on it, I am noticing 500+ "not found" errors in Google Webmaster Tools, which are generated by the previous owner's content. Should I use a redirection plugin to redirect those non-existent posts to new post(s) on my site, or should I use a 301 redirect? Or should I leave them as they are without taking further action? Please advise.
Technical SEO | Taswirh
-
429 Errors?
I have over 500,000 429 errors in webmaster tools. Do I need to be concerned about these errors?
Technical SEO | TheKrazyCouponLady
-
Schema for Price Comparison Services - Good or Bad?
Hey guys, I was just wondering what the whole Schema.org markup means for people who run search engines (i.e. for a niche or for certain products) or price comparison engines in general. The intent behind Schema.org was to help the engines better understand a page's content. Well, I guess such services don't necessarily want Google to understand that they're just another search engine (and thus might get thrown out of the index for polluting it with search result pages). I see two possible scenarios: either don't implement the markup, or implement it in a way that makes the site not look like an aggregator, i.e. by only marking up certain products with unique text. Any thoughts? Does the SEOmoz team have any advice on that?
Best,
schuon
Technical SEO | derderko