Schema Markup Errors - Priority or Not?
-
Greetings All...
I've been digging through Search Console on a few of my sites and I've been noticing quite a few structured data errors. Most of the errors relate to hCard, hentry, and hAtom markup: the hentry items are mostly missing author and entry-title, while the hCard items are missing fn.
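(For reference, this is roughly what an hAtom/hCard block with all of those fields present looks like - the markup below is an illustrative sketch, not our actual template:)

```html
<!-- Illustrative only: an hentry carrying the fields the validator flags -->
<article class="hentry">
  <h2 class="entry-title">Example Post Title</h2>
  <time class="published updated" datetime="2014-06-01">June 1, 2014</time>
  <div class="entry-content">Post body goes here.</div>
  <span class="author vcard">
    <!-- fn is the hCard "formatted name" property -->
    <span class="fn">Example Author</span>
  </span>
</article>
```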
I recently saw an article on SEL about Google's focus on spammy markup. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and get them to prioritize and fix them.
My question is whether this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have.
Any thoughts?
-
In that case I would say a phased approach would be fine. It is an issue that should be addressed, but I wouldn't classify it as "critical".
-
Thanks for the responses.
These changes will be managed by the developers, so they won't impact any other priorities I have in the queue. My concern was whether this was pressing enough that I should push the developer toward a hard timeline rather than a phased one. I took a look at other clients on this platform and saw the same errors, so this may be a platform-wide markup issue.
These are automotive clients, so their sites are OEM-mandated and I don't have any say in the types of markup they use. But my goal is to impress upon them that these errors could negatively impact my clients (if not now, then in the future).
-
I think it depends on what else you have in the queue for your clients. Without knowing what other priorities you have, I can't say whether this is more or less important. However, as Patrick said, correct markup helps search engines understand more about the context of what's on the page. It probably won't make a big ranking impact, and I doubt you're going to get penalized for some markup errors as long as you're not trying to spam using markup.
I definitely prefer Schema.org markup over hCard, but rather than implementing it as inline microdata you may want to consider the JSON-LD format: https://developers.google.com/schemas/formats/json-ld
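For example, a minimal JSON-LD sketch for a blog post (all values below are placeholders) could look like this:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Example Post Title",
  "author": {
    "@type": "Person",
    "name": "Example Author"
  },
  "datePublished": "2014-06-01"
}
</script>
```

Because JSON-LD lives in its own script block, the vendors can add it without touching the visible HTML of the templates, which may make it an easier sell on an OEM-mandated platform.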
-
Hi there
Yes, I would consider this a priority, as you want the most up-to-date and relevant markup on your site so that crawlers can better understand your important information and attribute the right properties to your site/content.
I would also focus on Schema.org, as it's recognized by the major search engines (Google, Bing, Yahoo, and Yandex), so you are pleasing multiple crawlers at once. Studies have also reported that pages with this markup tend to rank around four positions higher than pages without, so it is definitely worth the investment of time.
Hope this helps! Good luck!
Related Questions
-
404 Errors for Form-Generated Pages - noindex, nofollow or 301 redirect
Hi there, I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed by using the noindex tag, and I wondered if I should take the same approach for the following pages. I have seen a huge increase in 404 errors since the new site structure and forms being filled in. This is because every time a form is filled in it generates a new page, which only Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling pages like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y
1. Implement a 301 redirect using rules, which will mean that all these pages redirect to the homepage. Whilst in theory this will protect any linked-to pages, it does not resolve the issue of why GSC is recording 404s in the first place. It could also come across to Google as 100,000+ redirected links, which might look spammy.
2. Place a noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed.
3. Block them in robots.txt - this will prevent any 'result' pages being crawled, which will improve the crawl time currently being taken up. However, I'm not entirely sure if the block is possible, as I would need to block anything after mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible? (See the sketch below.)
The noindex tag will take time to set up, as it needs to be scheduled in with the development team, but robots.txt will be a quicker fix as this can be done in GSC. I really appreciate any feedback on this one. Many thanks
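Something like this robots.txt sketch is what I have in mind - Disallow rules match from the start of the URL path, so a prefix rule should cover every query-string variant:

```
User-agent: *
# Matches /webapp/wcs/stores/servlet/TopCategoriesDisplay and anything
# after it, including every ?langId=...&storeId=... variant
Disallow: /webapp/wcs/stores/servlet/TopCategoriesDisplay
```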
Technical SEO | Ric_McHale
-
Dealing with 410 Errors in Google Webmaster Tools
Hey there! (Background) We are doing a content audit on a site with 1,000s of articles, some going back to the early 2000s. There is some content that was duplicated from other sites, has no external links pointing to it, and gets little or no traffic. As we weed these out we set them to 410, to let the Goog know that this is not an error - we are getting rid of them on purpose, and the Goog should too. As expected, we now see the 410 errors in the Crawl report in Google Webmaster Tools. (Question) I have been going through and "Marking as Fixed" in GWT to clear these out of my console, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 sense I intended, but I am betting Google means "fixed" as in they return a 200 again (if that makes sense). Any opinions on the best way to handle this? Thx!
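(For context, we serve the 410s with Apache rules along these lines - the paths below are hypothetical:)

```apache
# Return 410 Gone for an individual retired article (hypothetical path)
Redirect gone /2003/05/duplicated-article/
# Or retire a whole section at once
RedirectMatch gone ^/old-archive/
```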
Technical SEO | CleverPhD
-
404 Error
Hello, SEOmoz flagged a URL as having a 404 client error. The link doesn't return a proper content page because the URL name was changed. What should we do? Will this error disappear when Google indexes our site again, or is there some way to manually eliminate it? Thanks!
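(If it helps, I gather the usual fix for a renamed URL is a one-line 301 from the old path to the new one - a hypothetical Apache sketch:)

```apache
# Point the old URL at its renamed replacement (hypothetical paths)
Redirect 301 /old-page-name /new-page-name
```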
Technical SEO | OTSEO
-
Help with Bing redirection error
Can somebody help me figure out this Bing redirect error? The link to "http://w******/flea-control" has resulted in HTTP redirection to "http://w******/feas/flea-control/". Search engines can only pass page rankings and other relevant data through a single redirection hop, so using unnecessary redirects can have a negative impact on page ranking. I am using WordPress, and I am actually linking to the /feas/flea-control/ version. I have looked everywhere for help. I got this error using Bing's SEO software.
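(A single-hop rule pointing the old URL straight at the final destination is what I think I need - a hedged .htaccess sketch using the paths from the error:)

```apache
# Send the short URL to its final destination in one 301 hop
Redirect 301 /flea-control /feas/flea-control/
```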
Technical SEO | OxzenMedia
-
Implementing Schema within Existing CSS tags
In implementing Schema on a site using CSS and containing existing SS tags, I want to be sure that we are (#1) using the tags effectively within a product detail template and (#2) not actually harming ourselves by telling Google that all products are named or described by the SS tag rather than by the actual product name or description (which obviously could be disastrous). For example, the template currently outputs the product name with <ss:value source="$product.name"></ss:value> and the description with <ss:value source="$product.description"></ss:value>. Basically, is Schema at the point where the SS tag will be replaced (in the eyes of the search engines) with the actual text and not the tag itself?
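(What I'm hoping works is the usual pattern of wrapping the template tag in an element that carries the schema.org itemprop, so the property applies to the rendered text - a hypothetical sketch:)

```html
<div itemscope itemtype="http://schema.org/Product">
  <!-- Search engines see the rendered HTML, where the SS tag has already
       been replaced by the real product name and description -->
  <span itemprop="name"><ss:value source="$product.name"></ss:value></span>
  <span itemprop="description"><ss:value source="$product.description"></ss:value></span>
</div>
```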
Technical SEO | TechMama
-
Will errors on a subdomain affect the overall health of the root domain?
As stated in the question, we have two subdomains that contain over 2,000 reported errors from SEOmoz. The root domain has a clean bill of health, and I was just wondering if these errors on the subdomains could have a negative effect on the root domain in the eyes of Google. Your comments will be appreciated. Regards, Greg
Technical SEO | AndreVanKets
-
404 Errors in Google Webmaster Tools
Hello, Google Webmaster Tools is reporting our URLs as 404 errors: http://www.celebritynetworth.com/watch/D5GrrPEN9Oc/tom-mccarthy-floating/ When we enter the URL into a browser, the page loads just fine. Is there a way to determine why Google Webmaster Tools returns a 404 error when the link loads perfectly well in a browser? Thanks, Alex
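(One check I've seen suggested is to fetch just the response headers with curl, with and without a Googlebot user agent, to see the status code Google actually gets:)

```
# Fetch only the headers; compare the status code a plain request gets
# with one sent using a Googlebot user-agent string
curl -I "http://www.celebritynetworth.com/watch/D5GrrPEN9Oc/tom-mccarthy-floating/"
curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" "http://www.celebritynetworth.com/watch/D5GrrPEN9Oc/tom-mccarthy-floating/"
```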
Technical SEO | Anti-Alex
-
Crawl Errors and Duplicate Content
SEOmoz's crawl tool is telling me that I have duplicate content at "www.mydomain.com/pricing" and at "www.mydomain.com/pricing.aspx". Do you think this is just a glitch in the crawl tool (because obviously these two URLs are the same page rather than two separate ones), or do you think this is actually an error I need to worry about? If so, how do I fix it?
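(If it's not a glitch, I assume the fix would be a rel=canonical on both URLs pointing at the preferred one - a sketch:)

```html
<!-- Placed in the <head> of both /pricing and /pricing.aspx -->
<link rel="canonical" href="http://www.mydomain.com/pricing" />
```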
Technical SEO | MyNet