Schema Markup Errors - Priority or Not?
-
Greetings All...
I've been digging through Search Console on a few of my sites, and I've noticed quite a few structured data errors. Most of the errors are related to hCard, hEntry, and hAtom. Most of them are missing author and entry-title, while the rest are missing fn.
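For context, the markup these errors refer to looks roughly like the sketch below - a generic hAtom entry with a nested hCard, not the actual vendor templates. The entry-title and author classes are what the hentry/hatom errors are asking for, and fn is what the hcard error wants.

```html
<!-- Rough illustration only: the surrounding structure is hypothetical. -->
<div class="hentry">
  <!-- entry-title satisfies the hentry/hatom "missing entry-title" error -->
  <h2 class="entry-title">Example Post Title</h2>
  <!-- author must itself be an hCard (class "vcard") containing an fn -->
  <span class="author vcard">
    <span class="fn">Jane Doe</span>
  </span>
  <div class="entry-content">
    <p>Post body goes here.</p>
  </div>
</div>
```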
I recently saw an article on SEL about Google's focus on spammy markup. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize and fix them.
My question is whether or not this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have.
Any thoughts?
-
In that case I would say a phased approach would be fine. It is an issue that should be addressed, but I wouldn't classify it as "critical".
-
Thanks for the responses.
These changes will be managed by the developers, so they won't impact any other priorities I have in the queue. My concern was whether this was pressing enough that I would have to push the developer toward a hard timeline vs. a more phased one. I took a look at other clients on this platform and saw the same errors, so this may be a platform-wide markup error.
These are automotive clients, so their sites are OEM-mandated. Therefore, I don't have any say in the types of markup they use. But my goal is to impress upon them that these errors could negatively impact my clients (if not now, then in the future).
-
I think it depends on what else you have in the queue for your clients. Without knowing what other priorities you have, I can't say whether this is more or less important. However, as Patrick said, correct markup helps search engines understand more about the context of what's on the page. It probably won't make a big ranking impact, and I doubt you're going to get penalized for some markup errors as long as you're not trying to spam using markup.
I definitely prefer Schema.org markup over hCard, but rather than implementing it as inline microdata, you may want to consider JSON-LD - https://developers.google.com/schemas/formats/json-ld .
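For example, a minimal sketch of what a JSON-LD block could look like for a post, placed in the page head - all names, dates, and values below are placeholders, not taken from the sites in question:

```html
<!-- Illustrative only: a Schema.org Article expressed as JSON-LD. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Post Title",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2015-01-15",
  "publisher": {
    "@type": "Organization",
    "name": "Example Dealership"
  }
}
</script>
```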
-
Hi there
Yes, I would consider this a priority, as you want to have the most up-to-date and relevant markup on your site so that crawlers can better understand your important information and attribute the right properties to your site/content.
I would also focus on Schema.org, as it's recognized by major search engines like Google, Bing, Yahoo, and Yandex, so you are pleasing multiple crawlers at once. Studies have also suggested that pages with this markup tend to rank around four positions higher than pages without, so it is definitely worth the investment of time.
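Since the sites in question are automotive dealerships, here is a hypothetical sketch of the kind of Schema.org markup that could apply, using the AutoDealer type with placeholder values only:

```html
<!-- Hypothetical example: a Schema.org AutoDealer (a LocalBusiness subtype) as JSON-LD.
     Every value below is a placeholder. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "AutoDealer",
  "name": "Example Motors",
  "url": "https://www.example.com/",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```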
Hope this helps! Good luck!
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site. I also removed several ancient sitemaps that listed content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky
-
Schema query
Hello All, I have implemented schema on my product pages. On each product page there is a "Popular Products" section in the left column listing five popular products. When I visit the ABCD product page, that same ABCD product can also appear among the five products in the "Popular Products" section. When I check the structured data testing tool, the following details are available for the main product: @type, @id, image, name, url, sku, category, description, offers (with all offer details), and brand (with all brand details). Then, under "isRelatedTo", the five products have: @type, @id, image, name, url, and offers, and that's it. So my query is: is this considered duplicate markup, or is it no issue at all with Google? Thanks!
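For illustration, the pattern described above would look roughly like this in JSON-LD - a main Product with full details and an isRelatedTo product carrying only a subset of properties. All values are placeholders, and the actual implementation may use microdata instead:

```html
<!-- Rough sketch of the pattern described above, with placeholder values. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "ABCD Product",
  "sku": "ABCD-001",
  "description": "Main product with full details.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  },
  "isRelatedTo": [
    {
      "@type": "Product",
      "name": "Popular Product 1",
      "url": "https://www.example.com/popular-product-1"
    }
  ]
}
</script>
```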
Technical SEO | wright335
-
Breadcrumb Trail Markup
I've implemented breadcrumb trail markup on my site and I wanted to check that it's implemented correctly. Is it true that the breadcrumb should not link to the current page, as this would be redundant, and should only link to 'ancestor' pages? (You can see, for example, that the breadcrumb on this page links to the two levels below the home page): http://www.cobaltrecruitment.com/sectors/construction-property-engineering/real-estate-general-practice So basically the breadcrumb trail should have a simple link for each level (except the current page). Thanks,
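For reference, a minimal sketch of that pattern using Schema.org BreadcrumbList in JSON-LD, with only the ancestor levels linked - the names and URLs below are placeholders, not the actual site's:

```html
<!-- Illustrative only: a two-level BreadcrumbList where only ancestor pages are linked;
     the current page is simply omitted. All names and URLs are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Sectors",
      "item": "https://www.example.com/sectors"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Construction & Property",
      "item": "https://www.example.com/sectors/construction-property"
    }
  ]
}
</script>
```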
Technical SEO | the-gate-films
-
Is Schema markup important for SEO?
I have recently come across Schema markup and have researched what it's all about. In short, it helps search engines identify all the elements of a page. Our competitors have this implemented, so is it important for SEO? We have an e-commerce store.
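As a concrete illustration for an e-commerce store, here is a hedged sketch of Schema.org Product markup of the kind that can make a product page eligible for rich results - placeholder values only:

```html
<!-- Hypothetical example: Product markup with price and rating data, the kind of
     structured data an e-commerce page typically exposes. Placeholder values only. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/images/widget.jpg",
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>
```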
Technical SEO | Jseddon92
-
GWT giving me 404 errors based on old and deleted site map
I'm getting a bunch of 404 crawl errors in my Google Webmaster Tools because we just moved our site to a new platform with a new URL structure. We 301 redirected all the relevant pages. We submitted a new sitemap and then deleted all the sitemaps for the old URL structure. However, Google keeps crawling the old URLs and reporting back the 404 errors. It says that the website is linking to these 404 pages via an old, outdated sitemap (which, if you go to it, shows a 404 as well, so it's not as if Google is reading these old sitemaps now). Instead, it's as if Google has cached the old sitemap but continues to use it to crawl these non-existent pages. Any thoughts?
Technical SEO | Santaur
-
Spike in server errors
Hi, we've recently changed shopping cart platforms. In doing so, a lot of our URLs changed, but I 301'ed all of the significant landing pages (as determined by G Analytics) prior to the switch. However, WMT is now warning me about a spike in server errors for all the pages that no longer exist. Google is only crawling them because they used to exist, or are linked from pages that used to exist, and they no longer actually exist. Is this something I should worry about, or should I let it run its course?
Technical SEO | absoauto
-
Error on Magento database 301 bulk update
Hi all, One of my clients has a Magento website, and I recently received 404 errors for about 600 links in GWT. I tried to set up 301 redirects via a bulk upload, but I get errors. It's Magento 1.7, and I have the following columns in the CSV file (first sample row included):

| url_rewrite_id | store_id | id_path | request_path | target_path | is_system | options | description | category_id | product_id |
| 125463 | 1 | 22342342_54335 | old_link | new_link | 0 | RP | NULL | NULL | NULL |

The error message I receive is below. I was wondering if anyone has tried this before and knows how to fix it. Manual redirection works fine, but this first batch of 600 errors is probably just a start; I'll be getting more 404 errors soon, so I need to figure out how to fix this. I'd appreciate it if anyone with experience on this could guide me through it. Thanks in advance. Here is the error:

SQL query: INSERT INTO 'mgn_core_url_rewrite' VALUES ( 'url_rewrite_id', 'store_id', 'id_path', 'request_path', 'target_path', 'is_system', 'options', 'description', 'category_id', 'product_id' )

MySQL said: #1452 - Cannot add or update a child row: a foreign key constraint fails ('ayb_mgn2'.'mgn_core_url_rewrite', CONSTRAINT 'FK_101C92B9EEB71CACE176D24D46653EBA' FOREIGN KEY ('category_id') REFERENCES 'mgn_catalog_category_entity' ('entity_id') ON DELETE CASCADE ON)

Technical SEO | sedamiran
-
Bogus Crawl Errors in Webmaster Tools?
I am suddenly seeing a ton of crawl errors in Webmaster Tools. Almost all of them are URL links coming from scraper sites that I do not own. Do you see these in your Webmaster Tools account? Do you mark them as "fixed" if they are on a scraper site? There are waaaay too many of these to make redirects. Thanks!
Technical SEO | EGOL