Penalty for Mixing Microdata with Metadata
-
The folks who built our website have insisted on including both microdata and metadata on our pages.
What we end up with is something that looks like this in the header:
itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
It seems to me that this would be a bad thing; however, I can't find any info leaning one way or the other.
Can anyone provide insight on this?
-
Worth noting that the meta description isn't one of those three markup styles; it's a different thing completely, so you aren't actually mixing schema formats in your example.
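For example, here's a plain meta description sitting alongside schema.org microdata on the same page; the itemprop in your snippet just adds a microdata property to an otherwise ordinary meta tag (simplified, made-up markup):

    <head>
      <!-- Plain metadata: used for search snippets, not part of any structured-data vocabulary -->
      <meta name="description" content="Come buy your shoes from us.">
    </head>
    <body itemscope itemtype="https://schema.org/LocalBusiness">
      <!-- Schema.org microdata: itemprop attaches a property to the enclosing item -->
      <span itemprop="name">Example Shoe Store</span>
      <meta itemprop="description" content="Come buy your shoes from us.">
    </body>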
-
Thanks for sharing that link. That post is very informative.
-
Thanks for answering so quickly.
When I said "bad thing" I meant that I don't see how such redundancy could ever be beneficial.
Thank you for your thoughts.
-
I would read this post for more information: http://www.seomoz.org/blog/schema-examples
The post discusses how Google used to support three different styles of markup (microdata, microformats, and RDFa) but, with the creation of Schema.org, decided to use only that going forward. Any websites with existing markup would be okay, though.
Google also mentioned (noted in the article above) that you should avoid mixing different markup formats on the same page, as it can confuse their parsers.
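For illustration, the kind of mixing Google warned about would be marking up the same sort of data in two different syntaxes on one page, e.g. microdata and RDFa (hypothetical markup):

    <!-- One product in microdata... -->
    <div itemscope itemtype="https://schema.org/Product">
      <span itemprop="name">Running Shoe</span>
    </div>
    <!-- ...and another in RDFa on the same page: this is the combination to avoid -->
    <div vocab="https://schema.org/" typeof="Product">
      <span property="name">Walking Shoe</span>
    </div>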
-
Why do you think this would be a bad thing? I'd question how much benefit will be gained in most areas by doing this, but I can't see it causing harm, and it's good to get in there now rather than adding it later (assuming you've backed the right format!).
Related Questions
-
Review Microdata if the product does not have a review
We have many products that do not have reviews, and they are causing warnings in Google's microdata testing. Is there anything you can do for products without a review so you don't get errors for aggregateRating or review?
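For reference, our product markup looks roughly like this (simplified, product names made up); the warnings appear because there is no review or aggregateRating property:

    <div itemscope itemtype="https://schema.org/Product">
      <span itemprop="name">Example Widget</span>
      <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
        <span itemprop="price" content="19.99">$19.99</span>
        <meta itemprop="priceCurrency" content="USD">
      </div>
      <!-- no itemprop="review" or itemprop="aggregateRating" -->
    </div>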
Algorithm Updates | jforthofer
-
Do we have any risk or penalty for double canonicals?
Hi all, we have double canonicals: from page A to page B to page C. Will this be okay with Google, or do we definitely need to make it A to C and B to C? Thanks
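For example (hypothetical URLs), this is the chained setup versus the direct one:

    <!-- Chained: page A points to B, and page B's canonical then points to C -->
    <link rel="canonical" href="https://example.com/page-b">
    <!-- Direct alternative: pages A and B both point straight to C -->
    <link rel="canonical" href="https://example.com/page-c">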
Algorithm Updates | vtmoz
-
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz community because we're completely stumped. I actually work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced:

We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword targeted (obviously not in a spammy way), and on-site content, such as services pages, which were quite thin, was enriched with more user-focused content. Interesting, shareable content was starting to be created, and some basic outreach work had started.

Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, a good range at different levels (mostly sitting around page 3-4) depending on competition. Local keywords, particularly, were doing well, with a good number sitting on page 1-2. The keywords were starting to deliver a gentle stream of relevant traffic, and user behaviour on-site looked good.

Then, as of the 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They literally lost all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for super-low-competition, non-business terms they were ranking for.

So, there's the problem. The keywords have not shown any sign of recovery at all yet, and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it. There are no messages or warnings in GWT. The link profile is small but high quality. When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and it seems far too severe. The site is technically sound. There are no duplicate content issues or plagiarised content. The site is being indexed fine. Moz gives the site a spam score of 1 (out of 11, I think that's right). The site is on an OK server, which hasn't been blacklisted or anything.

We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
Schema.org Microdata or Microformats - Which Should We Use
Hi all, I'm wondering which would be the better alternative: schema.org microdata or microformats. I am aware that search engines such as Google, Yahoo, and Bing recognize Schema.org as the standard. The question is, will it have any negative effect? Our web developer here says that schema.org microdata may result in invalid HTML. I don't think that it will affect our SEO, but I guess that's also something to shed some light on. So, what's the consensus here: should we implement schema.org or go with microformats, or does it really make any difference?
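For context, the two formats express the same data quite differently (simplified, made-up example): microdata adds new attributes, while microformats reuse plain class names:

    <!-- Schema.org microdata: itemscope/itemtype/itemprop attributes -->
    <div itemscope itemtype="https://schema.org/Organization">
      <span itemprop="name">Example Co</span>
    </div>
    <!-- Microformats (hCard): ordinary class names, valid in any HTML version -->
    <div class="vcard">
      <span class="fn org">Example Co</span>
    </div>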
Algorithm Updates | CSawatzky
-
How to Link a Network of Sites w/o Penguin Penalties (header links)
I work for a network of sites that offer country-exclusive content. The content for the US will be different than that for Canada, Australia, the UK, etc., but covering the same subjects. Now, to make navigation easy, we have included in the header of every page a drop-down that has links to the other countries, like what most of you do with Facebook/Twitter buttons. So every page on every site has the same links, with the same anchor text. Example: Penguins in Canada, Penguins in Australia, Penguins in the USA. Because every page of every site has the same links (it's in the header), the "links containing this anchor text" ratio is through the roof in Open Site Explorer. Do you think this would be a reason for Penguin penalization? If you think this would hurt, what would you suggest? Nofollow links? Removing the links entirely and creating a single page of links? Other suggestions?
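For reference, the header block is roughly this on every page of every site (URLs made up), and nofollow is one option we're considering to keep the links for users while taking them out of the link graph:

    <nav class="country-switcher">
      <a href="https://example.ca/penguins">Penguins in Canada</a>
      <a href="https://example.com.au/penguins">Penguins in Australia</a>
      <a href="https://example.com/penguins">Penguins in the USA</a>
    </nav>
    <!-- Possible nofollow variant for the cross-site links -->
    <a href="https://example.ca/penguins" rel="nofollow">Penguins in Canada</a>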
Algorithm Updates | BeTheBoss
-
Product microdata from Schema.org
An article (http://www.websitemagazine.com/content/blogs/posts/archive/2011/11/18/step-up-your-e-commerce-seo-game-with-product-microdata.aspx?utm_source=newsletter&utm_medium=email&utm_campaign=newsletter) claims that using this product microdata (http://schema.org/Product) might help product pages rank better. Do you have any experience using these tags, and would it be worth the time to implement them on a site with thousands of products? Would it make sense to selectively implement them on specific products that actually have a good chance of ranking high instead?
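For anyone unfamiliar, the markup in question looks roughly like this per product (made-up values); since it hangs off fields a product-page template already has, it could presumably be emitted automatically across thousands of products:

    <div itemscope itemtype="http://schema.org/Product">
      <h1 itemprop="name">Example Running Shoe</h1>
      <img itemprop="image" src="/images/shoe.jpg" alt="Example Running Shoe">
      <p itemprop="description">Lightweight shoe for daily training.</p>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <span itemprop="price" content="79.99">$79.99</span>
        <meta itemprop="priceCurrency" content="USD">
        <link itemprop="availability" href="http://schema.org/InStock">
      </div>
    </div>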
Algorithm Updates | pbhatt
-
Duplicate Content & www.3quarksdaily.com, why no penalty?
Does anyone have a theory as to why this site does not get hit with a duplicate content penalty? The site is great and the information is good, but I just cannot understand how it avoids a duplicate content penalty when all of its articles are posted elsewhere. Any theories would be greatly appreciated!
Algorithm Updates | KMack
-
Penalty or Algorithm hit?
After the Google algorithm was updated, my site took a hit in traffic for a week. The traffic came back a week later and was doing well a week AFTER the algorithm change, and I decided I should do a 301 redirect to make sure I didn't have duplicate content (www vs. non-www versions of the site). I called my hosting company (I won't name names, but it rhymes with Low Fatty) and they guided me through the supposedly simple process.

Well, they had me create a new (different) IP address and do a domain forward (sorry about the bad terminology) to the www version. This was in effect for approximately two weeks before I discovered it, and it came with a subsequent massive hit in traffic. I then corrected the problem (I hope) by restoring the old IP address and setting up the .htaccess file to redirect everything to www.

It is a couple of weeks later and my traffic is still in the dumps. In WMT, instead of getting traffic from 10,000 keywords, I'm getting it from only 2k. Is my site the victim of some penalty (I have heard of the sandbox), or is my site simply lower in traffic due to the new algorithm? (I checked analytics data and found that only US traffic is cut by 50%; it is the same as before outside the US.) Could someone please tell me what is going on?
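(For reference, besides the .htaccess 301, I've read that www/non-www duplication can also be signalled with a canonical tag in each page's head; hypothetical URL below. Not sure if adding that would have helped here.)

    <link rel="canonical" href="http://www.example.com/">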
Algorithm Updates | askthetrainer