Penalty for Mixing Microdata with Metadata
-
The folks that built our website have insisted on including microdata and metadata on our pages.
What we end up with is something that looks like this in the header:
<meta name="description" itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
It seems to me that this would be a bad thing; however, I can't find any information leaning one way or the other.
Can anyone provide insight on this?
-
Worth noting that the meta description isn't one of those three markup styles; it's a different thing entirely, so you aren't actually mixing schema formats in your example.
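To illustrate the distinction, here is a minimal sketch reusing the example text from the question (the Store item type is an assumption):

<!-- Plain HTML metadata: not one of the three structured-data markup styles -->
<meta name="description" content="Come buy your shoes from us, we've got great shoes.">

<!-- Schema.org microdata: a structured-data vocabulary layered onto the markup -->
<div itemscope itemtype="http://schema.org/Store">
  <span itemprop="description">Come buy your shoes from us, we've got great shoes.</span>
</div>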
-
Thanks for sharing that link. That post is very informative.
-
Thanks for answering so quickly.
When I said "bad thing" I meant that I don't see how such redundancy could ever be beneficial.
Thank you for your thoughts.
-
I would read this post for more information: http://www.seomoz.org/blog/schema-examples
The post explains how Google used to support three different styles of markup (microdata, microformats, and RDFa) but, with the creation of Schema.org, decided to use only that standard going forward. Websites with existing markup would still be okay, though.
Google also mentioned (as noted in the article above) that you should avoid mixing different markup formats on the same page, as it can confuse their parsers.
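For illustration, "mixing formats" would mean marking up the same data in two vocabularies on one page, something like this hypothetical sketch:

<!-- Microdata style -->
<span itemprop="name">Acme Shoes</span>

<!-- RDFa style on the same page: the kind of mixing Google advised against -->
<span property="name">Acme Shoes</span>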
-
Why do you think this would be a bad thing? I'd question how much benefit you'll gain in most areas by doing this, but I can't see it causing harm, and it's good to get in now rather than adding markup later (assuming you've backed the right format!).
-
Related Questions
-
If I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page?
With Google starting to crack down on mixed content, I was wondering: if I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page? For example, in an old blog article there are images that weren't updated when the blog migrated to https but were just 301ed to new https images. Is that still considered a mixed content page?
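As a minimal sketch of the situation described (example.com is hypothetical), the markup still references the insecure URL, and the 301 to https only happens after that initial http request is made:

<!-- On a page served over https -->
<img src="http://example.com/images/old-photo.jpg" alt="Old blog photo">
<!-- The browser requests the http:// URL first; the redirect to https follows that request. -->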
Algorithm Updates | David-Stern
-
Is there any risk of a penalty for double canonicals?
Hi all, we have chained canonicals: page A points to page B, which points to page C. Will this be okay for Google, or do we definitely need to make it A to C and B to C? Thanks
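For illustration (hypothetical example.com URLs), the chained setup versus the flattened alternative would look like:

<!-- Chained: A -> B -> C -->
<!-- On page A -->
<link rel="canonical" href="https://example.com/page-b">
<!-- On page B -->
<link rel="canonical" href="https://example.com/page-c">

<!-- Flattened: both pages point straight at C -->
<!-- On pages A and B -->
<link rel="canonical" href="https://example.com/page-c">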
Algorithm Updates | vtmoz
-
Schema.org Microdata or Microformats - Which Should We Use
Hi all, I'm wondering which would be the better alternative: schema.org microdata or microformats. I am aware that search engines such as Google, Yahoo, and Bing recognize Schema.org as the standard. The question is whether it will have any negative effect. Our web developer here says that schema.org microdata may result in invalid HTML. I don't think it will affect our SEO, but I guess that's also something to shed some light on. So, what's the consensus here: should we implement schema.org or go with microformats, or does it really make any difference?
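For reference, the same product name marked up both ways (a minimal sketch with a hypothetical product):

<!-- Schema.org microdata: typed item and property attributes -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Blue Widget</span>
</div>

<!-- hProduct microformat: conventional class names on ordinary markup -->
<div class="hproduct">
  <span class="fn">Blue Widget</span>
</div>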
Algorithm Updates | CSawatzky
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, a client site has had a very poor link legacy stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals have been done and a disavow file submitted to Google, however after months and months of waiting nothing has happened. If anything, results have been further affected after the recent Penguin update.

A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, with little success. I have read up on this, and not many people appear to agree on whether it will work.

Therefore, my new decision is to start afresh with a new domain, switching from the .com to the .co.uk version, to shed all legacy and all association with the spam-ridden .com. My main concern is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first suspect.

To avoid duplicate content issues, knowing that this content pre-existed on another domain, I will implement a robots.txt file blocking the entire .com site, as well as a noindex, nofollow, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Once it has been deindexed, the new .co.uk site will go live with the exact same content.

So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful! Thank you, Denver
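For reference, the directives described above might look like this (a sketch; note that a robots.txt Disallow blocks crawling, which can prevent Google from fetching a page and seeing its on-page noindex, so the two are usually not combined):

# robots.txt on the old .com: blocks all crawling
User-agent: *
Disallow: /

<!-- On each .com page -->
<meta name="robots" content="noindex, nofollow">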
Algorithm Updates | ProdoDigital
-
Google penalty for one keyword?
Is it possible to get penalized by Google for a specific keyword and essentially disappear from the SERPs for that keyword but keep position for the brand (#1) and some other keywords (#4 and #7)? And how would you find out that this is what happened if there is no GWT message?
Algorithm Updates | gfiedel
-
[G Penalty?] Significant Traffic Drop From All Sources
My client's traffic started to decrease significantly around Nov 21 (Panda update 22). This includes traffic from all sources: search engines (Google, Bing, and Yahoo!), direct, AND referral. At first we thought it was a Google penalty, but Google answered our reconsideration request by stating that no manual penalty had occurred. It could be an algorithmic penalty, but again, the site has been hit across all sources.

The client has done zero backlinking; it is all natural. No spam, etc. All of his on-site SEO is perfect (700+ pages indexed, all unique content, unique titles and descriptions). On Oct 16, he switched from his old URL to a new URL and did proper redirects. (Last year, in Dec 2011, he switched his CMS to Drupal, and although there was a temporary decrease in traffic, it recovered within a month or so.) He does zero social on his site, and he has many ads above the fold.

Nevertheless, the traffic decrease is not source specific. In other words, all sources have decreased since Nov 21, 2012 and have not recovered. What is going on? What can explain a decrease in traffic across all sources? This would be easy to answer if it were only a Google organic decrease, but since direct and referral have also been hit, we cannot locate the problem. Please share your personal experiences as well as advice on where we should look. Could this be negative SEO? Where would we look? ANY ADVICE IS WELCOME!!!! Every bit counts. Thanks!!
Algorithm Updates | GreenPush
-
Title Tags and Over Optimization Penalty
In the past, it was always considered good practice to put your most important keyword or phrase at the beginning of the title tag, with the company name at the end. Now, according to the over-optimization penalty discussion in the Whiteboard Friday video, it seems better to be more human and put the company name at the beginning with the keyword or phrase following. Am I understanding this correctly?
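For illustration (hypothetical brand and keyword), the two orderings being compared:

<!-- Keyword-first, brand last -->
<title>Blue Running Shoes | Acme Footwear</title>

<!-- Brand-first, keyword following -->
<title>Acme Footwear: Blue Running Shoes</title>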
Algorithm Updates | hfranz
-
Product microdata from Schema.org
An article (http://www.websitemagazine.com/content/blogs/posts/archive/2011/11/18/step-up-your-e-commerce-seo-game-with-product-microdata.aspx?utm_source=newsletter&utm_medium=email&utm_campaign=newsletter) claims that using product microdata (http://schema.org/Product) might help product pages rank better. Do you have any experience using these tags, and would it be worth the time to implement them on a site with thousands of products? Or would it make more sense to selectively implement them on specific products that actually have a good chance of ranking high?
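For reference, minimal schema.org Product markup might look like this (a sketch with hypothetical values):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Trail Running Shoe</span>
  <img itemprop="image" src="/images/trail-shoe.jpg" alt="Trail running shoe">
  <span itemprop="description">Lightweight shoe for off-road running.</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">79.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>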
Algorithm Updates | pbhatt