Penalty for Mixing Microdata with Metadata
-
The folks that built our website have insisted on including microdata and metadata on our pages.
What we end up with is something that looks like this in the header:
itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
It seems to me that this would be a bad thing; however, I can't find any info leaning one way or the other.
Can anyone provide insight on this?
-
Worth noting that the meta description isn't one of those three markup styles. It's a different thing entirely, so you aren't actually mixing schema formats in your example.
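To make that distinction concrete, here is a rough annotated sketch of the tag in question (the wording is just the placeholder from the original example): the name/content pair is the ordinary HTML meta description, and itemprop is the only piece of microdata attached to it, so only one structured-data syntax is in play.
<!-- ordinary HTML meta description: used for search snippets, not a structured-data format -->
<!-- itemprop="description" is the microdata addition on the same tag -->
<meta name="description" itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
For comparison, standalone microdata usually marks up visible page content, e.g. (hypothetical markup, not from the site in question):
<div itemscope itemtype="http://schema.org/Store">
  <span itemprop="name">Example Shoe Store</span>
  <span itemprop="description">Come buy your shoes from us, we've got great shoes.</span>
</div>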
-
Thanks for sharing that link. That post is very informative.
-
Thanks for answering so quickly.
When I said "bad thing," I meant that I don't see how such redundancy could ever be beneficial.
Thank you for your thoughts.
-
I would read this post for more information: http://www.seomoz.org/blog/schema-examples
The post discusses how Google used to support three different styles of markup but, with the creation of Schema.org, decided to standardize on that going forward. Any websites with existing markup would still be okay, though.
Google also mentioned (as noted in the article above) that you should avoid mixing different markup formats on the same page, as it can confuse their parsers.
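To illustrate what that "mixing formats" warning refers to, here is a hypothetical sketch (not from the original post) of the same item marked up twice on one page, once in microdata and once in RDFa. This is the pattern Google advised against, which is different from a meta description sitting alongside microdata:
<!-- microdata version -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Running Shoe</span>
</div>
<!-- RDFa version of the same product on the same page -->
<div vocab="http://schema.org/" typeof="Product">
  <span property="name">Running Shoe</span>
</div>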
-
Why do you think this would be a bad thing? I'd question how much benefit you'll gain in most areas by doing this, but I can't see it causing harm, and it's good to get in there now rather than adding it later (assuming you've backed the right format!).