Duplicate Schema Syntax
-
Is having both JSON-LD and Microdata markup on one site detrimental to SEO? I'm unsure whether Google would read having both as spammy.
-
Thanks, Paddy!
-
Thank you for digging that up, Alex!
-
Hi Christine,
As Alex said, this shouldn't be a problem. I'd just advise keeping things consistent where you can, if only from a maintenance/debugging perspective so that if something breaks, it's easier to diagnose.
You should also make sure that markup is consistent between your mobile and desktop sites if you run separate sites or serve different content on each.
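For illustration, here's what marking up the same content in both formats might look like; this is a minimal sketch, and the business name and URL are placeholders:

```html
<!-- JSON-LD version: a script block, typically placed in the <head> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com/"
}
</script>

<!-- Microdata version: the same facts woven into the visible HTML -->
<div itemscope itemtype="https://schema.org/Organization">
  <a itemprop="url" href="https://www.example.com/">
    <span itemprop="name">Example Co</span>
  </a>
</div>
```

The key point for consistency is that both blocks describe the same entity with identical values; if one says the name is "Example Co" and the other says something else, that's the kind of mismatch that makes debugging painful.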
Cheers.
Paddy
-
Google's Gary Illyes has specifically stated that it wouldn't be a problem when someone else asked the same question:
"Is there an issue (ie. spammy structured markup manual action) with marking up the same content using both JSON-LD & microdata?"
Gary Illyes (@methode)
"no, that shouldn't be a problem"
https://twitter.com/methode/status/793628458928578560
Related Questions
-
Can I have multiple GeoShape Schema for one page on one domain?
Hi Mozers,
I'm working on some Schema for a client of mine, but whilst doing the research on GeoShapes with my developer, we came across a potential issue with this particular mark-up. My client is a B2C business, operating in numerous places across the UK. I want to use the Circle property from GeoShape to draw out multiple circles across the UK, but am I able to do this? From looking at some other websites, most seem to just have one GeoShape. Can I have multiple on the same page and same domain?
Thanks! Virginia
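One way this could be sketched is to give the business an array of GeoCircle values, for example via areaServed. This is an unverified illustration rather than a confirmed pattern: the business name, coordinates, and radii below are placeholders.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "areaServed": [
    {
      "@type": "GeoCircle",
      "geoMidpoint": { "@type": "GeoCoordinates", "latitude": 53.48, "longitude": -2.24 },
      "geoRadius": "20000"
    },
    {
      "@type": "GeoCircle",
      "geoMidpoint": { "@type": "GeoCoordinates", "latitude": 51.51, "longitude": -0.13 },
      "geoRadius": "20000"
    }
  ]
}
</script>
```

Since JSON-LD properties generally accept arrays, nothing in the syntax prevents multiple GeoCircles on one page; whether search engines make use of all of them is a separate question.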
Wordpress Blog, Schema and Authorship Settings
Hi Everyone,
What is the best practice for authorship in 2018 and going forward?
I am moving my entire blog over to a new WordPress theme so it's easier to read and navigate, in an attempt to make it look better on mobile and give better UX/CRO and implicit user feedback signals to Google. On the old blog I would say who the author is in the URL, H1 and in the content. This includes an image of the author, with an image alt carrying their name, qualifications and blurb. I've now set up each author as a 'user' for the new blog, and their image and name come up because I've marked those posts as authored by that particular user in WordPress.
What should I do as far as the SEO elements are concerned? I have read Eric Enge's blog about authorship being dead here, and also that authorship should be marked up in schema correctly - which I've done. I've also read about how it provides indirect signals even though it's no longer a direct ranking factor.
Should I tell WordPress to ignore the authorship SEO element by unticking the boxes relating to publishing authorship, or let WordPress just do its thing? Should I keep the images, alt tags and H1 in there, or take them out and let the WordPress system take over the authorship SEO elements? It's going to look funny to have the author (in the WordPress theme) and then author details again just below.
So what is the best practice for authorship in 2018 and going forward? Am I making too big a deal of it, and can I just let WordPress sort it out - something it seems to do very well?
Thanks in advance, Ed.
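For reference, the usual shape for authorship in schema is an Article with an author Person; this is a hedged sketch rather than a prescribed implementation, and the headline, name, and URL are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Post Title",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://www.example.com/author/jane-doe/"
  }
}
</script>
```

Many WordPress SEO plugins emit something along these lines automatically from the post's assigned user, which is one reason letting WordPress handle it is often the simpler path.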
Duplicate Content - Local SEO - 250 Locations
Hey everyone,
I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages - all with identical content (the same 500-750 words), with the exception of unique meta data and NAP carrying each respective location's name, city, state, etc.
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content as such; rather, any negative side-effects come from search engines not knowing which page to serve when there are duplicates.
So here's my question: if someone searches for my client's services in Miami, and my client only has one location in that city, does duplicate content matter? That location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct? Of course, in other cities, like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 locations.
I really appreciate any insight! Thank you,
Massive duplicate content - should it all be rewritten?
Ok, I am asking this question to hopefully confirm my conclusion.
I am auditing a domain whose owner is frustrated that they are coming in #2 for their regionally tagged search result and thinks it's their marketer/SEO's fault. After briefly auditing their site, the marketing company they have doing their work has really done a great job. There are little things that I have suggested they could do better, but nothing substantial. They are doing good SEO for the most part.
Their competitor's site is ugly, has a terrible user experience, looks very unprofessional, and has some technical SEO issues from what I have seen so far. Yet it is beating them every time on the SERPs. I have not compared backlinks yet; I will in the next day or so.
I was halted when I found what seems to me to be the culprit. I was looking for duplicate content internally, and they are doing fine there; then my search turned externally. I copied and pasted a large chunk of one page into Google and got an exact match return. Rutro, Shaggy. I then found that there is another site, from a company across the country, that has identical content for possibly as much as half of their entire domain - something like 50-75 pages of exact copy. I thought at first they must have taken it from the site I was auditing. I was shocked to find out that the company I am auditing actually has an agreement to use the content from this other site.
The marketing company has asked the owners to allow them to rewrite the content, but the owners have declined because "they like the content." So they don't even have authority on the content for approximately half of their site. This content is also one of the three main topics linked to from the home page.
My point to them is that I don't think you can optimize this domain enough to overcome the fact that a massive portion of the site is not original. I just don't think perfect optimization of duplicate content beats mediocre optimization of original content.
I now have to convince the owners they are wrong - never an easy task. Am I right, or am I overestimating the value of original content? Any thoughts? Thanks in advance!
Implementation advice on fighting international duplicate content
Hi All,
Let me start by explaining that I am aware of the rel="canonical" and rel="alternate" hreflang="x" tags, but I need advice on implementation.
The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with a .asp suffix
The sites are quite big, so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know, the tag should be applied at a page level, not site level). 4 out of the 5 sites are managed by us and have the tag implemented, so that makes it easier, but the 5th is managed in Asia and we fear the amount of manual work required will put them off implementing it.
The site is due to launch at the end of the month, and we need to sort this issue out before it goes live so that we are not penalised for duplicate content. Is there an easy way to go about this, or is the only way a manual addition? Has anyone had a similar experience?
Your advice will be greatly appreciated.
Many thanks, Emeka.
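For anyone picturing the per-page implementation being discussed, it is a block of link tags in each page's head, one per language/region variant plus the page itself; the domains, paths, and hreflang codes below are invented for illustration and do not describe the sites in the question:

```html
<!-- Placed on https://www.example.co.uk/product.html, and repeated
     identically on every one of its variant pages -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/product.html" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/product" />
<link rel="alternate" hreflang="de" href="https://www.example.de/product.html" />
<link rel="alternate" hreflang="zh" href="https://www.example.asia/product.asp" />
<!-- x-default names the fallback page for unmatched locales -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product" />
```

Note that the differing suffixes (.html, .asp, none) don't matter here, since each tag references an absolute URL; the hard part is that every site in the group must list every other site, which is why the fifth, externally managed site is the sticking point.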
Does Schema Replace Conventional NAP in local SEO?
Hello Everyone,
My question is in regards to Schema and whether it replaces the need for the conventional structured NAP configuration. Because you have the ability to specifically call out variables (such as name, URL, address, phone number, etc.), is it still necessary to keep the NAP form factor that has historically been required for local SEO? Logically, it makes sense that schema would allow someone to reverse this order and still achieve the same result; however, I have yet to find any conclusive evidence of this being the case.
Thanks, and I look forward to what the community has to say on this matter.
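For reference, the schema-based version of a NAP the question contrasts with the classic visible block might look like the following; this is a sketch only, and all business details are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Dental Practice",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Phoenix",
    "addressRegion": "AZ",
    "postalCode": "85001"
  },
  "url": "https://www.example.com/"
}
</script>
```

The open question is whether this machine-readable block alone is sufficient, or whether a matching, human-visible name/address/phone on the page is still needed alongside it.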
Duplicate content question for multiple sites under one brand
I would like to get some opinions on the best way to handle duplicate/similar content that is on our company website and local facility-level sites.
Our company website is our flagship site that contains all of our service offerings, and we use this site to compete nationally for our SEO efforts. We then have around 100 localized facility-level sites for the different locations we operate, which we use to rank for local SEO. There is enough of a difference between these locations that it was decided (long ago, before me) that there would be a separate website for each. There is, however, much duplicate content across all these sites due to the service offerings being roughly the same. Every website has its own unique domain name, but I believe they are all on the same C-block.
I'm thinking of going with 1 of 2 options and wanted to get some opinions on which would be best.
1 - Keep the services content identical across the company website and all facility sites, and use the rel=canonical tag on all the facility sites to reference the company website. My only concern here is whether this would drastically hurt local SEO for the facility sites.
2 - Create two unique sets of services content. Use one set on the company website, and use the second set on the facility sites, and either live with the duplicate content or try and sprinkle in enough local geographic content to create some differentiation between the facility sites.
Or if there are other suggestions on a better way to handle this, I would love to hear any other thoughts as well. Thanks!
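Mechanically, option 1 amounts to a single tag in the head of each facility site's service page, pointing at the flagship version of that page; the URLs below are placeholders for illustration:

```html
<!-- On each facility site's service page, e.g.
     https://facility-phoenix.example.com/services/widget-repair/ -->
<link rel="canonical" href="https://www.example-flagship.com/services/widget-repair/" />
```

The trade-off the question identifies is real: a cross-domain canonical asks search engines to consolidate signals onto the flagship page, which is exactly what you don't want if the facility page itself needs to rank locally.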
How do I fix duplicate content issues if the pages are really just localized versions?
Does this still hurt our SEO? Should we place different countries on their own respective domains (.co.uk, etc)?