Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Does schema.org assist with duplicate content concerns?
-
The issue of duplicate content has been well documented, and there are lots of articles suggesting that you noindex archive pages on WordPress-powered sites.
Schema.org allows us to mark up our content, including a component's URL. So my question, simply, is: is noindexing archive (category/tag) pages still relevant when considering duplicate content?
These pages are in essence a list of articles, each of which can be marked up as an Article or BlogPosting, with the URL of the main article and all the other useful properties the schema gives us.
Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns.
Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
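For illustration, a category page's listing could be expressed as a schema.org ItemList whose entries point at the canonical article URLs. This is a hedged sketch built in Python (the article titles and URLs are hypothetical, and an SEO plugin would normally generate this markup for you):

```python
import json

# Hypothetical articles listed on a category/tag archive page
articles = [
    {"headline": "First Post", "url": "https://example.com/first-post/"},
    {"headline": "Second Post", "url": "https://example.com/second-post/"},
]

# Build schema.org ItemList JSON-LD: each ListItem points at the
# canonical URL of the full article, not at the archive page itself
item_list = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": i + 1,
            "name": article["headline"],
            "url": article["url"],
        }
        for i, article in enumerate(articles)
    ],
}

print(json.dumps(item_list, indent=2))
```

The key point is that each ListItem's url references the full article rather than the archive page, which is how the markup conveys "this page is a gateway to the real content".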
-
Thanks Takeshi - useful comments.
-
That's an interesting question. Semantic markup can help Google understand what different pages are (e.g. tag pages), but it doesn't really solve the problems caused by duplicate content, namely:
- Thin Content - Tag pages and other similar pages are thin content, with not much utility for the user, and are probably not going to rank well in Google anyway. Even if they do rank, they won't convert as well as your main pages.
- Keyword Cannibalization - Even if your tag pages & duplicate content rank, they could potentially outrank your main content, leading to lower conversions.
- Panda - Too many thin content pages can lower Google's opinion of your site as a whole, leading to a Panda penalty.
Given the problems above, semantic markup doesn't really help with any of them. It can help Google understand what a tag page is, but that doesn't mean you want that page indexed.
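In practice, the noindex decision is independent of the markup. A minimal sketch of the usual approach (a hypothetical helper, not any particular plugin's API), assuming you want archive pages crawlable for link discovery but kept out of the index:

```python
def robots_meta(is_archive: bool) -> str:
    """Return a robots meta tag: noindex archive (tag/category) pages,
    but still let crawlers follow their links through to the articles."""
    content = "noindex,follow" if is_archive else "index,follow"
    return f'<meta name="robots" content="{content}">'

print(robots_meta(True))   # archive (tag/category) page
print(robots_meta(False))  # regular article page
```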
Related Questions
-
Duplicate content in sidebar
Hi guys. So I have a few sentences (about 50 words) of duplicate content across all pages of my website (repeated text in the sidebar). Each page of my website contains about 1,300 words of unique content in total, plus the 50 words of duplicate content in the sidebar. Does having duplicate content of this length in the sidebar affect the rankings of my website in any way? Thank you so much for your replies.
On-Page Optimization | AslanBarselinov1
Duplicate Content with ?Page ID's in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem that I have due to the page IDs WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem for me is I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to page IDs. For example, this is how a page's URL should look on my site. Moz is telling me there are 50 duplicate content errors for this page. The page ID for this page is 82, so the duplicate content errors appear as follows, and so on, for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not the issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
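One way to attack ?page_id duplicates like those described above is to emit a rel=canonical tag that drops the query string, so every parameterised variant of a page declares the same clean permalink. A hedged sketch in Python (the example URL is hypothetical; in WordPress itself an SEO plugin would normally output this tag):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link(url: str) -> str:
    """Strip the query string (e.g. ?page_id=82) so every variant
    of a page declares the same clean canonical URL."""
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{clean}">'

# Both the pretty permalink and its ?page_id variant declare one canonical
print(canonical_link("https://example.com/services/?page_id=82"))
```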
On-Page Optimization | SpaMedica0
Duplicate content penalty
When Moz crawls my site, it reports 2x the pages I really have and says I am being penalized for duplicate content. I know years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as the source of the duplicate content, but would search engines really penalize me for that? It is technically only one site. My business took a significant sales hit starting early July 2013; I know Google did an algorithm update then that had SEO aspects. I need to resolve the problem so I can stay in business.
On-Page Optimization | cheaptubes0
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because many URLs have duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites to fix the duplicate content issue. Is that OK? Thanks for any help!
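For reference, blocking a whole secondary domain looks like the robots.txt fragment below. Note, though, that robots.txt only prevents crawling: URLs Google already knows about can remain in the index, so a noindex meta tag or a cross-domain rel=canonical pointing at the primary site is generally the more reliable fix for duplicates.

```
# robots.txt served on a secondary (duplicate) domain: blocks crawling only
User-agent: *
Disallow: /
```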
On-Page Optimization | JohnHuynh0
Plagiarism or duplicate checker tool?
Do you know of a plagiarism or duplicate checker tool where I can receive an email alert if someone copies my content? I know there's a tool like this (similar to http://www.tynt.com/, though people can still remove the link from the original source) but I forgot the name or site. It's like a snippet of source code that you must insert in each of your webpages. Thanks in advance!
On-Page Optimization | esiow20131
Duplicate eCommerce Product Descriptions
I know that creating original product descriptions is best practice. What I don't understand is how other sites are able to generate significant traffic while still using duplicate product descriptions on all product pages. How are they not being penalized by Google?
On-Page Optimization | mj7750
Schema.org for news websites?
So as of late I have been on something of a mission to mark up my news website with as much accurate and detailed Schema and Open Graph data as possible, in order not only to allow the search engines to understand my content properly, but also to ensure everything appears in the most ideal fashion when linked to from Facebook, Google+, etc. Here is an example of a typical article page: http://www.nerdscoop.net/technology/video-games-459 As you'll see, I currently have news posts marked up as Article, because that is essentially exactly what they are, but is there a better way to emphasise that they are news rather than just generic articles? My second question is regarding the category pages and the home page. How would it be best to mark these up? With OG the task is fairly simple, because I can specify the homepage as being a website, but not so with Schema from what I can see. Either way, this is an interesting subject to me and I look forward to any discussion as a result. Thanks for looking.
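On the first question above: schema.org defines NewsArticle as a subtype of Article for exactly this case, so switching the @type is enough to flag content as news specifically. A minimal sketch in Python (all field values here are hypothetical placeholders):

```python
import json

# NewsArticle is a subtype of schema.org's Article, so existing Article
# properties (headline, author, dates) carry over unchanged
news_article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": "2013-05-01",
    "author": {"@type": "Person", "name": "Example Author"},
}

print(json.dumps(news_article, indent=2))
```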
On-Page Optimization | HalogenDigital0
Percentage of duplicate content allowable
Can you have ANY duplicate content on a page or will the page get penalized by Google? For example if you used a paragraph of Wikipedia content for a definition/description of a medical term, but wrapped it in unique content is that OK or will that land you in the Google / Panda doghouse? If some level of duplicate content is allowable, is there a general rule of thumb ratio unique-to-duplicate content? thanks!
On-Page Optimization | sportstvjobs0