How do I use public content without being penalized for duplication?
-
The NHTSA produces a list of all recalls for automobiles. Its terms of use state that the information can be copied. I want to add that list to our site so there is an up-to-date version for our audience to see. However, I'm just copying and pasting. I'm allowed to according to the NHTSA, but Google will probably flag it as duplicate content, right? Is there a way to do this without being penalized?
Thanks,
Ruben
-
I didn't think about other sites, but that's a fabulous point. Best to play it safe.
Thanks for your input!
- Ruben
-
My gut says that your idea to keep the content noindexed is best. Even if the content is unique, it borders on auto-generated. Now, I might change my mind if there were a lot of users actively interacting with most of these pages. If not, though, you'll end up with a large portion of your site consisting of auto-generated content that doesn't seem useful. Plus, it's also possible that other sites are using this information, so you would end up with content that is duplicated on other sites too.
I could be wrong, but my gut says not to try to use this content for ranking purposes.
-
I appreciate the follow-up, Marie. Please give me your thoughts on the following idea:
The NHTSA only posts the updates for the past month. If I noindex the page for now (which is what I'm doing) and wait five months, what would happen then? At that point, yes, the current month would be duplicated, but I'd have four months of "unique" content because the NHTSA deletes theirs. Plus, I could add pictures of all the automobiles, too. Do you think that would be enough to index it?
(I'm most likely going to keep it noindexed, because this borders on shady, or at least I could see Google taking it that way, but just as a thought experiment, what do you think? Or anyone else?)
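To make the thought experiment concrete: the monthly archiving could be a small script that folds each NHTSA export into a growing local archive, deduplicating on a recall campaign number so that re-importing the still-published current month never creates duplicates. This is only a sketch; the field name and record shape here are hypothetical, since the real export format may differ.

```python
# Merge monthly NHTSA recall snapshots into a growing local archive,
# deduplicating on a per-recall campaign number (hypothetical field name)
# so overlapping months are only stored once.

def merge_snapshot(archive, snapshot):
    """archive: dict mapping campaign number -> recall record.
    snapshot: list of recall records from the latest monthly export.
    Returns the number of new records added."""
    added = 0
    for record in snapshot:
        key = record["campaign_number"]  # hypothetical field name
        if key not in archive:
            archive[key] = record
            added += 1
    return added

# Example: two overlapping monthly pulls.
archive = {}
may = [{"campaign_number": "15V-001", "component": "BRAKES"},
       {"campaign_number": "15V-002", "component": "AIR BAGS"}]
june = [{"campaign_number": "15V-002", "component": "AIR BAGS"},  # still listed
        {"campaign_number": "15V-003", "component": "STEERING"}]

merge_snapshot(archive, may)   # adds 2
merge_snapshot(archive, june)  # adds only the 1 new recall
print(len(archive))  # 3 unique recalls retained across the overlap
```

Whether the archived months count as "unique" content in Google's eyes is, of course, the open question above.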
Thanks,
Ruben
-
To expand on EGOL's answer: if you take someone else's content (even with their permission) and want Google to index it, Google can see that you have a large amount of copied content on your site. This can trigger the Panda filter and cause Google to consider your whole site low quality.
You can add a noindex tag as EGOL suggested, or you could use a canonical tag to show Google who the originator of the content is, but the noindex tag is probably easiest.
There is one other option as well. If you think you can add significant value to the content being provided, then you can still keep it indexed. If you can combine the recall information with other valuable information, that might be OK to index. But you have to truly be providing value, not just padding the page with words to make it look unique.
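For reference, both tags discussed above go in the page's head section. A sketch (the canonical URL is a placeholder; it would point at whatever page you treat as the original source):

```html
<!-- Option 1: keep the copied page out of Google's index entirely -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: point Google at the originator of the content
     (placeholder URL; substitute the true source page) -->
<link rel="canonical" href="http://www.example.com/original-recall-list" />
```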
-
Alright, that sounds good. Thanks!
-
I had a bunch of republished articles on my website, done mostly at the request of government agencies and universities. That site got hit in one of the early Panda updates. So I deleted a lot of that content, and to the rest I added this line in the head section of the page:
<meta name="robots" content="noindex, follow" />
That tells Google not to index the page but to follow the links and allow PageRank to flow through. My site recovered a few weeks later.
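For anyone who can't easily edit page templates, the same signal can also be sent as an HTTP response header via X-Robots-Tag. A hedged Apache sketch, assuming mod_headers is enabled and the republished articles live under a dedicated directory (both assumptions; adjust to your setup):

```apache
# Equivalent of the noindex, follow meta tag, applied to a whole
# directory of republished articles via an HTTP response header.
<Directory "/var/www/site/republished">
    Header set X-Robots-Tag "noindex, follow"
</Directory>
```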