Duplicate content site not penalized
-
I was reviewing a site, www.adspecialtyproductscatalog.com, and noticed that even though automated crawls flag over 50,000 total issues, including 3,000 pages with duplicate titles and 6,000 with duplicate content, the site still ranks high for its primary keywords. The same essay's worth of content is pasted at the bottom of every single page. What gives, Google?
-
Thanks SEOMAN.
My issue is that this purposeful duplicate content tactic is helping www.adspecialtyproductscatalog.com rank higher for a keyword like "ad specialty products" than http://www.advertisingproducts.net/, even though the former uses the same 4,000 words on every single page of its site.
So if the goal is to rank #1 for the highest-searched keyword in my category, should I employ this content tactic as part of my SEO efforts?
-
This is a common scenario. I would take the issues found by crawling tools with a pinch of salt: many of them flag errors that aren't actually a major problem.
Search engines aren't crawling the web to find problems with your site; they are crawling it to find the most relevant content and deliver it to the person who requests it.
As far as I'm aware, there is no such thing as a duplicate content penalty. There is one exception: if you breach copyright, you could be in legal trouble. Google also has a pirated content removal process; if you find a copied piece of content that breaches copyright, you can submit it and it will most likely be removed from the search engine entirely.
If you are talking about duplicate content on your own site, it isn't a major problem. Google will still pick up the pages and try to deliver the best one to the searcher, but you are doing yourself a disservice. If you have duplicate content on your own site, put some effort into crafting unique content and targeting different keywords on each page.
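Purely as an illustration of the kind of check those crawl reports run, here is a minimal sketch for auditing your own pages for duplicate titles. It is not how any particular tool works; it assumes Python with only the standard library and a hypothetical list of URLs you already know about.

```python
# Minimal duplicate-title audit sketch (standard library only).
# The URL list is a hypothetical placeholder; point it at your own pages.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen

URLS = [
    "https://www.example.com/",
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
]

class TitleParser(HTMLParser):
    """Collects the text inside the <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

titles = defaultdict(list)  # title text -> URLs that use it
for url in URLS:
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    parser = TitleParser()
    parser.feed(html)
    titles[parser.title.strip()].append(url)

for title, urls in titles.items():
    if len(urls) > 1:  # the same <title> appears on more than one page
        print(f"Duplicate title '{title}' on {len(urls)} pages: {urls}")
```

A report like this simply tells you where to prioritise unique titles and copy, which is the "bit of effort" recommended above.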
Related Questions
-
Moz was unable to crawl your site? Redirect Loop issue
Moz was unable to crawl your site on Jul 25, 2017. I am getting this message for my site: it says Moz was "unable to access your homepage due to a redirect loop" for https://kuzyklaw.com/. The site is working fine and was last crawled on 22nd July, so I am not sure why this issue is coming up. When I checked the website in a Chrome extension, it says: "The server has previously indicated this domain should always be accessed via HTTPS (HSTS Protocol). Chrome has cached this internally, and did not connect to any server for this redirect. Chrome reports this redirect as a 307 Internal Redirect; however, this probably would have been a 301 Permanent Redirect originally. You can verify this by clearing your browser cache and visiting the original URL again." Not sure if this is the actual issue; the site was migrated to HTTPS just 5 days ago, so maybe it will resolve automatically. Can anybody from the Moz team help me with this? (A way to check the server-side redirect chain directly is sketched after this question.)
White Hat / Black Hat SEO | CustomCreatives
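Since the 307 described in the question above is synthesised by Chrome from its cached HSTS policy rather than sent by the server, one way to see what the server actually returns is to request the plain-HTTP URL with a client that has no HSTS cache. A minimal sketch, assuming the third-party requests library is installed; the URL is taken from the question:

```python
# Trace the real server-side redirect chain for the HTTP version of the site.
# Unlike Chrome, this client has no HSTS cache, so any redirect printed here
# actually came from the server.
import requests  # assumed installed: pip install requests

try:
    resp = requests.get("http://kuzyklaw.com/", allow_redirects=True, timeout=10)
    for hop in resp.history:  # each intermediate redirect response
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print("final:", resp.status_code, resp.url)
except requests.exceptions.TooManyRedirects:
    print("Genuine redirect loop: the server keeps redirecting in a cycle.")
```

A healthy HTTPS migration typically shows a single 301 from http:// to https:// here; a true loop makes the client give up with TooManyRedirects.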
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as requested by Google, and I am not sure which option is better. A) Put all syndicated content into "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page. But in this case, I am not sure what will happen if we link to these articles from other parts of the website: we will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex (a sketch of that robots.txt behaviour follows this question). Is this OK for Google and Google News? B) Set a NOINDEX meta tag on every page. Google will crawl these pages but will not show them in the results. We will still lose the link juice from links pointing to these pages, right? So, is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am making this kind of "hack", so I am not exactly sure what to do and how to proceed. Thank you!
White Hat / Black Hat SEO | Lukas_TheCurious
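On the interaction described in option A above: a crawler that obeys robots.txt never fetches a disallowed page, so it never sees the noindex meta tag on it. A minimal sketch of that check, using only the Python standard library and a hypothetical example.com domain:

```python
# Check whether a well-behaved crawler is even allowed to fetch a page.
# If the answer is False, the crawler never downloads the HTML, so any
# <meta name="robots" content="noindex"> on that page is never seen.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # hypothetical domain
rp.read()

syndicated_url = "https://www.example.com/syndicated/some-article"
original_url = "https://www.example.com/news/original-story"

print("Googlebot may fetch syndicated page:", rp.can_fetch("Googlebot", syndicated_url))
print("Googlebot may fetch original page:  ", rp.can_fetch("Googlebot", original_url))
```

With the Disallow rule in place, the first call prints False, which is exactly why option A keeps crawlers from ever reading the noindex tag.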
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content keeps changing automatically as new coupons are added and expired deals are removed. I wish to create a sitemap, but I realised there is not much point in creating one for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? (A small sitemap-generation sketch follows this question.) NOTE: I need to create the sitemap for getting expanded sitelinks. http://couponeasy.com/
White Hat / Black Hat SEO | shopperlocal_DM
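For the handful of static pages the question mentions, a sitemap can be generated from a plain list of URLs; pages left out of it can still be discovered and indexed through normal crawling, since a sitemap is a hint, not an exclusion list. A minimal standard-library sketch, where the URL paths are hypothetical placeholders rather than the site's real structure:

```python
# Build a minimal XML sitemap for a fixed list of static pages.
# The paths below are placeholders; swap in the 8-9 static URLs you actually have.
from datetime import date
from xml.sax.saxutils import escape

STATIC_URLS = [
    "http://couponeasy.com/",
    "http://couponeasy.com/about",
    "http://couponeasy.com/contact",
]

entries = []
for url in STATIC_URLS:
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

print(sitemap)
```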
How do I make a content calendar to increase my rank for a key word?
I've watched more than a few seminars on having a content calendar. Now I'm curious as to what I would need to do to increase ranking for a specific keyword in local SEO. Let's say I wanted to help them increase their rank for "used trucks in Buffalo, NY." Would I regularly publish blog posts about used trucks? Thanks!
White Hat / Black Hat SEO | oomdomarketing
Am I Syndicating Content Correctly?
My question is about how to syndicate content correctly. Our site has professionally written content aimed toward our readers, not search engines. As a result, other related websites are looking to syndicate our content. I have read the Google duplicate content guidelines (https://support.google.com/webmasters/answer/66359?hl=en), canonical recommendations (https://support.google.com/webmasters/answer/139066?hl=en&ref_topic=2371375), and noindex recommendations (https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag) offered by Google, but I am still a little confused about how to proceed. The pros, in our opinion, are as follows: #1 We can gain exposure to a new audience as well as help grow our brand. #2 We figure it's also a good way to build up credible links and help our rankings in Google. Our initial reaction is to have them use a canonical link to assign the content back to us, but also implement a "noindex, follow" tag to help avoid duplicate content issues (a quick canonical check is sketched after this question). Are we doing this correctly, or are we potentially violating some sort of Google quality guideline? Thanks!
White Hat / Black Hat SEO | Dirving4Success
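If syndication partners agree to add the cross-domain canonical described in the question above, it is worth spot-checking that the tag actually points back at the original article. A minimal standard-library sketch, where both URLs are hypothetical placeholders:

```python
# Verify that a syndicated copy carries a rel="canonical" pointing back to the original.
# Both URLs below are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen

ORIGINAL_URL = "https://www.oursite.example/articles/great-article"
SYNDICATED_URL = "https://partner.example/reposted/great-article"

class CanonicalParser(HTMLParser):
    """Records the href of the first <link rel="canonical"> found."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

html = urlopen(SYNDICATED_URL).read().decode("utf-8", errors="ignore")
parser = CanonicalParser()
parser.feed(html)

if parser.canonical == ORIGINAL_URL:
    print("OK: partner page canonicalises back to the original.")
else:
    print("Check needed: canonical is", parser.canonical)
```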
Closing down site and redirecting its traffic to another
OK, so we currently own two websites that are in the same industry. Site A is our main site, which hosts real estate listings and rentals in Canada and the US. Site B hosts rentals in Canada only. We are shutting down Site B to concentrate solely on Site A, and will be looking to redirect all traffic from Site B to Site A; i.e., if a user lands on the Toronto Rentals page on Site B, we want to forward them to the Toronto Rentals page on Site A, and so on. Site A has all the same locations and property types as Site B. On to the question: we are trying to figure out the best method of doing this that will appease both users and the Google machine. Here's what we've come up with (2 options). When a user hits Site B via Google/bookmark/whatever, do we: 1. Automatically/instantly (301) redirect them to the applicable page on Site A? 2. Present them with a splash page of sorts ("This page has been moved to Site A. Please click the following link [insert anchor text / rich URL here] to visit the new page.")? We're worried that option #1 might confuse some users, and we are not sure how crawlers might react to thousands of instant redirects like that (a small per-page 301 sketch follows this question). Option #2 would be most beneficial to the end user (we're thinking), as they're being notified, on the page, of what's going on, and crawlers would still be able to follow the URL presented within the splash write-up. Thoughts? We've never done this before. It's basically like one site acquiring another site; however, in this case, we already owned both sites. We just don't have time to take care of Site B any longer due to the massive growth of Site A. Thanks for any/all help. Marc
White Hat / Black Hat SEO | THB
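For option #1 in the question above, the page-to-page 301s can be driven by a simple old-path to new-URL map. The sketch below is a standard-library Python illustration with hypothetical paths, not the real sites; in practice this mapping would usually live in the web server or CDN configuration rather than an application server, but the idea is the same.

```python
# Minimal per-page 301 redirect server for a site being shut down.
# The path map is a hypothetical example of matching Site B pages to Site A pages.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECT_MAP = {
    "/toronto-rentals": "https://site-a.example/canada/toronto/rentals",
    "/vancouver-rentals": "https://site-a.example/canada/vancouver/rentals",
}
FALLBACK = "https://site-a.example/"  # anything unmapped goes to the homepage

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECT_MAP.get(self.path, FALLBACK)
        self.send_response(301)  # permanent redirect passes the old page's signals along
        self.send_header("Location", target)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), RedirectHandler).serve_forever()
```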
Month-old site already ranks #3 for competitive keyword
I know this individual does this with several sites and then offers them for sale to his competitors. Obviously spammy through and through, but how can Google reward a site that's not even two months old, with 1,900+ links, with a #3 ranking for a highly competitive keyword? Please don't post the actual name or URL of the website, as we don't want to give him any more credit, but this blows my mind, as he has done this several times with other sites and never gets penalized. http://tinyurl.com/b9jysa5 Any ideas as to how he can accomplish this besides almost 2,000 links in less than 2 months? How is that even remotely natural? I know his other sites have been reported to Google, but they never did anything about it. Thanks for any feedback.
White Hat / Black Hat SEO | anthonytjm
Can I be penalized for offering incentives for links and social followers?
A competitor of mine is using contest/loyalty software like ContestBurner or PunchTab to generate social followers and links. This has been very successful, and over the past several months his rankings have improved. Does anyone know if Google is "OK" with this type of program? I'm trying to decide if I should start one myself.
White Hat / Black Hat SEO | dfeemster