Duplicate Articles
-
We submit articles to a magazine, which either get posted as text or in a Flash container. Management would like to post them to our site as well. I'm sure this has been asked a million times, but is this a bad thing to do? Do I need to add a rel=canonical tag to the articles? Most of the articles posted to that other site do not contain a link back to our site.
-
The magazine has already given us the OK; like I said, they're much more offline-focused, so it's more about what Google thinks. I think I agree about playing it safe with the canonical tag, though. Thanks!
-
If it's really just for your own reference or limited use, I'd probably set up the cross-domain canonical and keep it off of Google's radar. Later, if you wanted to self-publish, you could remove that.
If it's just your site and theirs, it's probably not a high-risk situation. In some ways, it's more about the relationship. If your pages started ranking instead of theirs, I don't know if that goes against your general agreement with them. I'd probably play it safe for now.
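To illustrate, the cross-domain canonical is just a single link element in the head of your copy of each article, pointing at the magazine's version as the preferred URL (the URL below is a hypothetical placeholder):

    <!-- In the <head> of the article page on your site -->
    <!-- Tells search engines the magazine's copy is the preferred version -->
    <link rel="canonical" href="https://www.example-magazine.com/articles/your-article" />

Each article page on your site would point at its own counterpart on the magazine's site, not at the magazine's homepage.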
-
Our site doesn't have the largest audience yet but management simply wants a place they can go or send clients to easily find everything in one place. The magazine is more for offline advertising but they post it online as well.
-
I'd just add to what Jason said, which I think is generally on-target. If the magazine really is the "source", then posting all those articles again on your site could look "thin" to both users and search engines. In general, you're not ranking for them now, so you probably won't lose out, from an SEO standpoint. There is some risk if you copy a lot of articles, though. You don't want to look like you're scraping your own content, in essence.
The cross-domain rel-canonical should remove the risk of any sort of search penalty or problems. So, again, it's a question of whether it provides value to your site.
At some point, you have to ask - would it make sense to only post them on your site? In other words, if you're building an audience, does it make sense to build it for someone else? Granted, that's a much larger business and marketing decision (far beyond SEO).
-
It's not a "bad" thing to post the articles in two places, as this type of syndication is somewhat commonplace in the corporate world. Provided your site already has a lot of content and is generally good quality, there's no risk of a penalty for syndicating content.
However, I would encourage management to look at it from the user's perspective: If the user reads the article in the magazine, they're not going to find it very useful to see the same article again on your site. Conversely, if your website visitors aren't going to see the article in the magazine first, why send it to the magazine at all?
One solution is to quote a snippet of the original magazine article on your site, and then write a 200+ word summary or intro for the magazine article that perhaps summarizes the key points, introduces the article in a different way, etc., and then links to the magazine.
From a user's perspective, all the content you've published on your site and in the magazine is unique and potentially useful. From the SEO perspective, there's no possibility of an issue and - unlike syndication - you're adding a unique page of content to your site that is highly likely to be indexed and help you in the long run.
Syndication isn't bad, but you have to ask why you're doing it in the first place. It's often just as easy to create a short "What You'll Learn In This Article" intro on your site as it is to cut and paste.
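As a rough sketch of what that kind of teaser page could look like (the magazine URL and wording here are hypothetical placeholders):

    <!-- A teaser page on your site: unique intro, short quote, link to the full piece -->
    <article>
      <h1>What You'll Learn In This Article</h1>
      <p>Your own 200+ word summary or introduction, written specifically for your site.</p>
      <blockquote cite="https://www.example-magazine.com/articles/your-article">
        A short quoted snippet from the original magazine article.
      </blockquote>
      <p>Read the full article at
        <a href="https://www.example-magazine.com/articles/your-article">the magazine's site</a>.
      </p>
    </article>

The summary and intro are the unique content that can get indexed and rank for your site; the quoted snippet just gives readers context before they click through.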
Related Questions
-
Duplicate content management across a subdirectory-based multisite where subsites are projects of the main site and naturally adopt some ideas and goals from it
Hi, I have the following problem and would like to know the best solution for it: I have a site, codex21.gal, that is actually part of a subdirectory-based multisite (galike.net). It has a domain mapping setup, but it is hosted in a folder of the galike.net multisite (galike.net/codex21).
My main site (galike.net) works as a frame-brand for a series of projects aimed at promoting the cultural and natural heritage of a region in NW Spain through creative projects focused on entertainment, tourism and education. The projects themselves are a concretion (a putting into practice) of the general views of the brand, which acts more like a company brand. CodeX21 is one of those projects; it has its own logo, etc., and is essentially a child brand, just more focused on a particular theme. I don't want to hide that it is part of the GALIKE brand (in fact, I am planning to add the Galike logo to it, and a link to the main site in the menu). I will be creating other projects, each with its own brand, hosted in subsites (subfolders) of the galike.net multisite. Not all of them will have their own TLD mapped; some could simply be www.galike.net/projectname. The codex21.gal subsite might become galike.net/codex21 if that would be better for SEO.
Now, the problem is that my subsite codex21.gal restates some principles, concepts and goals that have already been defined (in other words) on the main site. Thus, there are some ideas (such as my particular vision on the possibilities of sustainable exploitation of that heritage, and concepts I have developed myself such as "narrative tourism" and "the geographical map as a non-linear story") that need to be present here and there on the subsite, since they are also the philosophy of the project. BUT it seems that Google can penalise overlapping content in subdirectory-based multisites, since they can look like a collection of doorways to access the same product (*).
I have considered substituting those overlapping ideas with links to the main site, though it seems unnatural from the user's point of view to be sent off the page to read a piece of information that is actually part of the project description (every other child project of Galike might have the same problem). I have also considered taking the codex21 subsite out of the network and hosting it as a standalone site on another server, but the duplicate content problem might persist, and in any case I should link it to my brand Galike somewhere, because that's kind of the "production house" behind it. So which would be the best (white hat) strategy, from an SEO point of view, to handle this brand-project philosophy overlap?
(*) "All the same IP address — that's really not a problem for us. It's really common for sites to be on the same IP address. That's kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that's also perfectly fine. I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say "this is kind of a collection of doorway sites" — in that essentially they're being funnelled toward the same product. The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we'll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn't be so problematic because one of these sites would be showing up in the search results. On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we'll demote all of them. So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they're selling. So then you don't have this collection of a lot of different sites that are essentially doing the same thing." (John Mueller, Senior Webmaster Trends Analyst at Google: https://www.youtube.com/watch?time_continue=1&v=kQIyk-2-wRg&feature=emb_logo)
White Hat / Black Hat SEO | PabloCulebras0
-
Would this be duplicate content or bad SEO?
Hi guys, we have a blog for our e-commerce store, with a full-time in-house writer producing content. As part of our process, we do content briefs, and as part of each brief we analyze competing pieces of content on the web. Most of the time, the sources are large publications (e.g. HGTV, Elle Decor, Apartment Therapy, House Beautiful, the NY Times, etc.). The analysis is basically a summary/breakdown of the article, and is sometimes 2-3 paragraphs long for longer pieces of content. The competing content analysis is used to create an outline of our article, and incorporates most of the important details/facts from the competing pieces, but not all. Most of our articles run 1,500-3,000 words. Here are the questions: Would it be considered duplicate content, or bad SEO practice, if we list the sources/links we used at the bottom of our blog post, along with the summary from our content brief? Could this be beneficial for SEO? If we do this, should we nofollow the links, or use regular dofollow links? For example: "For your convenience, here are some articles we found helpful, along with brief summaries." I want to use as much of the content that we have spent time on as possible. TIA
White Hat / Black Hat SEO | kekepeche1
-
Internal Links & Possible Duplicate Content
Hello, I have a website which since February 6 has kept losing positions. I have not received any manual actions in Search Console. However, I read the following article a few weeks ago and it looks a lot like my case: https://www.seroundtable.com/google-cut-down-on-similar-content-pages-25223.html I noticed that Google has removed from its index 44 out of the 182 pages of my website. The pages that have been removed can be considered similar, like those on the website mentioned in the article above. The problem is that there are about 100 pages like these. They are pages that describe the cabins of various cruise ships and contain one picture and one sentence of at most 10 words. So, for humans this is not duplicate content, but what about the engine, keeping in mind that sometimes that little sentence can be the same? And let's say I remove all these pages and present the cabin details dynamically on one page instead of 15, for example, and that reduces the size of the website from 180 pages to 50 or so, how will this affect SEO with regard to internal links? Thank you for your help.
White Hat / Black Hat SEO | Tz_Seo0
-
XML feeds in regards to Duplicate Content
Hi everyone, I hope you can help. I run a property portal in Spain and am looking for an answer to an issue we are having. We are in the process of uploading an XML feed to our site which contains 10,000+ properties relating to our niche. Although this is great for our customers, I am aware this content is going to be duplicated from other sites, as our clients advertise across a range of portals. My question is: are there any measures I can take to safeguard our site from penalisation by Google? Manually writing 10,000+ descriptions for properties is sadly out of the question. I really hope somebody can help. Thanks, Steve
White Hat / Black Hat SEO | buysellrentspain0
-
Duplicate content showing on local pages
I have several pages on my site for web design which are showing duplicate content. As it's a very competitive market, I created some local pages so I rank highly if someone is searching locally, e.g. web design Birmingham, web design Tamworth, etc.:
http://www.cocoonfxmedia.co.uk/web-design.html
http://www.cocoonfxmedia.co.uk/web-design-tamworth.html
http://www.cocoonfxmedia.co.uk/web-design-lichfield.html
I am trying to work out the best way to reduce the duplicate content. What would be the best way to remove it?
1. A 301 redirect (will I lose the existing page?) to my main web design page, with the geographic areas mentioned there.
2. Rewrite the wording on each page to make it unique?
Any assistance is much appreciated.
White Hat / Black Hat SEO | Cocoonfxmedia0
-
Will cleaning up old PR articles help SERPs?
For a few years we published articles with anchor-text backlinks to about 10 different article submission sites. Each article was modified to create multiple similar articles. We have about 50 completely unique articles. This worked really well for our SERPs until the Google Panda and Penguin updates. I am looking for advice on whether I should do a major clean-up of the published articles and, if so, whether I should be deleting them or removing or renaming the anchor-text backlinks. Any advice on what strategy would work best would be appreciated, as I don't want to start deleting backlinks and make things worse. We used to enjoy position 1 but are now at 12-15, so we have lost most of our traffic.
White Hat / Black Hat SEO | devoted2vintage0
-
Competitors and Duplicate Content
I'm curious to get people's opinion on this. One of our clients (Company A) has a competitor that's using duplicate sites to rank. They're using "www.companyA.com" and "www.CompanyAIndustryTown.com" (actually, several of the variations). It's basically duplicate content, with maybe a town name inserted or changed somewhere on the page. I was always told that this is not a wise idea. They started doing this in the past month or so, when they had a site redesign. So far, it's working pretty well for them. So, here are my questions:
-Would you address this directly (report to Google, etc.)?
-Would you ignore this?
-Do you think it's going to backfire soon?
There's another company (Company B) that's using a different practice: using separate pages on their domain to address different towns, and using those as landing pages. Similar, in that a lot of the content is the same, just some town names and minor details changed. All on the same domain, though. Would the same apply to that? Thanks for your insight!
White Hat / Black Hat SEO | DeliaAssociates0
-
Links In Blog Posts: 1 Paragraph VS. Full Article
Hey guys, I've been using an article network to post unique articles (not spun). I've been posting one-paragraph articles with one text link. Just wondering what the main difference would be if I were to post a full article with 2 or 3 text links vs. 1 paragraph with 1 text link, besides the fact that you get more links and save time by writing only 1 paragraph. Would the full article with 3 backlinks improve keyword rankings much more, or not by much? Cheers!
White Hat / Black Hat SEO | upick-1623910