An article we wrote was published in the Daily Business Review, and we'd like to post it on our site. What is the proper way?
-
Part 1
We wrote an article and submitted it to the Daily Business Review. They published the article on their website. We want to also post the article on our website for our users, but we want to make sure we are doing this properly. We don't want to be penalized for duplicating content. Is the approach described below the correct way to handle this scenario?
- We added a rel="canonical" to the blog post (on our website). The rel="canonical" is set to the Daily Business Review URL where the article was originally published.
- At the end of the blog post we wrote, "This article was originally posted on The Daily Business Review." and linked to the original post on the Daily Business Review.
Should we be setting the blog post (on our website) to "noindex" or rel="canonical"?
Part 2
Our company was mentioned in a number of articles. We DID NOT write those articles; we were only mentioned. We have posted those same articles on our website (verbatim from the originals) because we want to show our users that we have been mentioned in highly credible publications. All of these articles are set to "noindex" on our site. Is that the correct thing to do, or should we be using a rel="canonical" pointing to the original article URL instead?
Thanks in advance, Moz community, for your assistance! We tried to do the legwork of researching this ourselves but couldn't find our exact scenario covered anywhere.
-
Whether or not you're allowed to copy and paste the article verbatim is something you'll have to determine with the site you copied from; even noindex wouldn't address the problem of plagiarism, if that's what you're worried about, because the article would still be on your site. Basically, what you're doing is the reverse of the example in Google's guide on canonicalization:
_Content you provide on that blog for syndication to other sites is replicated in part or in full on those domains._
http://news.example.com/green-dresses-for-every-day-155672.html (syndicated post)
http://blog.example.com/dresses/green-dresses-are-awesome/3245/ (original post)
So in this case the news site (The Daily Business Review) is the source of the article, and you're one of the sites syndicating what they wrote, so you point back to them as canonical. Still, the questions you raise are part of the reason several sites (HuffPo, The Verge, Slashdot, etc.) write their own take on a source article and link back to it, instead of reprinting it verbatim. That's closer to the annotation model I mention in my other reply.
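To make the direction of that relationship concrete, here is a minimal sketch (the `canonical_link` helper is hypothetical, not something from the thread) of the one element a syndicating page adds to its `<head>` to point back at the original article. The URLs are Google's placeholder examples from the guide:

```python
# Hypothetical helper: build the cross-domain canonical element that the
# syndicated copy of an article carries, pointing at the original post.

def canonical_link(original_url: str) -> str:
    """Return the <link rel="canonical"> element for a republished article."""
    return f'<link rel="canonical" href="{original_url}" />'

# The syndicated copy on news.example.com declares the blog post as canonical:
print(canonical_link("http://blog.example.com/dresses/green-dresses-are-awesome/3245/"))
```

In the asker's scenario the roles are reversed: their blog is the syndicating page, so its canonical points at the Daily Business Review URL.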
-
Setting SEO aside for the moment, in both situations, make sure you have permission to reprint the articles on your site.
-
Hi Ryan,
Thank you very much for taking the time to respond!
I just want to make sure I understand you correctly. Are you suggesting that the articles that WERE NOT written by us, and that only mention our firm, should get a rel="canonical" instead of a "noindex" since we reposted them on our own site? Is setting the copied article to "noindex" technically incorrect? We thought that since we copied the articles verbatim and they weren't our original work, Google shouldn't index those pages on our website.
-
Hi Pete. Using rel="canonical" would be the better implementation: it's perfectly acceptable for your site to show up in a search for these articles, since they're about your company. There are also several design approaches you can use to link back to the original published article...
- Annotation. Instead of republishing the entire article, quote bits from it and highlight what service/product/thing your company does in relation to the quote. It could be an expansion ("We also make this in custom colors..."), a clarification ("This is now a permanent service..."), or any other applicable detail.
- Screen cap. Some sites churn through articles, so an archived screen grab of the article is a nice way to preserve the press you got. Photos are especially handy for when you appear in print.
- A brand scroll. Lots of sites add the logos of well-known brands that have written about them under a heading like "What people are saying," showing the logos of various publications (The Verge, Wired, TechCrunch, etc.) and linking to each article via its logo.
So I'd get rid of the noindex tag. Finding your site in the results next to the Daily Business Review's would make my experience as a searcher better, since the results surface the connection even before I click through to read the sources.
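If it helps to double-check the final setup, here is a minimal audit sketch (the `HeadAudit` class and `audit` helper are hypothetical, using only Python's standard library). It reports a page's canonical target and whether a robots noindex is present; per the advice above, a republished article should end up with a canonical set and no noindex:

```python
# Hypothetical audit helper: parse a page's HTML and report
# (canonical URL or None, whether a robots noindex meta is present).
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Only an exact rel="canonical" is matched here, for brevity.
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def audit(html_text):
    parser = HeadAudit()
    parser.feed(html_text)
    return parser.canonical, parser.noindex

page = """<head>
<link rel="canonical" href="https://www.dailybusinessreview.example/our-article" />
</head>"""
canonical, noindex = audit(page)
print(canonical, noindex)  # recommended state: canonical set, noindex absent
```

Running `audit` over the reposted articles would flag any that still carry a noindex or lack the canonical pointing at the original publisher.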