Our Site's Content on a Third Party Site--Best Practices?
-
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content.
I know the standard best practice is to canonicalize their pages to ours, but then they wouldn't get any benefit--a cross-domain canonical effectively de-indexes the content from their site.
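For reference, this is all the cross-domain canonical would amount to: a single line in the <head> of each of the client's copies. A minimal sketch, with a hypothetical URL:

```html
<!-- In the <head> of the client's copy of an article; the href points
     at our original version. The URL is a hypothetical example. -->
<link rel="canonical" href="https://www.example-oursite.com/articles/some-article/" />
```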
Our thoughts so far:
- have the client add a paragraph of original content to each article on their site
- include a link to our site as the original source on each article (to help mitigate the risk of our site getting hit by any penalties)
What are your thoughts on this? Do you think adding a paragraph of original content will matter much? And do you think our site will be safe from penalties, given that we published the content first and there will be a link back to our site?
They are really pushing back against using a canonical, so that isn't an option. What would you do?
-
Google doesn't say 'don't syndicate content'; it says to syndicate carefully and include a link back to the original source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66359
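Google's syndication guidance has also suggested that a publisher who wants to guarantee only the original version ranks can ask partners to noindex their copies. A minimal sketch of that tag, shown for completeness--it would defeat the client's goal here, since it keeps their copy out of the index entirely:

```html
<!-- Meta robots tag a syndication partner could place in the <head>
     of their copy; it removes their version from Google's index,
     which is exactly what this client wants to avoid. -->
<meta name="robots" content="noindex" />
```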
-
I think our site would be fine given that:
a) we published the content first (it's already been indexed in Google)
b) this is content syndication -- not scraping. We are permitting our client to use our content.
c) there will be a link back to us, in the form of a byline, to identify us as the original source of each article (a sketch of what that could look like is below).
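A minimal sketch of such a byline--the wording, class name, and URL are all hypothetical:

```html
<!-- Byline placed on each syndicated article on the client's site,
     linking back to our original. Names and URL are hypothetical. -->
<p class="byline">
  This article originally appeared on
  <a href="https://www.example-oursite.com/articles/some-article/">Example OurSite</a>.
</p>
```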
This is a major client for us, and they really don't want to use the canonical tag, so I'm looking for advice / best practices / ideas.
-
Michelle - HOLD ON there!
URL suicide right there!
No way at all do you want to post duplicate content - even spun content.
Authentic, authentic, authentic!
Plus, in a post-Penguin/Panda world you are really walking on thin ice.
Grey hat + Black hat = no hat of mine.
Trust me - getting authentic content from a client will be like getting a hamburger from a vegan roadside vendor - but YOU GOT TO!
Your pal,
Chenzo
Related Questions
-
A client rebranded a few years ago and doesn't want to be associated with its old brand name. He wishes not to appear when the old brand is searched in Google--is there something we can do?
The problem is that there was a redirection between the old branded site and the new one, and now when you type in the name of the old brand, the new one comes up. I have desperately tried to convince this client there is nothing we can do about it--dozens of news articles crop up with the two brands together, as this was a hot topic a few years ago--but just in case I missed something, I thought I'd ask the community of experts here on Moz. An example of this would be Tyco Healthcare, which became Covidien in 2007. When you type 'tyco healthcare', Covidien crops up here and there. Any ideas? Thanks!
Intermediate & Advanced SEO | Netsociety
-
Faceted Navigation URLs Best Practices
Hi, we are developing new product pages with faceted filters. You can see them here: https://www.viatrading.com/wholesale-products/ We have a feature allowing users to Order By and Group By, which alters the order of all products. There will also be an option to view products as a table, which will contain the same products but with a different design and perhaps slightly different content for each product. All of this will happen without changing the URL, https://www.viatrading.com/all/. Is this the best practice? Thanks
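One common safeguard, if those Order By / Group By / table views ever do start producing parameterized URLs: a canonical on every variant pointing back at the base listing. A sketch, assuming a hypothetical variant such as /wholesale-products/?orderby=price:

```html
<!-- In the <head> of every sorted/grouped/table-view URL variant,
     pointing back at the base category page. The ?orderby parameter
     is a hypothetical example. -->
<link rel="canonical" href="https://www.viatrading.com/wholesale-products/" />
```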
Intermediate & Advanced SEO | viatrading1
-
2 eCommerce stores that are identical, 1 for the US and 1 for CA--what's the best way to SEO them?
Hello everyone! I have an SEO question that I cannot solve given the parameters of the project, and I was wondering if someone could provide me with the next best alternative to my situation. Thank you in advance. The problem: two eCommerce stores are completely identical (structure, products, descriptions, content), but they are on separate domains for currency and targeting purposes. www.website-can.com is for Canada and www.website-usa.com is for the US. Due to exchange-rate issues, we are unable to combine the two domains into one store and optimize it. What's been done? I have optimized the Canadian store with unique meta titles and descriptions for every page and every product, but I have left the US store untouched. I would like to gain more visibility for the US store, but it is very difficult to create unique content considering the products are identical. I have evaluated using canonicals, but that would ask Google to only look at either the Canadian or the US store (correct me if I'm wrong). I am looking for the next best solution given the challenges, and I was wondering if someone could provide me with some ideas.
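One standard annotation for exactly this setup is hreflang, which tells Google the two stores are regional alternates of each other rather than competing duplicates, so it can serve the US store to US searchers and the Canadian store to Canadian searchers. A minimal sketch, assuming matching product paths on both domains (the /product-x/ path is a hypothetical example):

```html
<!-- In the <head> of each product page, on BOTH domains; every page
     lists itself and its twin on the other domain. The /product-x/
     path is a hypothetical example. -->
<link rel="alternate" hreflang="en-us" href="https://www.website-usa.com/product-x/" />
<link rel="alternate" hreflang="en-ca" href="https://www.website-can.com/product-x/" />
```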
Intermediate & Advanced SEO | Snaptech_Marketing
-
Best support site software to use
Hi guys, we currently use Desk to run our company support site. It seems OK (I don't administer it), but it is very template-driven and lacks useful tools such as the ability to add metadata to each page (hence, in our Moz crawl tests we get a large number of missing-metadata errors, which seems like a lost opportunity to optimise the site). Our support team is looking to implement MadCap Flare as an information management tool; however, that tool outputs HTML in iframes, which obviously makes it hard for Google to crawl the content. We recently implemented HubSpot as our content marketing platform, which is great, and we'd love to have the support site hosted on it (great for tracking traffic, etc.), but as far as I'm aware MadCap Flare doesn't integrate directly with HubSpot... so I'm looking for suggestions on what others are successfully using to host/manage their SEO-optimised support sites. Cheers, Matt
Intermediate & Advanced SEO | SnapComms
-
Best practice for H1 on site without H1 - Alternative methods?
I have recently set up a men's style blog. The site is made up of articles pulled in from a CMS, and I want to keep the design as clean as possible--so no text other than the articles. This makes it hard to get an H1 tag onto the page. Are there any solutions/alternatives that would be good for SEO? The site is http://www.iamtheconnoisseur.com/ Thanks
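One low-risk approach is to keep a real <h1> on the page but style it so it sits quietly inside the clean design, rather than hiding it outright (hidden text can look manipulative to search engines). A sketch, with a hypothetical class name and title text:

```html
<!-- The article title doubles as the page's H1; the CSS strips the
     default heading size and weight so it blends into the layout.
     Class name and title text are hypothetical examples. -->
<style>
  h1.article-title { font-size: 1em; font-weight: normal; margin: 0; }
</style>
<h1 class="article-title">A Connoisseur's Guide to Autumn Tailoring</h1>
```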
Intermediate & Advanced SEO | SWD.Advertising
-
How can I get a list of every URL of a site in Google's index?
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 indexed, and a search on Google shows 33,000. I'd like to find out what accounts for the difference. Is there a way to get an Excel sheet with every URL Google has indexed for a site? Thanks... Mike
Intermediate & Advanced SEO | 94501
-
Is Google's reinclusion request process flawed?
We have been having a bit of a nightmare with a Google penalty (please see http://www.browsermedia.co.uk/2012/04/25/negative-seo-or-google-just-getting-it-painfully-wrong/ or http://econsultancy.com/uk/blog/10093-why-google-needs-to-be-less-kafkaesque for background information - any thoughts on why we have been penalised would be very, very welcome!) which has highlighted a slightly alarming aspect of Google's reinclusion process. As far as I can see (using Google Analytics), supporting material prepared as part of a reinclusion request is basically ignored. I have just written an open letter to the search quality team at http://www.browsermedia.co.uk/2012/06/19/dear-matt-cutts/ which gives more detail but the short story is that the supporting evidence that we prepared as part of a request was NOT viewed by anyone at Google. Has anyone monitored this before and experienced the same thing? Does anyone have any suggestions regarding how to navigate the treacherous waters of resolving a penalty? This no doubt sounds like a sob story for us, but I do think that this is a potentially big issue and one that I would love to explore more. If anyone could contribute from the search quality team, we would love to hear your thoughts! Cheers, Joe
Intermediate & Advanced SEO | BrowserMediaLtd
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a number of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back into the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
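On concern (a), one commonly suggested alternative: a robots.txt Disallow blocks crawling outright, so the blocked pages can't pass any link equity at all, whereas a meta robots noindex,follow on the paginated results lets Google crawl them and follow their links while still keeping them out of the index. A minimal sketch:

```html
<!-- In the <head> of paginated search-result pages (page 2 and up):
     keeps them out of the index but lets crawlers follow their links,
     unlike a robots.txt Disallow, which prevents crawling entirely. -->
<meta name="robots" content="noindex, follow" />
```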