Our Site's Content on a Third Party Site--Best Practices?
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content.
I know the standard best practice is to canonicalize their pages to ours, but then they wouldn't get any benefit, since a canonical tag will effectively de-index the content from their site.
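For reference, the canonical setup they're rejecting would look something like this in the `<head>` of their copy of each article (URLs here are hypothetical placeholders):

```html
<!-- On the client's copy of the article (hypothetical URLs) -->
<!-- Points search engines at our original, consolidating ranking signals there -->
<link rel="canonical" href="https://www.oursite.com/articles/example-article/" />
```

This is exactly why they're pushing back: the canonical hands the ranking signals to the original, which is the opposite of what they want.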
Our thoughts so far:
- Add a paragraph of original content to each syndicated article.
- Link to our site as the original source (to help mitigate the risk of our site getting hit by any penalties).
What are your thoughts on this? Do you think adding a paragraph of original content will matter much? Do you think our site will be free of penalty since we were the first place to publish the content and there will be a link back to our site?
They are really pushing for not using a canonical, so this isn't an option. What would you do?
-
Google doesn't say "don't syndicate content"; they say "syndicate carefully" and include a link back to the original source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66359
-
I think our site would be fine given that:
a) we published the content first (it's already been indexed in Google)
b) this is content syndication, not scraping; we are permitting our client to use our content
c) there will be a link back to us, in the form of a byline, to identify us as the original source of each article
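A byline of the kind described in (c) could look something like this on the client's copy of each article (URLs and class name are hypothetical):

```html
<!-- Hypothetical byline at the top or bottom of each syndicated article -->
<p class="byline">
  This article originally appeared on
  <a href="https://www.oursite.com/articles/example-article/">OurSite.com</a>.
</p>
```

A visible, crawlable link like this is what Google's syndication guidance asks for; it helps search engines and readers identify the original source even without a canonical tag.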
This is a major client for us, and they really don't want to use the canonical tag, so I'm looking for advice, best practices, and ideas.
-
Michelle - HOLD ON there!
URL suicide right there!
No way at all do you want to post duplicate content - even spun content.
Authentic, authentic, authentic!
Plus, in a post-Penguin/Panda world, you are really walking on thin ice.
Grey hat + Black hat = no hat of mine.
Trust me - getting authentic content from a client will be like getting a hamburger from a vegan roadside vendor - but YOU GOT TO!
Your pal,
Chenzo