Our Site's Content on a Third-Party Site: Best Practices?
-
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content.
I know the standard best practice is to canonicalize their pages to ours, but then they wouldn't get any benefit, since a canonical tag will effectively de-index the content from their site.
Our thoughts so far:
- add a paragraph of original content to each syndicated article
- link to our site as the original source (to help mitigate the risk of our site getting hit by any penalties)
What are your thoughts on this? Do you think adding a paragraph of original content will matter much? And do you think our site will escape any penalty, since we were the first to publish the content and there will be a link back to our site?
They are really pushing back against using a canonical, so this isn't an option. What would you do?
-
Google doesn't say "don't syndicate content"; they say "syndicate carefully" and include a link back to the original source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66359
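For reference on the mechanics: a cross-domain canonical is just a `<link rel="canonical">` tag in the syndicated page's head pointing at the original article. A minimal sketch (hypothetical URLs, not your client's actual pages) of detecting one with Python's stdlib `html.parser`:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attr = dict(attrs)
            if attr.get("rel", "").lower() == "canonical":
                self.canonical = attr.get("href")

# Hypothetical syndicated page pointing its canonical back at the original.
page = """<html><head>
<link rel="canonical" href="https://www.oursite.com/articles/example-article/">
</head><body>Syndicated copy...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)
# -> https://www.oursite.com/articles/example-article/
```

If the client won't canonicalize, the same kind of check can at least confirm the byline link back to the original source is actually present on every syndicated page.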
-
I think our site would be fine given that:
a) we published the content first (it's already been indexed in Google);
b) this is content syndication, not scraping; we are permitting our client to use our content;
c) there will be a link back to us, in the form of a byline, identifying us as the original source of each article.
This is a major client for us, and they really don't want to use the canonical tag, so I'm looking for advice, best practices, and ideas.
-
Michelle - HOLD ON there!
URL suicide right there!
No way at all do you want to post duplicate content - even spun content.
Authentic, authentic, authentic!
Plus, in a post-Penguin/Panda world, you are really walking on thin ice.
Grey hat + black hat = no hat of mine.
Trust me - getting authentic content from a client will be like getting a hamburger from a vegan roadside vendor - but YOU GOT TO!
Your pal,
Chenzo
Related Questions
-
Print pages returning 404s
Print pages on one of our sister sites are returning 404s in our crawl but are visible when clicked on. Here is one example: https://www.theelementsofliving.com/recipe/citrus-energy-boosting-smoothie/print Any ideas as to why these are returning errors? Thank you!
-
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e., if you're standing somewhere and looking for the pizza joints nearby, we pick up your current location and share the list of pizza outlets nearby along with ratings, reviews, etc. for these outlets. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for the city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here, "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (if it's merely mapped to a category, we 301 redirect you to the category page): it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor mapped to any category. We're also dealing in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to make for a better user experience and a one-stop shop for our customers. Now we're working on a URL restructure project, and my question to you all SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure on the backend.
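For what it's worth, the slug pattern in those URLs can be generated mechanically, which keeps the structure consistent however you restructure. A minimal sketch (hypothetical helper names, not AskMe's actual code):

```python
import re

def slugify(text):
    """Lower-case and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def category_url(city, category, locality=None):
    """Build /city/category or /city/category/in/locality paths."""
    path = "/" + slugify(city) + "/" + slugify(category)
    if locality:
        path += "/in/" + slugify(locality)
    return path

print(category_url("Delhi", "Pizza Outlets"))           # -> /delhi/pizza-outlets
print(category_url("Delhi", "Pizza Outlets", "Saket"))  # -> /delhi/pizza-outlets/in/saket
```

Generating every URL through one helper like this also makes a later restructure a matter of changing one function rather than hunting down hand-built paths.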
-
SEO effect of content duplication across hub of sites
Hello, I have a question about a website I have been asked to work on. It is for a real estate company which is part of a larger company. Along with several other (rival) companies, it has a website of property listings which receives a feed of properties from a central hub site - so there is lots of potential for page, title, and meta content duplication (if it isn't already occurring) across the whole network of sites. In early investigation I don't see any of these sites ranking very well at all in Google for expected search phrases. Before I start working on things that might improve their rankings, I wanted to ask some questions of you guys: 1. How would such duplication (if it is occurring) affect the SEO rankings of such sites individually, or the whole network/hub collectively? 2. Is it possible to tell if such a site has been "burnt" for SEO purposes, especially from any duplication? 3. If such a site or the network has been totally burnt, are there any approaches or remedies that could improve the site's SEO rankings significantly, or is the only/best option to start again from scratch with a brand new site, ensuring the use of new meta descriptions and unique content? Thanks in advance, Graham
-
301s, Mixed-Case URLs, and a Site Migration Disaster
Hello Moz Community, After placing trust in a developer to build and migrate our site, the site launched 9 weeks ago and has been one disaster after another. Sadly, after 16 months of development, we are building again; this time we have leveled up and are doing it in-house with our own people. I have one topic I need advice on, and that is 301s. Here's the deal: the newbie developer used a mixed-case version of our URL structure, so what should have been /example-url became /Example-Url on all URLs. Awesome, right? It was a duplicate content nightmare upon launch (among other things). We are rebuilding now. My question is this: do we bite the bullet for all URLs and 301 them to a proper lower-case URL structure? We've already lost a lot of link equity from 301ing the site the first time around. We were a PR 4 for the last 5 years on our homepage; now we are a PR 3. That is a substantial loss. For our primary keywords, we were on the first page for the big ones for the last decade. Now we are just barely cleaving to the second page, and many are on the 3rd page. I am afraid that if we 301 all the URLs again, a 15% reduction in link equity per page is really going to hurt us, again. However, keeping the mixed-case URL structure is also a whammy. Building a brand new site, again, it seems like we should do it correctly and right all the previous wrongs. But on the other hand, another PR demotion and we'll be in line at the soup kitchen. What would you do?
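The lower-casing half of the fix is mechanical; what costs you is the second round of 301s, not computing the targets. A minimal sketch (illustrative only - the real redirects would live in your server's rewrite rules) of deriving the one-time redirect map:

```python
def lowercase_redirect(path):
    """Return the lower-case 301 target for a path, or None if the
    path is already all lower-case (no redirect needed)."""
    target = path.lower()
    return target if target != path else None

# Hypothetical paths from the mixed-case structure described above.
print(lowercase_redirect("/Example-Url"))   # -> /example-url
print(lowercase_redirect("/example-url"))   # -> None
```

Emitting a redirect only when the path actually changes keeps the new lower-case URLs from being chained through a pointless self-redirect.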
-
What's the best SEO practice for having dynamic content on the same URL?
Let's use this example: www.miniclip.com has a log-in function, and a cookie checks whether you're logged in. Say you're on www.miniclip.com/racing-games. When a user is not logged in, the banners displayed on the page have more calls to action and offers to entice them to sign up, but the URL is still www.miniclip.com/racing-games whether or not they're logged in. What would be the best URL practice for this? Just do it?
-
Our site has been penalized and it's proving to be very hard to get our rankings back...
So I have a question. We have used nearly every trick in the book to rank our site, including a ton of white hat stuff... but then also a lot of black hat practices that resulted in us dropping in the rankings by about 30-40 positions. And getting back to where we were (top 10 for most keywords) is proving to be nearly impossible. We have a ton of great content coming off the site and we actually offer a quality product. We follow most of the guidelines advocated here on SEOmoz. But the black hat stuff we did has really taken a toll, and it's going to be pretty much impossible to go back in time and erase all of it. So what should we do? Should we design a completely new website with a new domain? What can be done to help?
-
Is This 301 Use Best Practice??
I know it's effective practice because we're getting our arse kicked. I'm curious if it's best practice (white, grey, or black hat). I'm checking a competitor's link profile on a landing page that is hitting the top of page 1 for several keywords. This competitor (a national chain) has a strong domain authority (69). The particular landing page I'm checking in OSE has two 301 redirects from its own site, among some other directory links to the page. The page shows 15 external links, and half of them are very strong, including its own 301s. Aren't they essentially sending their own juice to the landing page to bolster page/domain authority and rank higher in the SERPs for those keywords? Is this a common practice, using 301s to a landing page? Is it white, grey, or black hat? They suddenly appeared on the first page for several category keywords, so we're doing some snooping. Thanks.
-
Best solution to get mass URLs out of the SEs' indexes
Hi, I've got an issue where our web developers have made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e., the URLs generated on a page are relative to the current URL), the problem URLs linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has put tens of thousands of URLs into the SEs' indexes which shouldn't be there. Say, for example, the problem URLs look like www.mysite.com/incorrect-directory/folder1/page1/. It seems I can correct this by doing one of the following:
1/. Use robots.txt to disallow access to /incorrect-directory/*
2/. 301 the URLs like this:
www.mysite.com/incorrect-directory/folder1/page1/
301 to:
www.mysite.com/correct-directory/folder1/page1/
3/. 301 the URLs to the root of the correct directory like this:
www.mysite.com/incorrect-directory/folder1/page1/
www.mysite.com/incorrect-directory/folder1/page2/
www.mysite.com/incorrect-directory/folder2/
301 to:
www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong URLs.
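The two 301 options can be compared as simple path rewrites. A minimal sketch (using the example paths from the question; the actual redirects would live in server config, not application code):

```python
def redirect_preserving(path):
    """Option 2: swap the bad directory, keep the rest of the path."""
    return path.replace("/incorrect-directory/", "/correct-directory/", 1)

def redirect_to_root(path):
    """Option 3: collapse every bad URL to the correct directory root."""
    return "/correct-directory/"

url = "/incorrect-directory/folder1/page1/"
print(redirect_preserving(url))  # -> /correct-directory/folder1/page1/
print(redirect_to_root(url))     # -> /correct-directory/
```

If, as the question says, no external links point at the bad URLs, the choice between the two is mostly about crawl tidiness rather than preserving link equity.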