How to Best Establish Ownership when Content is Duplicated?
-
A client (Website A) has allowed one of their franchisees to use some of the content from their site on the franchisee's site (Website B). The franchisee lifted the content word for word, so my question is: how do we best establish that Website A is the original author?
Since there is a business relationship between the two sites, I'm thinking of requiring Website B to add a rel=canonical tag to each page that uses the duplicated content, referencing the original URL on Site A.
Will that work, or is there a better solution?
This content is primarily informational product content (not blog posts or articles), so I'm thinking rel=author may not be appropriate.
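For reference, here's a minimal sketch of what that tag would look like in the <head> of a duplicated page on Website B (the URLs are placeholders, not actual client pages):

<!-- On Website B's duplicated product page, pointing to the original on Website A -->
<link rel="canonical" href="https://www.website-a.example/products/widget-x" />

Each page on Site B would point to its specific counterpart on Site A, not to Site A's homepage.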
-
Thanks grasshopper!
-
No; Site B is using Spry, a now-defunct Adobe framework: http://blogs.adobe.com/dreamweaver/2012/08/update-on-adobe-spry-framework-availability.html
Site A uses ExpressionEngine.
-
Hi Allie,
You're absolutely on the right track. Cross-domain canonicalization was created to deal with situations exactly like the one you're describing. I've used it many times, and it works great.
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
-
I'm pretty sure you can trackback to the original. Are you using WordPress?
Related Questions
-
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages that are worth noindexing that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics: if a URL hasn't brought any traffic in the last 12 months, then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them, and I want to make sure we don't lose that link juice. But assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Any recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs. deleting them? Which is the preferred method, and why?
Intermediate & Advanced SEO | atomiconline
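For anyone weighing the noindex route described above, the tag in question is a robots meta tag in each pruned page's <head>. A minimal sketch ('follow' is the default behaviour, but stating it makes the intent to keep passing link equity explicit):

<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />

Deleting a page instead (a 404/410) drops any links pointing at it entirely, which is the trade-off the question raises.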
Duplicate content issue
Hello! We have a lot of duplicate content issues on our website. Most of the pages with these issues are dictionary pages (about 1,200 of them). They're not exact duplicates; each contains a different word with a translation, picture, and audio pronunciation (example: http://anglu24.lt/zodynas/a-suitcase-lagaminas). What's the better way of solving this? We probably shouldn't disallow the dictionary pages in robots.txt, right? Thanks!
Intermediate & Advanced SEO | jpuzakov
Do search engine consider this duplicate or thin content?
I operate an eCommerce site selling various equipment. We get product descriptions and other info from the manufacturers' websites, offered to dealers. Part of that info comes in the form of User Guides and Operational Manuals written by the manufacturer, downloaded in PDF format and then uploaded to our site. We also embed and link to videos hosted on the manufacturers' respective YouTube or Vimeo channels. This is useful content for our customers. My questions are: Does this type of content help our site by offering useful info, or does it hurt our SEO by being thin and/or duplicate content? Or do the original content publishers get all the benefit? Is there any benefit to us publishing this material at all? And what exactly is considered "thin content"?
Intermediate & Advanced SEO | MichaelFactor
Best anchor text strategy for embeddable content
Hi all. We provide online services, and as part of this we give our clients a JavaScript embeddable 'widget' to place on their website. This is fairly popular (100s-1000s of inserts on websites). The main workings are JavaScript (it writes an HTML iframe onto the page), but we also include both a <noscript> portion (which is purely customer-focused; it deep-links into a relevant page on our website for the user to follow) and a plain <p><a href=''></a></p> at the bottom, under the JS. This is all generated and inserted by the website owner. Therefore, after insertion we can dynamically update whatever the JavaScript renders out, but the <noscript> and <a> at the bottom are there forever. Previously, this last plain link has been used for optimisation, with it randomly selecting 1 out of a bank of 3 different link anchor texts when the widget HTML is first generated. We've also recently split our website into B2B and B2C portions, so this will be linking to a newer domain with far less established backlinks than the existing domain. I think we could get away with optimised keyword links on the old domain, but on the newer domain they will be more obvious. In light of recent Google updates, we're afraid this may look spammy. We obviously want to utilise the link as best as possible, as it is used by hundreds of our clients, but we don't want it to cause any issues. So my question: would you just focus on using brand-name anchor text for this? Or could we mix in a few keyword-optimised links too? If so, what sort of ratio would you suggest? Many thanks
Intermediate & Advanced SEO | benseb
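To make the structure described above concrete, here is a rough sketch of that kind of embed snippet (all domains, paths, and anchor text are hypothetical):

<!-- Widget loader: the script writes an iframe onto the page and can be updated server-side at any time -->
<script src="https://widgets.example-service.com/widget.js" async></script>
<!-- noscript fallback: a customer-focused deep link for users without JavaScript -->
<noscript><a href="https://www.example-service.com/tools/quote">Get a quote online</a></noscript>
<!-- The plain link whose anchor text is at issue -->
<p><a href="https://www.example-service.com/">Example Service</a></p>

The script output can be changed after the fact, but the noscript link and the plain anchor are fixed the moment the client pastes the snippet, which is why the anchor-text choice matters.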
No-index pages with duplicate content?
Hello, I have an e-commerce website selling about 20,000 different products. For the most-used of those products, I created unique, high-quality content, written by a professional player, that describes how and why they are useful, which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Therefore, our idea was to noindex the products that only have the same copy-pasted descriptions all other websites have. Do you think it's better to do that, or to just let everything be indexed normally, since we might get search traffic from those pages? Thanks a lot for your help!
Intermediate & Advanced SEO | EndeR-
Duplicate Content Question
Currently, we manage a site that generates content from a database based on user search criteria such as location or type of business. Although we currently rank well (we built the website around providing value to the visitor, with options for viewing the content), we are concerned about duplicate content issues and whether they would apply. For example, the listing pulled up by one user search could have the same content as another search, but in a different order. Similar to hotels that offer room booking by room type or by rate. Would this dynamically generated content count as duplicate content? The site has done well, but we don't want to risk any future Google penalties caused by duplicate content. Thanks for your help!
Intermediate & Advanced SEO | CompucastWeb
Best way to remove duplicate content with categories?
I have duplicate content for all of the products I sell on my website due to categories and subcategories. Ex: http://www.shopgearinc.com/products/product/stockfeeder-af38.php http://www.shopgearinc.com/products/co-matic-power-feeders/stockfeeder-af38.php http://www.shopgearinc.com/products/co-matic-power-feeders/heavy-duty-feeders/stockfeeder-af38.php Above are 3 URLs with the same title and content. I use a third-party developer's backend system, so doing canonicalization seems difficult, as I don't have full access. What is the best way to get rid of this duplicate content? Can I do it through Webmaster Tools, or should I pay the developer to do the canonicalization or a 301 redirect? Any suggestions? Thanks
Intermediate & Advanced SEO | kysizzle6
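If the developer can add it, the fix would be a canonical tag in the <head> of all three URL variants, each pointing at whichever version is chosen as primary (using the shortest URL here is an assumption, not a requirement):

<!-- Added to every category-path variant of the stockfeeder page -->
<link rel="canonical" href="http://www.shopgearinc.com/products/product/stockfeeder-af38.php" />

A 301 redirect consolidates signals similarly, but it would stop the category-path URLs from resolving, so the canonical tag is usually the gentler option when those URLs need to keep working for site navigation.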
Duplicate Page Content / Titles Help
Hi guys, My SEOmoz crawl diagnostics throw up thousands of Duplicate Page Content / Title errors, which mostly come from the forum attached to my website. In particular, it's the forum users' profile pages that are causing the issue; below is a sample of the URLs being penalised: http://www.mywebsite.com/subfolder/myforum/pop_profile.asp?mode=display&id=1308 I thought that adding http://www.mywebsite.com/subfolder/myforum/pop_profile.asp to my robots.txt file under 'Ignore' would cause the bots to overlook the thousands of profile pages, but the latest SEOmoz crawl still picks them up. My question is: how can I get the bots to ignore these profile pages (they don't contain any useful content), and how much will this be affecting my rankings (bearing in mind I have thousands of errors for duplicate content and duplicate page titles)? Thanks guys Gareth
Intermediate & Advanced SEO | gaz3342
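One note on the robots.txt approach described above: robots.txt has no 'Ignore' directive, which is likely why the crawl still picks these pages up; the standard directive is Disallow, and it matches by URL prefix. A sketch for the profile pages in question:

User-agent: *
# Blocks crawling of every URL beginning with this path, including the ?mode=...&id=... variants
Disallow: /subfolder/myforum/pop_profile.asp

Bear in mind that blocking crawling doesn't reliably remove URLs that are already indexed; a meta robots noindex tag on the profile template (which requires the pages to stay crawlable) is often the more dependable way to get them out of the index.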