Content Strategy/Duplicate Content Issue, rel=canonical question
-
Hi Mozzers:
We have a client who regularly pays to have high-quality content produced for their company blog. When I say 'high quality' I mean 1,000- to 2,000-word posts written for a technical audience by a lawyer.
We recently found out that, prior to the content going on their blog, they're shipping it off to two syndication sites, both of which slap rel=canonical on them. By the time the content makes it to the blog, it has probably appeared in two other places.
What are some thoughts about how 'awful' a practice this is? Of course, I'm arguing to them that the ranking of the content on their blog is bound to be suffering and that, at least, they should post to their own site first and, if at all, only post to other sites several weeks out.
Does anyone have deeper thinking about this?
-
Basically, your client is diluting their own efforts and resources. The idea of publishing content on other domains can make sense if it's applied in the right way (which isn't the case here).
1- It doesn't make sense to have the same content in two places, so let's see what Google has to say about it in its duplicate content guidelines:
Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
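For reference, here's a minimal sketch of what that "syndicate carefully" advice looks like on the syndicated copy. All URLs here are hypothetical placeholders, not your client's actual domains:

```html
<!-- On the syndicated copy (hypothetical URL: https://syndication-partner.example/republished-post) -->
<head>
  <!-- Ask the partner to keep their copy out of the index -->
  <meta name="robots" content="noindex">
</head>
<body>
  <article>
    <!-- ...republished content... -->
    <p>
      This article originally appeared on
      <!-- Link back to the original article, per Google's guidance -->
      <a href="https://client-blog.example/original-post">the author's blog</a>.
    </p>
  </article>
</body>
```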
Let's take an example: assume your client has an article called "3 Tips About Legal Requirements to Buy a Home You Can't Afford to Miss" ----> The keyword in this case is _legal requirements to buy a home_.
The way to boost your SEO efforts is to build other articles on other pages around this content. A quick bit of keyword research on the topic _legal requirements to buy a home_ turns up related queries such as:
- how much do you need to buy a house (KD = 4, Volume = 600)
- the best state to buy a house (KD = 10, Volume = 350)
- what to do after buying a house (KD = 4, Volume = 350)
So the best way to boost your content strategy and your search rankings is to create a universe around your content, where the center of that universe is your article (hahaha, that's just a Marvel joke). Also, don't forget to include an internal linking strategy and a site structure strategy.
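As a rough illustration of that hub-and-spoke internal linking idea (all URLs below are hypothetical), the pillar article links out to its supporting articles and each supporting article links back up to the pillar:

```html
<!-- On the pillar article (hypothetical URL: /legal-requirements-to-buy-a-home) -->
<nav aria-label="Related guides">
  <ul>
    <li><a href="/how-much-do-you-need-to-buy-a-house">How much do you need to buy a house?</a></li>
    <li><a href="/best-state-to-buy-a-house">The best state to buy a house</a></li>
    <li><a href="/what-to-do-after-buying-a-house">What to do after buying a house</a></li>
  </ul>
</nav>

<!-- On each supporting article, link back up to the pillar -->
<p>
  Start with our full guide to the
  <a href="/legal-requirements-to-buy-a-home">legal requirements to buy a home</a>.
</p>
```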
Hope this info helps you.
Regards
-
Hi David,
In my limited understanding, that approach isn't helpful. Content should be published on the blog first, then anywhere else.
To prevent Google from indexing the other blogs/websites, you need to canonicalise the syndicated copies to the original (your blog). Users on those sites can still find the same information and follow the correct path, but searchers on Google are taken to the original page more directly.
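A minimal sketch of that cross-domain canonical, again assuming hypothetical URLs for the syndicated copy and the original post:

```html
<!-- In the <head> of the syndicated copy (hypothetical URL: https://syndication-partner.example/republished-post) -->
<link rel="canonical" href="https://client-blog.example/original-post">
```

That way, any ranking signals the syndicated copy picks up are consolidated to the client's own URL.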
Hope this helps.
Thanks