Syndicated Content Appearing Above Original
-
Hi.
I run a travel blog, and my content is often re-posted by related sites with a backlink and full credit, yet the copies still rank above my original article in Google.
Any ideas what I can do to stop this happening?
Thanks
-
I think it's partly because my content is not 'blog-like'; it's more like a travel guide, an events guide, and some products, so it might not look right to people. But the drop was dramatic.
By the way, I was just looking at your site. Would you be open to a guest article? I run an Israel travel site and could write about the 'real Israel'! Of course, I can link back in exchange, or whatever you like...
-
Wow, that is amazing.
I've been encouraging our authors to get onto G+ to add authorship.
(I have 60 so far)
The benefit is the author photos appearing next to results, but there's been zero traffic improvement.
I wouldn't blame that on authorship either. There must be something wrong with my site that I haven't fixed yet, but I don't know what it could be. About two weeks ago I did finish finding and removing the last of the duplicate descriptions and titles.
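For context, getting authors "onto G+" for authorship meant adding a rel=author link from each article to the writer's Google+ profile (with the profile linking back to the site as a contributor). A minimal sketch of that markup, with a hypothetical profile ID:

```html
<!-- Placed on the article page; the Google+ profile ID below is
     hypothetical and stands in for the real author's profile URL. -->
<a href="https://plus.google.com/112345678901234567890?rel=author">About the author</a>
```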
-
With some of them, maybe I need to look at that. The problem is that many of my articles are only valid in the short term because they're about events (these are the ones people love to copy), so a DMCA complaint would take too long to take effect, right?
-
I removed the authorship markup, though, and traffic instantly went back to the same level as before.
-
I would not blame that on authorship. I would blame that on an increased level of piracy. Eventually they can strangle your appearance in the SERPs.
-
I understand. Those weasels do it with my content too.
When they copy, I often file DMCA complaints.
-
So annoying! I find that sometimes Google corrects itself after a week or two, sometimes not. I was advised to implement authorship, which I did, and found that this problem kind of solved itself, but overall search traffic fell by more than 25%.
...!
-
Welcome to my world, Ben.
There are thousands of sites that post our headline and a snippet of text, some with a link to us, some not.
It is very frustrating. Google buries our page and promotes those guys. Sometimes several of them all show in the results pages while our original page is nowhere to be found, buried by the duplicate-content filter. If you go to the end of the results and redisplay with the omitted pages included, there we are on page 1.
I've been trying to overcome this for 18 months now, but I'm not getting anywhere.
-
The problem is, if they don't syndicate, they'll just modify and copy instead.
-
It might. It might not.
Content syndication has both Panda and Penguin risks.
And, you have the competitor problem.
-
Hmmm. If it isn't fully identical, then Google might display both, though, right?
-
They will still have a relevant title tag and relevant content.
(add this to my original reply)....
The best way to get filtered out of the search results is to have an article on another site linking to an identical article on your site.
-
The interesting thing is that sometimes it's tiny sites with much less authority that are ranking better.
Maybe if I got them to syndicate half the article with a "read more, click here" link, that would help?
-
This happens because the other sites that post your content have more authority, and that gives them a higher ranking. What can you do to prevent this?
-- only syndicate to sites that have less authority than yours
-- create different content for syndication than what appears on your website
-- stop syndicating
This is just one reason why I do not syndicate anything. It creates new competitors and feeds existing competitors.
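One further mitigation, not mentioned explicitly above but commonly paired with this advice: ask syndication partners to add a cross-domain rel=canonical pointing back at your original article, so Google consolidates ranking signals on your URL. A minimal sketch, with a hypothetical URL standing in for the original article:

```html
<!-- Placed in the <head> of the syndicated copy on the partner's site.
     The example.com URL is hypothetical; it should be the exact URL of
     the original article being syndicated. -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

This only works if the partner agrees to add it, which is why it pairs naturally with a written syndication agreement; a plain link back, as the original poster has, carries far less weight than a canonical.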