Syndicated Content Appearing Above Original
-
Hi.
I run a travel blog and my content is often re-posted by related sites with a backlink to my content (and full credit etc) but still ranks above my article in Google.
Any ideas what I can do to stop this happening?
Thanks
-
I think it's partly because my content is not 'blog-like'. It's more like a travel guide, an events guide, and some products, so it might not look right to people. Either way, the drop was dramatic.
By the way, I was just looking at your site. Would you be open to a guest article? I run an Israel travel site and could write about the 'real Israel'! Of course, I can link in exchange etc/whatever you like...
-
wow, that is amazing.
I've been encouraging our authors to get onto G+ to add authorship.
(I have 60 so far)
The benefit is the photos with results, but zero traffic improvement.
I wouldn't blame that on authorship either. There must be something wrong with my site that I haven't fixed yet, but I don't know what it could be. The only recent change: about 2 weeks ago I finished finding and removing the last of the duplicate descriptions and titles.
-
With some people, maybe I need to look at that. The problem is that many of my articles are only valid in the short term because they're about events (these are the ones people love to use), so a DMCA would take a while to take effect, right?
-
I reversed the authorship and instantly went back to the same level as before though.
-
I would not blame that on authorship. I would blame that on an increased level of piracy. Eventually they can strangle your appearance in the SERPs.
-
I understand. Those weasels do it with my content too.
When they copy I often use DMCA complaints.
-
So annoying! I find that sometimes Google corrects itself after a week or two, sometimes not. Someone recommended I set up authorship, which I did, and this problem kind of solved itself, but overall search traffic fell 25%+
...!
-
Welcome to my world, Ben.
There are thousands of sites that post our headline and a snippet of text - some with a link to us, some not.
It is very frustrating. Google buries our page and promotes those guys. Sometimes there are several of them, all showing in the results pages, and our original page is nowhere to be found, buried in the duplicate-content filter. If you go to the end of the results and redisplay with the omitted pages included, there we are on page 1.
I've been trying to overcome this for 18 months now, but I'm not getting anywhere.
-
The problem is that if they don't syndicate, they'll modify and copy instead.
-
It might. It might not.
Content syndication has both Panda and Penguin risks.
And, you have the competitor problem.
-
Hmmm. If it isn't fully identical, then Google might display both, right?
-
They will still have a relevant title tag, and relevant content.
(add this to my original reply)....
The best way to get filtered from the search results is to have an article on another site linking to an identical article on your site.
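One mitigation worth mentioning (not something anyone in this thread has tried, just the commonly documented option): ask the republisher to put a cross-domain rel=canonical in the syndicated copy, pointing back at your original. The URL below is a placeholder:

```html
<!-- In the <head> of the syndicated copy on the partner's site.
     The href is a placeholder for your original article's URL. -->
<link rel="canonical" href="https://example.com/original-article" />
```

This tells Google which URL should be treated as the original, though honoring it is at Google's discretion.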
-
The interesting thing is that sometimes it's tiny sites with much less authority that are ranking better.
Maybe if I got them to syndicate half the article with a "read more, click here" link that would help?
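A minimal sketch of that excerpt idea: generate a truncated version for partners with a link back to the original. The function name, word limit, and link text are illustrative assumptions, not anything from this thread.

```python
# Sketch: build a truncated excerpt for syndication partners, with a
# "read more" link back to the original URL. make_excerpt and its
# parameters are hypothetical names for illustration.

def make_excerpt(article_text, canonical_url, word_limit=100):
    """Return roughly the first `word_limit` words plus a read-more link."""
    words = article_text.split()
    if len(words) <= word_limit:
        excerpt = article_text
    else:
        excerpt = " ".join(words[:word_limit]) + "..."
    return f'{excerpt} <a href="{canonical_url}">Read the full article</a>'
```

The point is that the partner's page is no longer identical to yours and funnels readers (and a link) back to the original.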
-
This happens because the other sites that post your content have more authority and that gives them a higher ranking. What can you do to prevent this?
-- only syndicate to sites that have less authority than you
-- create different content for syndication than what appears on your website
-- stop syndicating
This is just one reason why I do not syndicate anything. It creates new competitors and feeds existing competitors.
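For anyone wondering how "identical" gets judged, duplicate-content checks are often described in terms of shingle overlap (Jaccard similarity over word n-grams). A rough sketch, with illustrative function names, of how you could score a copy against your original:

```python
# Sketch of shingle-based similarity: Jaccard overlap of word 3-grams.
# shingles() and similarity() are illustrative names; this is a toy
# version of the idea, not Google's actual algorithm.

def shingles(text, k=3):
    """Return the set of k-word shingles (word n-grams) in `text`."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets, in [0, 1]."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A verbatim repost scores 1.0; a genuinely rewritten version scores much lower, which is why option two above (different content for syndication) avoids the filter.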