Syndicated Content Appearing Above Original
-
Hi.
I run a travel blog, and my content is often re-posted by related sites with a backlink and full credit, yet the copies still rank above my original article in Google.
Any ideas what I can do to stop this from happening?
Thanks
-
I think it's partly because my content is not 'blog-like' - it's more of a travel guide, an events guide, and some products - so it might not look right to people, but the drop was dramatic.
By the way, I was just looking at your site. Would you be open to a guest article? I run an Israel travel site and could write about the 'real Israel'! Of course, I can link back in exchange, or whatever you like...
-
Wow, that is amazing.
I've been encouraging our authors to get onto G+ to add authorship.
(I have 60 so far)
The benefit is the author photo showing next to the results, but zero traffic improvement.
I wouldn't blame that on authorship either. There must be something wrong with my site that I haven't fixed yet, but I don't know what it could be - except that I just finished, about two weeks ago, finding and removing the last of the duplicate descriptions and titles.
-
With some people, maybe I need to look at that. The problem is that many of my articles are only valid in the short term because they're about events (these are the ones people love to use), so it would take a while for a DMCA complaint to take effect, right?
-
I reversed the authorship change and instantly went back to the same level as before, though.
-
I would not blame that on authorship. I would blame that on an increased level of piracy. Eventually they can strangle your appearance in the SERPs.
-
I understand. Those weasels do it with my content too.
When they copy my content, I often file DMCA complaints.
-
So annoying! I find that sometimes Google corrects itself after a week or two, sometimes not. I was recommended to install authorship, which I did, and then found that this problem kind of solved itself - but overall search traffic fell by more than 25%...!
-
Welcome to my world, Ben.
There are thousands of sites that post our headline and a snippet of text - some with a link to us, some not.
It is very frustrating. Google buries our page and promotes those guys. Sometimes there are several of them, all showing in the results pages, and our original page is nowhere to be found, buried in the duplicate-content filter - if you go to the end of the results and redisplay with the omitted pages included, there we are on page 1.
I've been trying to overcome this for 18 months now, but I'm not getting anywhere.
-
The problem is that if they don't syndicate, they'll modify and copy it anyway.
-
It might. It might not.
Content syndication has both Panda and Penguin risks.
And, you have the competitor problem.
-
Hmmm. If it isn't fully identical, then Google might display both though, right?
-
They will still have a relevant title tag, and relevant content.
(add this to my original reply)....
The best way to get filtered out of the search results is to have an article on another site linking to an identical article on your site.
-
The interesting thing is that sometimes it's tiny sites with much less authority that are ranking better.
Maybe if I got them to syndicate only half the article with a 'read more, click here' link back to the original, that would help?
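For illustration, a minimal sketch of what a partner's excerpt page could look like under that approach - the URL, headline, and wording are placeholders, not taken from any actual partner site:
<article>
  <h2>Top Events in Tel Aviv This Spring</h2>
  <p>First half of the article text goes here...</p>
  <!-- prominent link back to the full original, so the excerpt is clearly partial -->
  <p><a href="https://example-travel-blog.com/tel-aviv-spring-events">Read the full guide on the original site</a></p>
</article>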
-
This happens because the other sites that post your content have more authority and that gives them a higher ranking. What can you do to prevent this?
-- only syndicate to sites that have less authority than you
-- create different content for syndication than what appears on your website
-- stop syndicating
This is just one reason why I do not syndicate anything. It creates new competitors and feeds existing competitors.
-
Related Questions
-
How to allow bots to crawl all but WP-content
Hello, I would like my website to remain crawlable to bots, but to block my wp content and media. Does the following robots.txt work? I worry that the * user agent may conflict with the others.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
User-agent: GoogleBot
Allow: /
User-agent: GoogleBot-Mobile
Allow: /
User-agent: GoogleBot-Image
Allow: /
User-agent: Bingbot
Allow: /
User-agent: Slurp
Allow: /
Technical SEO | Tom3_15
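As a side note - for Google and Bing at least, a crawler follows only the most specific user-agent group that matches it, so with a separate "User-agent: GoogleBot / Allow: /" group, Googlebot would ignore the Disallow rules listed under *. A minimal sketch, assuming the intent is for every crawler to respect the same blocks:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
-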
ViewState and Duplicate Content
Our site keeps getting duplicated content flagged as an issue... however, the pages being grouped together have very little in common on-page. One area which does seem to recur across them is the ViewState. There's a minimum of 150 lines across the ones we've investigated. Could this be causing the reports?
Technical SEO | RobLev
-
Devaluing certain content to push better content forward
Hi all, I'm new to Moz, but hoping to learn a lot from it in hopes of growing my business. I have a pretty specific question and hope to get some feedback on how to proceed with some changes to my website.
First off, I'm a landscape and travel photographer. My website is at http://www.mickeyshannon.com - you can see that the navigation quickly spreads out to different photo galleries based on location. So if a user was looking for photos from California, they would find galleries for Lake Tahoe, Big Sur, the Redwoods and San Francisco. At this point, there are probably 600-800 photos on my website. At least half of these are either older or just not quite up to par with the quality I'm starting to feel like I should produce.
I've been contemplating dumbing down the galleries and not having them break down so far. So instead of four sub-galleries of California, there would just be one California gallery. In some cases, where there are lots of good images in a location, I would probably keep the sub-galleries, but only if there were dozens of images to work with. In the description of each photo, the exact location is already mentioned, so I'm not sure there's a huge need for these sub-galleries except where there are still tons of good photos to work with.
I've also been contemplating building a sort of search archive, where the best of my photos would live in the main galleries, and if a user didn't find what they were looking for, they could go and search the archives for older photos. That way they're still around for licensing purposes, etc., while the best of the best are pushed to the front for those buying fine art prints. The pages for these search archives would probably need to be de-valued somehow, so that the main galleries would be more important SEO-wise. So for the California galleries, four sub-galleries of perhaps 10 images each would become one main California gallery with perhaps 15 images. The other 25 images would be thrown in the search archive and could be searched by keyword.
The question I have - does this sound like a good plan, or will I really be killing my site when it comes to SEO by making such a large change? My end goal would be to push my better content to the front, while scaling back a lot of the excess. Hopefully I explained this question well. If not, I can try to elaborate further! Thanks, Mickey
Technical SEO | msphotography
-
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that has many different variations based on a vehicle's make, model, and year. Currently, we sell this product on one page, www.cargoliner.com/products.php?did=10001, and we show a modal to sort through each make, model, and year. This is important because, based on the make, model, and year, we have different prices/configurations for each. For example, for the Jeep Wrangler and Jeep Cherokee, we might have different products:
Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350
Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350
Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400
Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year - we have the same content and images for each. When the customer selects their make, model, and year, we just search and replace the text to match. For example, when a customer selects 2015 Jeep Wrangler from the modal, the page keeps the same URL (www.cargoliner.com/products.php?did=10001) but the product title will say "2015 Jeep Wrangler".
Here's my problem: we want all of these individual products to have their own unique URLs (cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers, and ideally we start creating unique content for them. Our only problem is that there will be hundreds of them, and they don't have unique content other than the swapped-in product title and the change of variants. Also, we don't want our URL www.cargoliner.com/products.php?did=10001 to lose its link juice.
Here are my questions: my assumption is that I should keep my URL www.cargoliner.com/products.php?did=10001 and be able to sort through the products on that page, then go ahead and make individual URLs for each of these products (i.e. cargoliner.com/products/2015-jeep-wrangler) but add a "nofollow noindex" to each page. Is this what I should do? How secure is a "nofollow noindex" on a webpage? Does Google still index? Am I at risk for duplicate content penalties? Thanks!
Technical SEO | kirbyfike
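For reference, the "noindex, nofollow" being discussed is usually implemented as a meta robots tag in the head of each variant page - a sketch, with the title as a placeholder taken from the examples above:
<head>
  <title>Ultimate Pet Liner - 2015 Jeep Wrangler</title>
  <!-- asks search engines not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
-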
Is the content on my website garbage?
I received an email from Google Webmasters saying that my website has low-quality content. Website: nowwhatmoments.com
Technical SEO | Green.landon
-
Premium Content
Hey guys, I'm working on a site that publishes hundreds of new pieces of content a day, and part of the content is only available to users for 30 days. After 30 days, the content is only accessible to premium users.
After 30 days, the page removes the content and replaces it with a log in / sign up option. The same URL and article title are kept for each page.
I have a couple of concerns about this method. Is it healthy for the site to be removing tons of content from live pages and replacing it with a log in option? Should I worry about Panda for creating tons of pages with unique URLs but very similar source/content - the log in module and the text explaining that the article is only available to premium users? The site is pretty big, so Google has some tolerance for things we can get away with. Should I add a noindex attribute to those pages after 30 days, even though it can take months until Google actually removes them from the index? Is there a proper way of implementing this type of feature on sites that require a log in after a period of time (first click free is not an option)? Thanks guys, and I appreciate any help!
Technical SEO | Mr.bfz
-
Mirrored content/ images
We are currently in the process of creating a new website in place of our old site (same URL etc.). We've recently created another website which has the same design, layout, pictures and general site architecture as our new site will have. If I were to add alt text to images on only one site, would we still be penalised by Google because the sites 'look' the same, even though they will have completely different URLs and different focuses on a similar topic? Content will be different also, but both sites will focus on a similar subject. Thanks
Technical SEO | onlinechester
-
Caps in URL creating duplicate content
I'm getting a bunch of duplicate content errors where the crawl is saying www.url.com/abc has a duplicate at www.url.com/ABC. The site is in Magento and the URL settings are lowercase, and I can't figure out why it thinks there is duplicate content. These are pages with a decent number of inbound links.
Technical SEO | JohnBerger
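For context, one common way to consolidate the two casings is a canonical tag on both versions pointing at the lowercase URL - a sketch using the placeholder URL from the question:
<!-- served on both www.url.com/abc and www.url.com/ABC -->
<link rel="canonical" href="http://www.url.com/abc">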