RSS Feed - Dupe Content?
-
OK so yesterday a website agreed to publish my RSS feed and I just wanted to check something.
The site in question is far more established than mine, and I'm worried that, with my content appearing on their website at pretty much the same time as on mine, Google will index theirs first and therefore consider mine the duplicate.
They are linking back to each of my articles with the text "original post" and I'm not sure whether this will help.
Thanks in advance for any responses!
-
I'd personally be worried about this. Here's what I'd do:
- Publish a few articles just on your site
- Note how long it takes Google to find them
- If this takes a long time (i.e. more than 24 hours), try a few social bookmarks: drop the link on StumbleUpon, Twitter, Facebook, etc. when you publish. If you're using WordPress, you can set it to do this automatically.
- Then check how long Google takes to index your pages with those links in place
- Set up your RSS feed so it updates 24 or 48 hours after your site content is published, whatever delay it takes for the post to be indexed on your site before it appears on theirs.
- You can even control the delay for this individual feed: http://wordpress.stackexchange.com/questions/1397/delaying-one-rss-feed-in-wordpress-but-not-the-others
- This way, you don't have to get them to change anything at their end
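The delay step above boils down to a simple filter: the syndicated feed only exposes posts once they're older than the chosen threshold, so Google can index the originals first. A minimal sketch of that logic (post titles, dates, and the 48-hour delay are illustrative; in WordPress you'd filter the feed query itself, as in the linked Stack Exchange question):

```python
from datetime import datetime, timedelta

# Hypothetical delay threshold: posts younger than this stay out of the feed
FEED_DELAY = timedelta(hours=48)

def posts_for_feed(posts, now=None):
    """Return only posts published at least FEED_DELAY before `now`."""
    now = now or datetime.now()
    return [p for p in posts if now - p["published"] >= FEED_DELAY]

posts = [
    {"title": "Fresh post", "published": datetime(2024, 1, 10, 9, 0)},
    {"title": "Older post", "published": datetime(2024, 1, 7, 9, 0)},
]

# At this moment, only "Older post" clears the 48-hour delay
delayed = posts_for_feed(posts, now=datetime(2024, 1, 10, 12, 0))
```

The point is that the partner site, pulling your feed, simply never sees a post until your own page has had time to be crawled.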
-
Personally, I don't think so. But to be on the safe side, you can always ask them to add a canonical tag on the syndicated articles, and the issue will be resolved for good. For what it's worth, I've seen many websites that copy others' content, and this problem usually doesn't arise. In my opinion, Google is good at identifying the original source.
-
Yes, you are courting risk here. There is no need to feature the same content on an external website. If you can manage to convince the website owners to place a cross-domain rel="canonical" tag on every article that gets published there (pointing to the original article), you would have nothing to fear.
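For reference, the cross-domain canonical on the syndicated copy would look like this (the URL is a placeholder for the original article on your own domain):

```html
<!-- In the <head> of the syndicated copy on the partner site -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

Google treats a cross-domain canonical as a strong hint (not a directive) about which URL should be indexed, so it pairs well with the "original post" link the partner is already adding.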