XML feeds in regard to duplicate content
-
Hi everyone
I hope you can help. I run a property portal in Spain and am looking for an answer to an issue we are having.
We are in the process of uploading an XML feed to our site which contains 10,000+ properties relating to our niche.
Although this is great for our customers, I am aware this content is going to be duplicated across other sites, as our clients advertise on a range of portals.
My question is: are there any measures I can take to safeguard our site against penalisation by Google?
Manually writing 10,000+ unique descriptions for properties is out of the question, sadly.
I really hope somebody can help
Thanks
Steve
-
Yeah, I understand. I think I will take your advice and noindex the pages. It's not worth the risk, really.
Thanks for your help
-
I understand exactly what you are saying, but from experience I can tell you that trying to get them indexed would lead to a lot of grief. If those listings are all over the net, then your website ceases to offer any unique value to Google, and it will simply demote you.
I understand it is a tricky situation, and it is hard to let go of the idea that these pages should be indexed so you can rank for long-tail keywords. I guess it is really a question of how much you want to gamble.
-
Hi Mash,
Thanks for your reply. It's such a difficult issue, because those pages create deep content which allows us to rank for long-tail phrases.
For example, "3 bedroom apartment for sale in Barcelona". So yeah, it's a tough one, and most of the content on our website would be those property pages, so noindexing them would reduce the amount of content we offer.
What a pickle. Any more help you can provide is appreciated.
Thanks
-
Hi Steve,
There is no easy way to prevent a duplicate content penalty with such a large number of pages being pushed into the site, short of noindexing those pages.
Perhaps a "noindex, follow" meta tag placed into the head of all these pages would help. I assume you would not want these indexed anyway, because that is where the problem will rear its ugly head!
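To illustrate the suggestion above: assuming the property pages are server-generated HTML templates, a minimal sketch of what each page's head could contain is:

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its
       links so internal link equity still flows through the site -->
  <meta name="robots" content="noindex, follow">
</head>
```

Where editing the page template is impractical, the equivalent HTTP response header `X-Robots-Tag: noindex, follow` can be set for those URLs at the server level instead.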