I'm worried my client is asking me to post duplicate content, am I just being paranoid?
-
Hi SEOMozzers,
I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries.
My client believes Google might like us a bit more if we had more "text" content.
So my client has been sending me content that is provided free by tourism organizations (they often provide free "one-pagers" about their destination for media use).
My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web. I'm worried Google could penalize us for posting content that already exists elsewhere.
I know there are conventional ways around this (you can tell crawlers that the content shouldn't be indexed), but in my case we are specifically trying to produce crawlable content.
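(Side note for anyone reading: below is a rough, hedged sketch of the check I'd want these pages to pass, i.e. crawlable per robots.txt and not carrying a "noindex" robots meta tag. The URL is a placeholder, not the real site, and the meta-tag check is a deliberately naive regex.)

```python
# Rough sketch, not production code: confirm a page is allowed by robots.txt
# and is not carrying a "noindex" robots meta tag. The URL is a placeholder.
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlsplit

PAGE_URL = "https://example.com/destinations/paris"  # placeholder, not the real site

def is_crawlable(page_url, user_agent="Googlebot"):
    """Check robots.txt to see whether the given user agent may fetch this URL."""
    parts = urlsplit(page_url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    return rp.can_fetch(user_agent, page_url)

def has_noindex(page_url):
    """Naive regex check for a robots meta tag containing 'noindex'."""
    html = urllib.request.urlopen(page_url).read().decode("utf-8", errors="ignore")
    return bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))

print("crawlable:", is_crawlable(PAGE_URL))
print("noindex present:", has_noindex(PAGE_URL))
```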
Do you think I should advise my client to hire some bloggers to produce the content or am I just being paranoid?
Thanks everyone. This is my first post to the Moz community
-
I work with a lot of sites that have been affected by Panda, and what you are describing is exactly the kind of thing that got most of those sites flagged.
Your client is right that it's a good idea to have text on those pages. But if the text is not unique, what Google does is say, "This page is essentially the same as one that is already in our index. There's no reason to show two identical pages to searchers, so we won't show this one." If enough of your pages are duplicates, the whole site (including the original pages) can be flagged by Panda.
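To make the "essentially the same page" idea concrete, here is a rough sketch of one common way to estimate how close two blocks of text are, using word shingles and Jaccard overlap. It is only an illustration, not how Google actually measures duplication, and the sample strings are placeholders.

```python
# Sketch: estimate how similar a page's text is to the source handout it was
# copied from. Illustrative only; this is not Google's actual algorithm.

def shingles(text, size=5):
    """Return the set of overlapping word n-grams ('shingles') in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def jaccard(a, b, size=5):
    """Jaccard similarity between the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a, size), shingles(b, size)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

tourism_handout = "Paris is famous for its museums, cafes and riverside walks."  # placeholder
page_copy = "Paris is famous for its museums, cafes and riverside walks."        # placeholder

print(f"similarity: {jaccard(tourism_handout, page_copy):.2f}")  # 1.00 = identical
```

Anything close to 1.0 is exactly the situation described above: the page adds nothing that isn't already in the index.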
-
Very helpful. I'm moving forward with this advice!
-
As an additional tip, you can use a service like Copyscape to verify whether or not the content has been posted elsewhere online.
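If you want a quick manual spot-check before paying for a service, one low-tech approach is to take a distinctive sentence from the handout and search for it in quotes. The sketch below just builds that search URL; it does not call any Copyscape or Google API, and the snippet is a placeholder.

```python
# Sketch: build a quoted exact-match search URL for a distinctive sentence,
# so you can manually check whether the handout text already appears elsewhere.
# No API calls; just open the printed URL in a browser.
from urllib.parse import quote_plus

def exact_match_search_url(sentence):
    """Return a Google search URL that looks for the sentence as an exact phrase."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{sentence.strip()}"')

snippet = "a distinctive sentence lifted from the tourism one-pager"  # placeholder
print(exact_match_search_url(snippet))
```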
-
Re-writing definitely sounds scalable for a site this size. Taking the shortcut of posting scraped content won't work. I would call it exactly that when you talk to the client: a "shortcut using scraped content" that Google has caught onto and now suppresses. If the client is skeptical, point them to the official Google webmaster forum, where this practice is openly discouraged.
Rewriting the content is straightforward and requires little hand-holding; just make sure the person doing the writing has strong writing skills and speaks English as a first language, or it will read funky, and at the end of the day you are creating content for the user. This is also the perfect opportunity to work a few instances of your keyword phrase into the copy, since the original handout probably didn't include them.
-
Hi Steven,
Welcome to the community. The ideal response to your question is to take the content the client is providing and produce unique content based on that material, essentially rewriting those pieces and giving them your own flavor. Of course, for various reasons (time, budget, resources) that might not be possible. In that case, it's best to credit the original source when you add the content to the site. More info in the links below:
-
Thanks Irving. It's only 8 pages right now, but my client plans on posting more destinations (and thus more not-so-unique content) in the future.
Re-writing is something I hadn't considered. That may be a more cost-efficient approach. Thanks for the idea!
-
Welcome aboard!
Content needs to be unique, especially if you want to rank.
How many pages are we talking about? If it's not a ton of pages, I would suggest getting the content re-written by someone.
Related Questions
-
What does Google's Spammy Structured Markup Penalty consist of?
Hey everybody,
I'm confused about the Spammy Structured Markup Penalty: "This site may not perform as well in Google results because it appears to be in violation of Google's Webmaster Guidelines." Does this mean the rich elements are simply removed from the snippets? Or will there be an actual drop in rankings? Can someone here tell from experience? Thanks for your help!
White Hat / Black Hat SEO | klaver
-
Scraping Website and Using Our Clients Info
One of our clients on Moz has noticed that another website has been scraping their site and pulling lots of their content without permission. We would like to notify Google about this company but are not sure if that is the right remedy to correct the problem. They appear in Google search results using the client's name, so they seem to be using page titles etc. with the client's name in them. Several of the SERP links point to their own website but pull in our client's web page. We were hoping someone could provide some additional options on how to attack this problem.
White Hat / Black Hat SEO | InTouchMK
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article.
My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages and it seems that Google also only reads the first article, which seems like an ideal solution. This obviously has the added benefit of speeding up page load time.
My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on how to implement infinite scrolling https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html by using prev and next tags for pagination https://support.google.com/webmasters/answer/1663744?hl=en. However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4 etc.) rather than just other related articles?
Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
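For anyone auditing a similar setup, a rough sketch of how you might spot-check whether those pagination hints are actually present in the served HTML (the URL is a placeholder, the regex parsing is deliberately naive, and a real audit would use a proper HTML parser):

```python
# Sketch: fetch an article page and list any rel="prev" / rel="next" link tags
# in the HTML, to see whether pagination hints are actually being served.
import re
import urllib.request

URL = "https://example.com/some-article/"  # placeholder article URL

def pagination_links(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    return re.findall(r'<link[^>]+rel=["\'](?:prev|next)["\'][^>]*>', html, re.I)

for tag in pagination_links(URL) or ["(no rel=prev/next links found)"]:
    print(tag)
```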
White Hat / Black Hat SEO | Daniel_Morgan
-
Spamming Backlinks - doesn't seem to be detrimental enough?
Hi, We have noticed sites such as http://www.rattanfurnitureoutlet.co.uk & http://www.supremerattanfurniture.co.uk/ have huge numbers of what appear to be spammy, low-value backlinks, yet both have maintained a great ranking for their search terms despite this. We would like to know why; does the good outweigh the bad, so to speak?
White Hat / Black Hat SEO | birdmarketing
-
Lots of websites copied my original content from my own website, what should I do?
1. Should I ask them to remove the content and replace it with their own unique, original content?
2. Should I ask them to link to the URL where the original content is located?
3. Should I use a tool to easily track these "copycat" sites and automatically add links from their site to my site?
Thanks in advance!
White Hat / Black Hat SEO | esiow2013
-
Will implementing 301s on an existing domain massively impact rankings?
Hi Guys, I have a new SEO client who only has the non-www domain set up in GWT, and I am wondering if implementing a 301 to the www version will have a massive negative impact on rankings. I know a percentage of link juice and PageRank will be affected. So my questions are: If I implement the 301, should I brace myself for a fall in rankings? Should I use a 301 to maintain link juice and PageRank? Is it good practice to forward to www? Or could I leave the non-www in place and have the www redirect to it to maintain the data? Dave
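As a sanity check once the redirect is live, something like the hedged sketch below will confirm the non-www host answers with a 301 pointing at the www version (the domain is a placeholder, not the client's real site):

```python
# Sketch: confirm the non-www host returns a 301 whose Location header points
# at the www host. The domain below is a placeholder.
import http.client

conn = http.client.HTTPConnection("example.com")  # placeholder non-www host
conn.request("HEAD", "/")
resp = conn.getresponse()
print("status:", resp.status)                   # expect 301
print("location:", resp.getheader("Location"))  # expect http://www.example.com/
conn.close()
```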
White Hat / Black Hat SEO | icanseeu
-
Possibly a dumb question - 301 from a banned domain to new domain with NEW content
I was wondering if banned domains pass any PageRank, link love, etc. My domain got banned and I AM working to get it unbanned, but in the meantime, would buying a new domain and creating NEW content that DOES adhere to the Google quality guidelines help at all? Would this force an 'auto-evaluation' or 're-evaluation' of the site by Google? Or would the new domain simply see ZERO effect from the 301 unless the old domain got back into Google's good graces?
White Hat / Black Hat SEO | ilyaelbert
-
User comments with page content or as a separate page?
With the latest Google updates both cracking down on useless pages and rewarding high-quality content, would it be beneficial to include user-posted comments on the same page as the content or on a separate page? A separate page with enough comments on it might be worth it, especially as extra pages add extra PageRank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
White Hat / Black Hat SEO | Peter2640