Duplicated content
-
I have someone writing new pages for my site.
How do I know the pages she is writing are not duplicated from another website?
Is there a website or software tool to do this?
What is the best way to check?
Thank you
-
As Billy said, Copyscape is a great option, but you can also take sections of the text, put them into Google, and see what comes up. If you can't trust your copywriter, they may not be the right fit for you. You can always ask to see previous work when hiring someone new and check whether that is duplicated as well.
-
copyscape.com. You will have to run each page individually, but it works well.
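If you want to semi-automate the "paste sections into Google" check, here is a minimal sketch of one way to build the queries; the file name and chunk size are assumptions, not anything from the thread. It prints exact-match search URLs you open by hand:

```python
# Minimal sketch: split draft copy into ~10-word phrases and print
# exact-match Google search URLs for manual spot-checking.
from urllib.parse import quote_plus

CHUNK_WORDS = 10  # exact-match phrases of roughly this length work well

def phrase_queries(text, chunk_words=CHUNK_WORDS):
    words = text.split()
    for i in range(0, len(words), chunk_words):
        phrase = " ".join(words[i:i + chunk_words])
        if len(phrase.split()) >= 5:  # ignore short trailing fragments
            # Quote the phrase so Google treats it as an exact match.
            yield "https://www.google.com/search?q=" + quote_plus(f'"{phrase}"')

with open("draft_page.txt", encoding="utf-8") as f:  # hypothetical file name
    for url in phrase_queries(f.read()):
        print(url)
```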
Related Questions
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About 3 months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of their content was "plagiarized". I use quotes because it really was not: they seem to have been hit with a negative SEO campaign last year in which their content was taken and posted across at least 15 different websites. Literally every page on their website had the same problem, and some of the content was even company-specific (going as far as using the company's very unique name).

In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked/compromised (some examples include charity websites).

I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for removal of the content? Or does duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicates started popping up? Thanks,
White Hat / Black Hat SEO | Hasanovic
-
Duplicate content - multiple sites hosted on same server with same IP address
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use rel="canonical" on the websites, those tags will be duplicated too, as the IP-address URLs are mirrored versions of the sites, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What is the best way to solve these duplicate content issues in this case? Many thanks!
White Hat / Black Hat SEO | Jade
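One common way to handle the IP-address mirror described above is a host-based 301 at the server level. A minimal sketch for Apache with mod_rewrite, reusing the example IP and domain from the question (requests to the bare IP can only ever reach the default virtual host, so one rule per server is usually enough):

```apache
# Sketch only: send any request that arrives under the raw IP
# to the canonical hostname with a 301, so the IP "mirror"
# never gets crawled as a separate site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^23\.34\.45\.99$
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
```
-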
Is this duplicate content?
Hi all, this has now popped up in Moz after we have used this approach for over 6 months. It is saying the pages below are duplicate site content. What do we think? Is this a bad strategy? It works well in the SERPs, but could it be damaging the root domain's ranking? I guess this is a little shady.
http://www.tomlondonmagic.com/area/close-up-magician-in-crowborough/
http://www.tomlondonmagic.com/area/close-up-magician-in-desborough/
http://www.tomlondonmagic.com/area/close-up-magician-in-didcot/
Thanks.
White Hat / Black Hat SEO | TomLondon
-
Same content, different target area SEO
So OK, I have a gambling site that I want to target at Australia, Canada, the USA and England separately, while still keeping .com for worldwide (or not, read further). The website's content basically stays the same for all of them, perhaps with small changes to the layout and information order (a different order for the top 10 gambling rooms).

My question 1 would be: how should I mark the content for Google and other search engines so that it is not considered "duplicate content"? As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far is:
1. A separate Webmaster Tools account for every domain, where we will need to set the geographic targeting to the specific country.
2. Use the hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains; more info about it: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk were from a different C-class than the one for the .com.
Is there anything I am missing here?

Question 2: should I target .com at the USA market, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
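To illustrate point 2 in the list above, here is a sketch of what the hreflang annotations could look like, using hypothetical domains since the thread does not name the real ones. The same block goes in the <head> of every country version, each listing all of its siblings plus itself:

```html
<!-- Sketch only: hypothetical domains standing in for the real sites. -->
<link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en-CA" href="http://www.example.ca/" />
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-US" href="http://www.example.com/" />
<!-- x-default marks the fallback version for everyone else. -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```
-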
Syndicated content outperforming our hard work!
Our company (FindMyAccident) is an accident news site. Our goal is to roll our reporting out to all 50 states; currently, we operate full-time in 7 states. To date, our largest expenditure is our writing staff. We hire professional journalists who work with police departments and other sources to develop written content and video for our site. Our visitors also contribute stories and/or tips that add to the content on our domain. In short, our content/media is 100% original.

A site that often appears alongside us in the SERPs in the markets where we work full-time is accidentin.com. They are a site that syndicates accident news and offers little original content. (They also allow users to submit their own accident stories, and the entries index quickly and are sometimes viewed by hundreds of people in the same day. What's perplexing is that these entries are isolated incidents with little to no news value, yet they do extremely well.) (I don't put much stock in Quantcast figures, but accidentin does use their pixel sourcing, and the figures indicate that they receive up to 80k visitors a day in some instances.)

I understand that it's common to see news sites syndicate from the AP, etc., and traffic accident news is not going to have a lot of competition (in most instances), but the real shocker is that accidentin will sometimes appear as the first or second result, above the original sources. The question: does anyone have a guess as to what is making it perform so well? Are they bound to fade away? While looking at their model, I'm wondering whether we would be silly not to syndicate news in the states where we don't have actual staff. It would seem we could attract more traffic by setting up syndication in our vacant states. Or is our competitor's site bound to fade away?

Thanks, gang, hope all of you have a great 2013! Wayne
White Hat / Black Hat SEO | Wayne76
-
Will aggregating external content hurt my domain's SERP performance?
Hi, we operate a website that helps parents find babysitters. As a small add-on, we currently run a small blog on the topic of childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important to us that these "featured" articles rank in SERPs, so we could potentially "noindex" them or point their rel="canonical" at the original author. Any thoughts, anyone? Thx! Daan
White Hat / Black Hat SEO | daan.loening
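Both options the question mentions are single tags in the <head> of the re-published article page; a sketch with a hypothetical source URL (you would use one or the other, not both):

```html
<!-- Option A: point the canonical at the original author's post,
     crediting the source and consolidating ranking signals there. -->
<link rel="canonical" href="https://original-blog.example.com/the-article/" />

<!-- Option B: keep the re-published copy out of the index entirely. -->
<meta name="robots" content="noindex, follow" />
```
-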
Content box (on-page content) and titles: Google over-optimization penalty?
We have a content box at the bottom of our website with a scroll bar and have posted a fair bit of content into this area (too much to show directly on the page). Granted, it is a combination of SEO content (with links to our pages) and informative content, but with the over-optimization penalty coming I am a little scared this will cause a problem for us. I am thinking of adopting the approach of this website HERE, with the content behind a "more information" button that drops down. Would this be better? It could be much more organised, and we would swap in more helpful information than the current 50/50 mix of SEO and helpful content. Or will it be viewed the same, in which case we might as well leave it as is and reduce the amount of repetition and the number of links in the content.

Also, we sell printed goods, so our titles may be a bit over the top, but they are bringing us a lot of converting traffic. Again, I am worried about the new Google release. This is an example of a typical title (only an example, not our product page): Banner Printing | PVC Banners | Outdoor Banners | Backdrops | Vinyl Banners | Banner Signs

Thank you for any help with these matters.
White Hat / Black Hat SEO | BobAnderson
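For what it's worth, the "more information button that drops down" could be as simple as the native <details> element; a hypothetical sketch, since the thread does not say what markup the example site actually uses:

```html
<details>
  <summary>More information</summary>
  <p>Longer, genuinely useful copy about the product goes here,
     organised for readers rather than for keyword repetition.</p>
</details>
```
-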
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that had changed, and the most obvious was that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not lift the penalty on bbqguys.

We've been looking for causes for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/". It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of the pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could have triggered this message and the penalty.

Does anyone know how to figure out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting their pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.

Since the Webmaster Tools notifications are different (i.e. "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with a specific page. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.

I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects led Googlebot to say we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following the webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects. Thanks for any help identifying the problem!
White Hat / Black Hat SEO | CoreyTisdale
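For reference, the page-to-page 301s described in this question are typically plain mod_alias directives on the old domain; a sketch with hypothetical paths, since the real URL mapping is not given in the thread:

```apache
# Sketch only: each old URL on thegrillstoreandmore.com maps
# explicitly to its equivalent page on the destination domain.
Redirect 301 /gas-grills https://www.bbqguys.com/gas-grills
Redirect 301 /charcoal-grills https://www.bbqguys.com/charcoal-grills
```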