Site has been plagiarised - duplicate content
-
Hi,
I look after two websites, one sells commercial mortgages the other sells residential mortgages.
We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.
I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word-for-word.
I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.
I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs. Can anyone back me up on this theory?
I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. Also, the fact that this other broker has copied from several different sites corroborates this view.
Our legal team has written two letters (not sent yet) - one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or, will it just happen automatically? I have no experience of this scenario!
In the past, where I've found duplicate content like this, I've just rewritten the page, and chalked it up to experience but I don't really want to in this case because, frankly, the copy on these pages is really good! And, I don't think it's fair that someone else could potentially be getting customers that were persuaded by OUR copy.
Any advice would be greatly appreciated.
Thanks,
Amelia
-
Hi David,
I hope you had a good weekend?
Thank you for all your help! I reported them to Google using the link you posted and already the other site's URLs that had copied us have been removed and our pages have been put back in the index.
I have to say I am absolutely astounded that Google responded so quickly!
Yes, that is us on Google + and my personal Google + is here: https://plus.google.com/u/0/+AmeliaVargo/posts/.
Thank you again for your help thus far, and for your kind offer of more help should we need it!
Have a great day,
Amelia
-
Glad I could help. I really hope you get this all sorted out. The good news is you found the problem and are working to fix it, which is more than most people manage. Have high hopes!
"the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame. "
There is still a way to promote those pages, just not using Google organic to do so. Modify some of the content, create a press release, promote that page using social networks, and drive interest to that page and your site the old fashioned way. PPC is always an option as well. Remember, there are many ways to get traffic, don't lose hope or the vision.
On a side note, is this your company?
https://plus.google.com/u/0/+TurnkeymortgagesCoUk/posts
I can add you to my circles, so if you have any more issues or need additional help just let me know.
-
I just wanted to post up a message to everyone who has helped me with this problem.
First of all, please accept my sincere thanks. I REALLY appreciate everyone's contribution.
Now, I just wanted to tell you all what, as a company, we've decided to do.
- We've written letters to the company that copied us, their web designer, and their host, asking them to remove the copied content within 14 days of the letters.
- We've 'reported' them to Google, via one of the links that David posted (https://support.google.com/legal/troubleshooter/1114905?hl=en)
- We've reported them for scraping, using the link that Paddy posted
Hopefully, this problem will go away, but I hate to think how much business we may have lost as a result - the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame.
Best wishes, and I hope you all have a great weekend!
Amelia
-
Thank you David.
-
Once their version is removed/rewritten, resubmit your site to Google in every way that you can.
1. Fetch as Google
2. Change sitemap lastmod dates to the current day
3. Change the crawl frequency in the sitemap to daily
4. Check for proper 301 redirects from the old pages, from when you moved/modified the site to separate the branding.
5. Submit the URL in question to Google and let them know that someone has copied your site's content. They should be able to see that yours was created first.
Here are a few links to help:
https://www.google.com/webmasters/tools/dmca-notice <<< start there
https://support.google.com/legal/troubleshooter/1114905?hl=en
http://blog.kissmetrics.com/find-remove-stolen-content/
http://www.orclage.com/report-remove-stolen-duplicate-content-google/
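Steps 2 and 3 above amount to edits like this in the XML sitemap. This is a hypothetical entry (the URL and date are placeholders, not the poster's real pages):

```xml
<!-- Hypothetical sitemap entry: bump <lastmod> to today and set a daily
     crawl hint so Google revisits the affected page sooner. -->
<url>
  <loc>https://www.example.co.uk/remortgages/</loc>
  <lastmod>2014-03-17</lastmod>
  <changefreq>daily</changefreq>
</url>
```

Note that changefreq and lastmod are hints, not commands - Google may still crawl on its own schedule, but fresh lastmod dates plus a Fetch as Google request usually speed things up.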
-
Thank you Paddy! Much appreciated, and thank you for helping me again!
-
Ahh, good one.
-
Don't forget about this:
https://docs.google.com/forms/d/1Pw1KVOVRyr4a7ezj_6SHghnX1Y6bp1SOVmy60QjkF0Y/viewform
-
Thank you, you've helped me no end.
Have a great weekend
-
It really depends on the web host whether they will follow it or not. Some that are solely based in the UK might not. If they have US-based servers, or the site is hosted in the US, more than likely they will. It is worth a shot though; I try to rattle as many cages as possible. Here is a little info on filing them in the UK: https://www.teneric.co.uk/marketing/copyright-infringement.html
-
Hi Lesley,
Yes, I redirected everything using 301 redirects - page to page. I also used the change of address tool in webmaster tools for the site that changed domains.
I don't know if using DMCA will be appropriate - isn't that a US-only thing or can site owners in the UK use it too? If I can, I will use it.
Thank you for responding - I really do appreciate your help.
Best wishes,
Amelia
-
After they drop out of the searches, Google will index your site as the canonical site with that content on it - so that part has to happen manually, in the sense that you need to get their copies taken down first. Also, when you relaunched, did you redirect everything from the old site? That helps preserve link juice and at the same time gives search engines a pointer that the address of a page has changed to the new address.
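For context, "canonical" here is the same mechanism as the rel=canonical link element: a single tag in a page's head declaring which URL is the original. This is only a hypothetical illustration (placeholder URL) - in this thread's case Google picks the canonical itself once the copies are gone:

```html
<!-- Hypothetical: a page declaring another URL as the original, canonical
     version of its content. Cross-domain canonicals work the same way. -->
<link rel="canonical" href="https://www.example.co.uk/remortgages/" />
```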
One thing I would suggest is drafting a DMCA takedown notice and sending it to the host as well. If the other people you send letters to tell you to go pound sand, the host normally won't.
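On the redirect point: page-to-page 301s are worth scripting rather than hand-writing when a site moves domains. This is a minimal sketch, assuming made-up domain names and paths (none of these are the poster's real URLs), that emits Apache "Redirect 301" lines from an old-to-new URL map:

```python
# Hypothetical sketch: generate page-to-page 301 redirect rules for a
# domain move. The domains and paths below are placeholders.

OLD_DOMAIN = "old-brand.example.co.uk"
NEW_DOMAIN = "new-brand.example.co.uk"

# Map each old path to its new home, page to page (never everything to
# the homepage, which wastes the link equity of deep pages).
URL_MAP = {
    "/remortgage.html": "/remortgages/",
    "/first-time-buyer.html": "/first-time-buyers/",
}

def htaccess_rules(url_map, new_domain):
    """Emit one Apache 'Redirect 301' line per old page."""
    return [
        f"Redirect 301 {old} https://{new_domain}{new}"
        for old, new in sorted(url_map.items())
    ]

for rule in htaccess_rules(URL_MAP, NEW_DOMAIN):
    print(rule)
```

The output lines can be pasted into the old domain's .htaccess; pairing this with the change-of-address tool in Webmaster Tools (as Amelia did) covers both the crawler hint and the user-facing redirect.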