Site has been plagiarised - duplicate content
-
Hi,
I look after two websites: one sells commercial mortgages, the other sells residential mortgages.
We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.
I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found that another broker has copied our page almost word-for-word.
I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.
I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs, so Google now seems to treat the copied version as the original. Can anyone back me up on this theory?
I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. Also, the fact that this other broker has copied from several different sites corroborates this view.
Our legal team has written two letters (not sent yet) - one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario!
In the past, when I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to do that in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.
Any advice would be greatly appreciated.
Thanks,
Amelia
-
Hi David,
I hope you had a good weekend?
Thank you for all your help! I reported them to Google using the link you posted, and already the other site's URLs with the copied content have been removed and our pages have been put back in the index.
I have to say I am absolutely astounded that Google responded so quickly!
Yes, that is us on Google+ and my personal Google+ is here: https://plus.google.com/u/0/+AmeliaVargo/posts/.
Thank you again for your help thus far, and for your kind offer of more help should we need it!
Have a great day,
Amelia
-
Glad I could help. I really hope you get this all sorted out. The good news is you found the problem and are working to fix it, which is more than most people manage. Have high hopes!
"the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame. "
There is still a way to promote those pages, just not using Google organic to do so. Modify some of the content, create a press release, promote the page on social networks, and drive interest to the page and your site the old-fashioned way. PPC is always an option as well. Remember, there are many ways to get traffic; don't lose hope or the vision.
On a side note, is this your company?
https://plus.google.com/u/0/+TurnkeymortgagesCoUk/posts
I can add you to my circles, so if you have any more issues or need additional help, just let me know.
-
I just wanted to post up a message to everyone who has helped me with this problem.
First of all, please accept my sincere thanks. I REALLY appreciate everyone's contribution.
Now, I just wanted to tell you all what, as a company, we've decided to do.
- We've written letters to the company that copied us, their web designer, and their host, asking them to remove the copied content within 14 days of the letters.
- We've 'reported' them to Google, via one of the links that David posted (https://support.google.com/legal/troubleshooter/1114905?hl=en)
- We've reported them for scraping, using the link that Paddy posted
Hopefully, this problem will go away, but I hate to think how much business we may have lost as a result - the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame.
Best wishes, and I hope you all have a great weekend!
Amelia
-
Thank you David.
-
Once their version is removed/rewritten, resubmit your site to Google in every way that you can.
1. Fetch as Google
2. Update the last-modified (lastmod) dates in your sitemap to the current day
3. Change the crawl frequency (changefreq) in the sitemap to daily - a quick sitemap sketch follows the links below
4. Check for proper 301 redirects from the old pages, since you moved/modified the site to separate the branding.
5. Submit the URL in question to Google and let them know that someone has copied your site's content. They should be able to see that yours was created first.
Here are a few links to help:
https://www.google.com/webmasters/tools/dmca-notice <<< start there
https://support.google.com/legal/troubleshooter/1114905?hl=en
http://blog.kissmetrics.com/find-remove-stolen-content/
http://www.orclage.com/report-remove-stolen-duplicate-content-google/
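To illustrate points 2 and 3, here's a minimal sketch of what a sitemap entry might look like once the dates are bumped and the crawl frequency is set to daily - the URL and date below are placeholders, not the actual pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page; loc and lastmod below are placeholders -->
  <url>
    <loc>http://www.example.com/remortgage/</loc>
    <!-- Set lastmod to today so Google sees the page as freshly updated -->
    <lastmod>2014-02-03</lastmod>
    <!-- changefreq is only a hint, but "daily" encourages more frequent recrawls -->
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Once the sitemap is updated, resubmit it in Webmaster Tools so Google picks up the new dates.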
-
Thank you Paddy! Much appreciated, and thank you for helping me again!
-
Ahh, good one.
-
Don't forget about this:
https://docs.google.com/forms/d/1Pw1KVOVRyr4a7ezj_6SHghnX1Y6bp1SOVmy60QjkF0Y/viewform
-
Thank you, you've helped me no end.
Have a great weekend
-
It really depends on the web host whether they will follow it or not. Some that are solely based in the UK might not; if they have US-based servers or the site is hosted in the US, more than likely they will. It is worth a shot though - I try to rattle as many cages as possible. Here is a little info on filing them in the UK: https://www.teneric.co.uk/marketing/copyright-infringement.html
-
Hi Lesley,
Yes, I redirected everything using 301 redirects - page to page. I also used the change of address tool in Webmaster Tools for the site that changed domains.
I don't know if using the DMCA will be appropriate - isn't that a US-only thing, or can site owners in the UK use it too? If I can, I will use it.
Thank you for responding - I really do appreciate your help.
Best wishes,
Amelia
-
After they drop out of the searches, Google will index your site as the canonical site with that content on it - that part you will have to push along manually. Also, when you relaunched, did you redirect everything from the old site? That helps preserve link juice and at the same time gives search engines a pointer that a page's address has changed to the new one.
One thing I would suggest is having a DMCA takedown notice drafted and sent to the host as well. If the other people you send letters to tell you to go pound sand, the host normally won't.
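Since the 301 question keeps coming up, here's a minimal sketch of what page-to-page permanent redirects might look like in an Apache .htaccess file - the paths and domains are made-up placeholders, not the real site structure:

```apache
# Hypothetical .htaccess sketch: permanent, page-to-page redirects from old URLs
# to their new homes, so link equity and crawl signals carry over after a relaunch.

# A page whose URL changed when the navigation was rebuilt
Redirect 301 /old-remortgage-page.html http://www.example.com/remortgage/

# A page that moved to the rebranded domain
Redirect 301 /commercial-mortgages.html http://www.example-commercial.co.uk/commercial-mortgages/
```

Page-to-page redirects (rather than sending everything to the new homepage), as Amelia describes above, are the right approach; it's worth spot-checking a few old URLs with a header checker to confirm they return a 301 rather than a 302.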
Related Questions
-
Duplicate Content
Let's say a blog is publishing original content. Now let's say a second blog steals that original content via bot and publishes it as its own. Now further assume the original blog doesn't notice this for several years. How much damage could this do to the original blog (blog A) in Google results? Any opinions?
Intermediate & Advanced SEO | CYNOT
-
Bigcommerce & Blog Tags causing Duplicate Content?
Curious why Moz would pick up our blog tags as causing duplicate content, when each blog post has a rel=canonical tag pointing to the post itself and each tag page has one pointing to the blog as a whole. Kinda want to get rid of the tags in general now, but also feel they can add some extra value to UX later on when we have many more blog posts. Curious if anyone knows a way around this, or even a best-practice solution when faced with such odd issues? I can see why the duplicate content would happen, but when grouping content into categories?
Intermediate & Advanced SEO | Deacyde
-
Duplicate Internal Content on E-Commerce Website
Hi, I find my e-commerce pharmacy website is full of little snippets of duplicate content. In particular:
- a delivery info widget repeated on all the product pages
- product category information repeated on the product pages (e.g. all medicines belonging to a certain category of medicines have identical side effects, and I also include a generic snippet about the condition the medicine treats)
Do you think it will harm my rankings to do this?
Intermediate & Advanced SEO | deelo555
-
PDF for link building - avoiding duplicate content
Hello, We've got an article that we're turning into a PDF. Both the article and the PDF will be on our site. The PDF is a good, thorough piece of content on how to choose a product. We're going to strip out all of the links to our site in the article and create the PDF so that it will be good for people to reference and even print. Then we're going to do link building through outreach, since people will find the article and PDF useful. My question is: how do I use rel="canonical" to make sure that the article and PDF aren't duplicate content? Thanks.
Intermediate & Advanced SEO | BobGW
-
About robots.txt to resolve duplicate content
I have trouble with duplicate content and titles. I've tried many ways to resolve them, but because of the site's code I'm still stuck, so I've decided to use robots.txt to block the duplicate content. The first question: how do I write a robots.txt rule to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
User-agent: *
Disallow: /foodcourses
(Is that right?) And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module - could I use Disallow: /module/* ?) The second question: which takes priority, robots.txt or the meta robots tag, if I block a URL in robots.txt but that URL's meta robots tag is "index, follow"?
Intermediate & Advanced SEO | magician
-
HTTPS Duplicate Content?
I just received an error notification because our website is available on both HTTP and HTTPS: http://www.quicklearn.com & https://www.quicklearn.com. My tech tells me that this isn't actually a problem - is that true? If not, how can I address the duplicate content issue?
Intermediate & Advanced SEO | QuickLearnTraining
-
ECommerce syndication & duplicate content
We have an eCommerce website with original software products. We want to syndicate our content to partner and affiliate websites, but are worried about the effect of duplicate content all over the web. Note that this is a relatively high-profile project, where thousands of sites will be listing hundreds of our products with the exact same name, description, tags, etc. We read the wonderful and relevant post by Kate Morris on this topic (here: http://mz.cm/nXho02) and we realize that duplicate content is never the best option. Some concrete questions we're trying to figure out:
1. Are we risking penalties of any sort?
2. We can potentially get tens of thousands of links from this concept, all with duplicate content around them, but from PR3-6 sites, some with lots of authority. What will affect our site more - the quantity of mediocre links (good) or the duplicate content around them (bad)?
3. Should we sacrifice SEO for a good business idea?
Intermediate & Advanced SEO | erangalp
-
Cross-Domain Canonical and duplicate content
Hi Mozfans! I'm working on SEO for one of my new clients, and it's a job site (I'll call it Site A). The thing is that the client has about three sites with the same jobs on them. I'm seeing a duplicate content problem, but the jobs on the other sites must stay there - the client doesn't want to remove them, and there is another (non-ranking) reason why. Can I solve the duplicate content problem with a cross-domain canonical? The client wants to rank well with the site I'm working on (Site A). Thanks! Rand did a Whiteboard Friday about the cross-domain canonical:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
Intermediate & Advanced SEO | MaartenvandenBos