Site has been plagiarised - duplicate content
-
Hi,
I look after two websites: one sells commercial mortgages, the other sells residential mortgages.
We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.
I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word-for-word.
I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.
I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs. Can anyone back me up on this theory?
I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. The fact that this other broker has copied from several different sites also corroborates this view.
Our legal team has written two letters (not sent yet) - one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario!
In the past, where I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to do that in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.
Any advice would be greatly appreciated.
Thanks,
Amelia
-
Hi David,
I hope you had a good weekend?
Thank you for all your help! I reported them to Google using the link you posted, and already the other site's URLs that copied us have been removed and our pages have been put back in the index.
I have to say I am absolutely astounded that Google responded so quickly!
Yes, that is us on Google+ and my personal Google+ is here: https://plus.google.com/u/0/+AmeliaVargo/posts/.
Thank you again for your help thus far, and for your kind offer of more help should we need it!
Have a great day,
Amelia
-
Glad I could help. I really hope you get this all sorted out. The good news is that you found the problem and are working to fix it, which is more than most people would have managed. Have high hopes!
"the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame. "
There is still a way to promote those pages, just not using Google organic to do so. Modify some of the content, create a press release, promote that page using social networks, and drive interest to that page and your site the old-fashioned way. PPC is always an option as well. Remember, there are many ways to get traffic; don't lose hope or the vision.
On a side note, is this your company?
https://plus.google.com/u/0/+TurnkeymortgagesCoUk/posts
I can add you to my circles, so if you have any more issues or need additional help, just let me know.
-
I just wanted to post up a message to everyone who has helped me with this problem.
First of all, please accept my sincere thanks. I REALLY appreciate everyone's contribution.
Now, I just wanted to tell you all what, as a company, we've decided to do.
- We've written letters to the company that copied us, their web designer, and their host, asking them to remove the copied content within 14 days of the letters.
- We've 'reported' them to Google, via one of the links that David posted (https://support.google.com/legal/troubleshooter/1114905?hl=en)
- We've reported them for scraping, using the link that Paddy posted
Hopefully, this problem will go away, but I hate to think how much business we may have lost as a result - the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame.
Best wishes, and I hope you all have a great weekend!
Amelia
-
Thank you David.
-
Once their version is removed/rewritten, resubmit your site to Google in every way that you can.
1. Fetch as Google
2. Change the sitemap's created/lastmod dates to the current day
3. Change the crawl frequency (changefreq) in the sitemap to daily
4. Check for proper 301 redirects from the old pages, since you moved/modified the site to separate the branding.
5. Submit the URL in question to Google and let them know that someone has copied your site's content. They should be able to see that yours was created first.
Here are a few links to help, plus a rough sitemap sketch below them:
https://www.google.com/webmasters/tools/dmca-notice <<< start there
https://support.google.com/legal/troubleshooter/1114905?hl=en
http://blog.kissmetrics.com/find-remove-stolen-content/
http://www.orclage.com/report-remove-stolen-duplicate-content-google/
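To make steps 2 and 3 less of a chore, here's a rough sketch of what I mean (assuming Python and a local copy of your sitemap; the file name is just a placeholder): it bumps every lastmod to today and sets changefreq to daily.

import datetime
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

tree = ET.parse("sitemap.xml")  # placeholder path to your local sitemap copy
today = datetime.date.today().isoformat()

for url in tree.getroot().findall("{%s}url" % NS):
    # Update (or add) the lastmod and changefreq elements on each <url> entry
    for tag, value in (("lastmod", today), ("changefreq", "daily")):
        element = url.find("{%s}%s" % (NS, tag))
        if element is None:
            element = ET.SubElement(url, "{%s}%s" % (NS, tag))
        element.text = value

tree.write("sitemap.xml", xml_declaration=True, encoding="utf-8")

Re-upload the result (or mirror the same change in whatever generates your sitemap), then resubmit it in Webmaster Tools.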
-
Thank you Paddy! Much appreciated, and thank you for helping me again!
-
Ahh, good one.
-
Don't forget about this:
https://docs.google.com/forms/d/1Pw1KVOVRyr4a7ezj_6SHghnX1Y6bp1SOVmy60QjkF0Y/viewform
-
Thank you, you've helped me no end.
Have a great weekend
-
It really depends on the web host whether they will follow it or not. Some that are solely based in the UK might not. If they have US-based servers or the site is hosted in the US, more than likely they will. It is worth a shot though; I try to rattle as many cages as possible. Here is a little info on filing them in the UK: https://www.teneric.co.uk/marketing/copyright-infringement.html
-
Hi Lesley,
Yes, I redirected everything using 301 redirects - page to page. I also used the change of address tool in Webmaster Tools for the site that changed domains.
I don't know if using a DMCA notice will be appropriate - isn't that a US-only thing, or can site owners in the UK use it too? If I can, I will use it.
Thank you for responding - I really do appreciate your help.
Best wishes,
Amelia
-
Once their copies drop out of the search results, Google will index your site as the canonical source of that content; getting them removed, though, is the part that happens manually. Also, when you relaunched, did you redirect everything from the old site? That helps preserve link juice and at the same time gives search engines a pointer that the address of a page has changed to the new address.
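If you want to sanity-check those redirects quickly, here is a rough sketch (assuming Python with the requests library installed; the URL pairs are made-up examples, not your real pages) that fetches each old URL without following the redirect and confirms it answers with a 301 pointing at the matching new address.

import requests

# Hypothetical old-to-new mappings - swap in your own page-to-page pairs.
REDIRECTS = {
    "http://old-brand.example/remortgage/": "http://new-brand.example/remortgage/",
    "http://old-brand.example/first-time-buyer/": "http://new-brand.example/first-time-buyer/",
}

for old_url, expected in REDIRECTS.items():
    # Don't follow the redirect; inspect the first response itself.
    response = requests.get(old_url, allow_redirects=False)
    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location == expected:
        print("OK     " + old_url)
    else:
        print("CHECK  %s -> %s (status %s)" % (old_url, location or "no Location header", status))

Anything that prints CHECK is worth a look - in particular a 302 instead of a 301, or a chain of hops.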
One thing I would suggest is having a DMCA takedown notice drafted and sent to the host as well. Even if the other people you send letters to tell you to go pound sand, the host normally won't.
Related Questions
-
Google Indexed Site A's Content On Site B, Site C etc
Hi All, I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com), Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't have the default SSL domain set on the server. You could visit Site B with any page path from Site A and it would pull up properly. I have since fixed that SSL issue and am now doing a 301 redirect from Sites B, C and D to Site A for anything https, since Sites B, C and D are not using an SSL cert. My question is, how can I trigger Google to re-index all of the sites to remove the wrong listings in the index? I have a screenshot attached so you can see the issue more clearly. I have resubmitted my sitemap but I'm not seeing much of a change in the index for my site. Any help on what I could do would be great. Thanks, Eric
Intermediate & Advanced SEO | cwscontent
-
How can a website have multiple pages of duplicate content - still rank?
Can you have a website with multiple pages of the exact same copy (being different locations of a franchise business) and still be able to rank for each individual franchise? Is that possible?
Intermediate & Advanced SEO | OhYeahSteve
-
Scraping / Duplicate Content Question
Hi All, I understand the way to protect content such as a feature-rich article is to create authorship by linking to your Google+ account. My question: you have created a webpage that is informative but not worthy of being an article, hence there is no need to create authorship in Google+. If a competitor comes along and steals this content word for word (or something similar) and creates their own Google+ page, can you be penalised? Is there any way to protect yourself without authorship and Google+? Regards, Mark
Intermediate & Advanced SEO | Mark_Ch
-
3rd Party hosted whitepapers — bad idea? Duplicate content?
It is common in the B2B world to have 3rd parties host your whitepapers for added exposure. Is this a bad practice from an SEO point of view? Is the expectation that the 3rd parties use rel=canonical tags? I doubt most of them do...
Intermediate & Advanced SEO | BlueLinkERP
-
Temporary Duplicate Sites - Do anything?
Hi Mozzers - We are about to move one of our sites to Joomla. This is one of our main sites and it receives about 40 million visits a month, so the dev team is a little concerned about how the new site will handle the load. Dev's solution, since we control about 2/3 of that traffic through our own internal email and cross-promotions, is to launch the new site and not take down the old site. They would leave the old site on its current URL and make the new site something like new.sub.site.com. Traffic we control would continue to the old site; traffic that we detect as new would be redirected to the new site. Over time (they think about 3-4 months) they would shift all the traffic to the new site, then eventually change the URL of the new site to be the URL of the old site and be done. So this seems to be a whole-site duplicate content issue from the outset. I think the best course of action is to try to preserve all SEO value on the old URL, since the new URL will eventually go away and become the old URL. I could consider putting no-crawl/noindex tags on the new site temporarily while both sites exist, but would that be risky, since that site will eventually need to take those tags off and become the only site? A temporary rel=canonical from the new site to the old site also seems like it might not be the best answer. Any thoughts?
Intermediate & Advanced SEO | Kenn_Gold
-
Adding a huge new product range to eCommerce site and worried about Duplicate Content
Hey all, We currently run a large eCommerce site that has around 5000 pages of content and ranks quite strongly for a lot of key search terms. We have just recently finalised a business agreement to incorporate a new product line that complements our existing catalogue, but I am concerned about dumping this huge amount of content (sourced via an API) onto our site and the effect it might have dragging us down for our existing type of product. In regards to the best way to handle it, we are looking at a few ideas and wondered what SEOMoz thought was the best. Some approaches we are tossing around include:
- making each page point to the original API the data comes from as the canonical source (not ideal, as I don't want to pass link juice from our site to theirs)
- adding "noindex" to all the new pages so Google simply ignores them, and hoping we get side sales onto our existing product instead of trying to rank, as the new range is highly competitive (again not ideal, as we would like to get whatever organic traffic we can)
- manually rewriting each and every new product page's descriptions, tags etc. (a huge undertaking in terms of working hours, given it will be around 4,400 new items added to our catalogue)
Currently the industry standard seems to be just to pull the text from the API and leave it, but doing exact text searches shows that there are literally hundreds of other sites using the exact same duplicate content... I would like to persuade higher management to invest the time into rewriting each individual page, but it would be a huge task and difficult to maintain as changes continually happen. Sorry for the wordy post, but this is a big decision that potentially has drastic effects on our business, as the vast majority of it is conducted online. Thanks in advance for any helpful replies!
Intermediate & Advanced SEO | ExperienceOz
-
Wordpress Duplicate Content
We have recently moved our company's blog to WordPress on a subdomain (we utilize the Yoast SEO plugin). We are now experiencing an ever-growing volume of crawl errors (nearly 300 4xx now) for pages that do not exist to begin with. I believe it may have something to do with having the blog on a subdomain and/or our Yoast SEO plugin's indexation archives (author, category, etc.) - we currently have subpages of archives and taxonomies, and category archives, in use. I'm not as familiar with WordPress and the Yoast SEO plugin as I am with other CMSs, so any help in this matter would be greatly appreciated. I can PM further info if necessary. Thank you for the help in advance.
Intermediate & Advanced SEO | BethA
-
Duplicate content for area listings
Hi, I was slightly affected by the Panda update on the 14th Oct, generally dropping by about 5-8 spots in the SERPs for my main keywords; since then I've been giving my site a good looking over. On one site I've got city listing URLs for certain widget companies; the thing is, many areas, and thus URLs, will have the same company listed. What would be the best way of solving this duplicate content as Google may be seeing it? I was thinking of one page per company, prominently listing the areas they operate in, so I'd still hopefully get ranked for area searches. But I'd be losing the city names in the URL as I've got them now, for example: mywidgetsite.com/findmagicwidgets/new-york.html mywidgetsite.com/findmagicwidgets/atlanta.html Any ideas on how best to proceed? Cheers!
Intermediate & Advanced SEO | NetGeek