Site has been plagiarised - duplicate content
-
Hi,
I look after two websites: one sells commercial mortgages, the other residential mortgages.
We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.
I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word-for-word.
I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.
I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and, consequently, new URLs. Can anyone back me up on this theory?
I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. The fact that this other broker has copied from several different sites also corroborates this view.
Our legal team has written two letters (not sent yet) - one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario!
In the past, where I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.
Any advice would be greatly appreciated.
Thanks,
Amelia
-
Hi David,
I hope you had a good weekend?
Thank you for all your help! I reported them to Google using the link you posted and already the other site's URLs that had copied us have been removed and our pages have been put back in the index.
I have to say I am absolutely astounded that Google responded so quickly!
Yes, that is us on Google+, and my personal Google+ profile is here: https://plus.google.com/u/0/+AmeliaVargo/posts/.
Thank you again for your help thus far, and for your kind offer of more help should we need it!
Have a great day,
Amelia
-
Glad I could help. I really hope you get this all sorted out. The good news is that you found the problem and are working to fix it, which is more than most people manage. Have high hopes!
"the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame. "
There is still a way to promote those pages, just not via Google organic. Modify some of the content, create a press release, promote the page using social networks, and drive interest to that page and your site the old-fashioned way. PPC is always an option as well. Remember, there are many ways to get traffic - don't lose hope or the vision.
On a side note, is this your company?
https://plus.google.com/u/0/+TurnkeymortgagesCoUk/posts
I can add you to my circles, so if you have any more issues or need additional help, just let me know.
-
I just wanted to post up a message to everyone who has helped me with this problem.
First of all, please accept my sincere thanks. I REALLY appreciate everyone's contribution.
Now, I just wanted to tell you all what, as a company, we've decided to do.
- We've written letters to the company that copied us, their web designer, and their host, asking them to remove the copied content within 14 days of the letters.
- We've 'reported' them to Google, via one of the links that David posted (https://support.google.com/legal/troubleshooter/1114905?hl=en)
- We've reported them for scraping, using the link that Paddy posted
Hopefully, this problem will go away, but I hate to think how much business we may have lost as a result - the two pages they've copied are really important sales pages (remortgage and first time buyer) so for us, it's a massive shame.
Best wishes, and I hope you all have a great weekend!
Amelia
-
Thank you David.
-
Once their version is removed/rewritten, resubmit your site to Google in every way that you can.
1. Fetch as Google (in Webmaster Tools)
2. Change the sitemap's <lastmod> dates to the current day
3. Change the crawl frequency (<changefreq>) in the sitemap to daily
4. Check for proper 301 redirects from the old pages from when you moved/modified the site for the separate branding
5. Submit the URL in question to Google and let them know that someone has copied your site's content. They should be able to see that yours was created first. (A rough script for steps 2 and 3 follows the links below.)
Here are a few links to help:
https://www.google.com/webmasters/tools/dmca-notice <<< start there
https://support.google.com/legal/troubleshooter/1114905?hl=en
http://blog.kissmetrics.com/find-remove-stolen-content/
http://www.orclage.com/report-remove-stolen-duplicate-content-google/
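If scripting is an option for you, here's a minimal Python sketch for steps 2 and 3 above, assuming a standard sitemaps.org-format sitemap. The "sitemap.xml" path is just a placeholder - point it at your real file and eyeball the output before re-uploading:

```python
# Minimal sketch for steps 2 and 3: set every <lastmod> in a
# sitemaps.org-format sitemap to today and <changefreq> to daily.
# "sitemap.xml" is a placeholder path, not anyone's real file.
import datetime
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default namespace when writing back

tree = ET.parse("sitemap.xml")
today = datetime.date.today().isoformat()

for url in tree.getroot().findall(f"{{{NS}}}url"):
    for tag, value in (("lastmod", today), ("changefreq", "daily")):
        elem = url.find(f"{{{NS}}}{tag}")
        if elem is None:  # add the element if this entry doesn't have one yet
            elem = ET.SubElement(url, f"{{{NS}}}{tag}")
        elem.text = value

tree.write("sitemap.xml", xml_declaration=True, encoding="UTF-8")
```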
-
Thank you Paddy! Much appreciated, and thank you for helping me again!
-
Ahh, good one.
-
Don't forget about this:
https://docs.google.com/forms/d/1Pw1KVOVRyr4a7ezj_6SHghnX1Y6bp1SOVmy60QjkF0Y/viewform
-
Thank you, you've helped me no end.
Have a great weekend
-
It really depends on the web host whether they will act on it or not. Some that are solely based in the UK might not; if they have US-based servers, or the site is hosted in the US, more than likely they will. It's worth a shot though - I try to rattle as many cages as possible. Here is a little info on filing them in the UK: https://www.teneric.co.uk/marketing/copyright-infringement.html
-
Hi Lesley,
Yes, I redirected everything using 301 redirects - page to page. I also used the Change of Address tool in Webmaster Tools for the site that changed domains.
I don't know if a DMCA notice will be appropriate - isn't that a US-only thing, or can site owners in the UK use it too? If I can, I will use it.
Thank you for responding - I really do appreciate your help.
Best wishes,
Amelia
-
After their copies drop out of the searches, Google will index your site as the canonical site for that content; getting them removed is the part that happens manually. Also, when you relaunched, did you redirect everything from the old site? That helps preserve link juice and at the same time gives search engines a pointer that the address of a page has changed to the new address.
One thing I would suggest is having a DMCA takedown notice drafted and sent to the host as well. If the other people you send letters to tell you to go pound sand, normally the host does not.
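If it helps, here's a quick Python sketch for spot-checking those redirects. The URLs and domain below are made-up placeholders, not your actual pages - swap in a handful of your old URLs and your new domain:

```python
# Rough sketch: confirm each old URL answers with a 301 that points at the
# new domain. The URLs and domain here are invented placeholders.
import requests

OLD_URLS = [
    "http://old-domain.example/remortgage",
    "http://old-domain.example/first-time-buyer",
]
NEW_DOMAIN = "new-domain.example"

for url in OLD_URLS:
    # Don't follow the redirect; we want to inspect the first response itself.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and NEW_DOMAIN in target
    print(f"{url} -> {resp.status_code} {target or '(no Location)'} "
          f"{'OK' if ok else 'CHECK THIS'}")
```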