Competitor ranking well with duplicate content—what are my options?
-
A competitor is ranking #1 and #3 for a search term (see attached) by publishing two separate sites with the same content. They've modified the page title and serve it in a different design, but they're using their branded domain and a keyword-rich domain to gain multiple rankings. This has been going on for years, and I've always told myself that Google would eventually catch it with an algorithm update, but that doesn't seem to be happening. Does anyone know of other options? It doesn't seem like this falls under any of the categories that Google lists on their web spam report page. Is there any other way to bring this up with the powers that be, or is it something I just have to live with and hope that Google figures out some day?
Any advice would help. Thanks!
[Attached screenshot: how_to_become_a_home_inspector_-_Google_Search_2015-01-15_18-45-06.jpg]
-
Yeah, I don't really like the 'theory' I listed, but sometimes you aren't left with many options. If a business has been trying to go at it organically for as long as the OP has, then maybe it's time to consider something further outside the box.
-
Ray's advice, while a little devious and maybe a touch shady, is likely what I'd suggest as well. There's nothing worse than being outranked this way.
Also:
https://www.google.com/webmasters/tools/spamreportform?hl=en
-
Well, this is unfortunate to have happen - and really hard to look at as an SEO. It puts you in a tough position.
The content is obviously duplicated; they don't even bother to update the brand name in much of it, or to fully rework the meta data and other on-page factors.
Both sites are hosted on the same account and the same IP address.
Their NAP (name, address, phone number) is exactly the same.
They even share Google Analytics IDs. Maybe someone else knows why this stays under the radar of the algorithm updates, because I'm not sure.
In theory, you could purchase a keyword-rich domain, copy their URL structure, and copy their content, then see if that website ranks in the SERPs. If it does, I would do it with eight more websites and dominate the SERPs. Once the results page is spammed full of the same content, Google should catch on and penalize all of the sites. All of the new duplicate sites should be on a different web host than your primary domain. Just a theory, though...
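If anyone wants to verify the shared-hosting and shared-Analytics observations described above for a pair of competing sites, a minimal sketch follows. It assumes Python with the requests library; the domains, the UA- tracking-ID regex, and the helper names are illustrative, not taken from this thread.

```python
import re
import socket
import requests

def analytics_ids(url):
    """Fetch a page and pull out any classic Google Analytics tracking IDs (UA-XXXXXX-X)."""
    html = requests.get(url, timeout=10).text
    return set(re.findall(r"UA-\d{4,10}-\d{1,4}", html))

def host_ip(domain):
    """Resolve a domain to its IP address; a shared IP hints at shared hosting."""
    return socket.gethostbyname(domain)

if __name__ == "__main__":
    # Hypothetical stand-ins for the branded domain and the keyword-rich domain.
    sites = ["www.example-brand.com", "www.example-keyword-domain.com"]
    ids = [analytics_ids(f"http://{site}/") for site in sites]
    ips = [host_ip(site) for site in sites]
    print("Shared Analytics IDs:", ids[0] & ids[1])
    print("Same IP address:", ips[0] == ips[1], ips)
```

If both sites return the same tracking ID and resolve to the same IP, that is the kind of footprint described in this answer.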
Related Questions
-
Is a recent hack or the disavow tool causing my alarming drop in rankings?
My business site has been very successful organically for many years. Just recently we got hit with a spam hack, and it was resolved within 3 days. However, my rankings are now plummeting and I am so stressed out! Here is some timeline information; any info would help:
Sept. 4th: hack first detected on Google.
Sept. 7th: site completely clean, reconsideration request accepted, spam content and links removed, manual actions cleared. Rankings at this point had not been affected.
Sept. 11th: disavowed a few incoming links that were completely spam. (In hindsight I know this could have been the beginning of the end using this tool.)
Sept. 21st: started to notice the first significant drop in rankings. I went into GWT and downloaded the latest 1,000 links, and realized ALL of these were either hacked sites with spam content linking to our now-deleted spam content, or inappropriate adult content.
Sept. 22nd: disavowed the 1,000 domains (there are still probably 1,000-2,000 more).
As of today rankings have SIGNIFICANTLY dropped. I have resubmitted sitemaps and image sitemaps, and fetched and rendered as Google. I'm stressing out incredibly and feel like I have made an error and that my site will never recover. I've worked using ALL white hat SEO and the site used to rank very well, top of page one for almost all my keywords. I feel lost and don't know what else I can do. I know many say to wait, but it feels like forever. Is it possible that I didn't make a mistake using the disavow and that Google just took a while to penalize for the hack? Please, any advice or experiences would be welcome; I appreciate so much anyone who takes the time to respond.
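For reference, the disavow file mentioned in this timeline is just a plain-text file uploaded through Google's disavow links tool: one URL or domain per line, with # marking comment lines. A minimal sketch with hypothetical entries, not the poster's actual links:

```
# Hacked/spam domains found in the GWT link download (hypothetical examples)
domain:spammy-hacked-site.example
domain:adult-spam.example

# Individual spam URLs can also be listed on their own
http://another-spam-site.example/hacked-page.html
```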
White Hat / Black Hat SEO | seounicorn
-
Keyword Rankings Fluctuate
Hi Moz Community! I've been working with a client's website for about a year now. They were hit by the original Panda/Penguin updates because of some spammy links. We have analysed and disavowed the spammy links and edited anchor text and landing pages. We've made a decent climb back while working on it; keyword rankings improved, some of them from outside the top 500 to the top 30, but not a full recovery. There are some weird things happening that I would love some insight into: 1. Rankings for keywords are dropping again and traffic is gone: I've noticed keyword rankings dropping, and some keywords that were on pages 2 and 3 are now on page 9. Traffic is also down. Any help would be appreciated! Regards, S
White Hat / Black Hat SEO | ShaunPhilips
-
Is Syndicated (Duplicate) Content considered Fresh Content?
Hi all, I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers and I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in a lot of cases duplicate); would this be considered fresh content on an individual domain? An example may clearly show what I'm after: domain1.com is a lawyer in Seattle. domain2.com is a lawyer in New York. Both need content on their websites relating to being a lawyer for Google to understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor). Therefore, fresh content is needed on their domains. But what if that content is duplicate; does it still hold the same value? Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicate (across multiple domains)? Purpose: domain1.com may benefit from a resource for his/her local clientele, as would domain2.com, and both sets of customers would be reading the "duplicate content" for the first time. Therefore, both lawyers would be seen as an authority and improve how their websites rank. We aren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy, just using it as a means to really understand content marketing outside of SEO. Conclusion: If duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (that obviously won't rank) still help SEO across a domain? This may sound controversial, and I'd like an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain. TL;DR version: Is duplicate content (the same article across multiple domains) considered fresh content on an individual domain? Thanks so much, Cole
White Hat / Black Hat SEO | ColeLusby
-
Rank Drop Possibly due to links but no warning in GWT
Hello, We've been experiencing a rank drop in all major keywords for the past 9 months. I've had different people say different things here at Moz about how backlinks affect ranking drops. Brilliant answers, but different opinions. Nothing is showing up in GWT for this site. Here's the backlink breakdown: 72 linking root domains. 20 of those are blogs. These blogs have no backlinks in and of themselves, and were created originally as easy links. Not white hat stuff. Three additional root domains are still paid links in this profile, though all but one was made to look editorial. The one that doesn't look editorial has links sprinkled throughout their website, among other paid links. The rest of the linking root domains (49) are legitimate. Again, nothing shows up in GWT. We had 96 root domains last March, but in March of 2013 we cut most of the paid links and half (20) of the blogs. This brought our rankings down immediately by 2 or 3 slots, and we've been slipping ever since. I would like people to speak from experience and let me know if you think the backlinks could be causing the ranking drop and what to do about it. Thanks!
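As a side note, a quick way to reproduce this kind of "linking root domains" breakdown from an exported link list is sketched below. It assumes Python and a hypothetical one-URL-per-line export file (latest_links.txt); the root-domain logic is a simple heuristic, not a full public-suffix lookup.

```python
from urllib.parse import urlparse
from collections import Counter

def root_domain(url):
    """Collapse a backlink URL to a rough root domain (naive two-label heuristic)."""
    host = urlparse(url).netloc.lower().split(":")[0]
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

# Hypothetical export: one backlink URL per line, e.g. a "links to your site" download.
with open("latest_links.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

counts = Counter(root_domain(u) for u in urls)
print(f"{len(counts)} linking root domains")
for domain, n in counts.most_common(10):
    print(f"{domain}: {n} links")
```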
White Hat / Black Hat SEO | BobGW
-
Pagination for Search Results Pages: Noindex/Follow, Rel=Canonical, Ajax Best Option?
I have a site with paginated search result pages. What I've done is noindex/follow them, and I've placed the rel=canonical tag on page 2, page 3, page 4, etc., pointing back to the main/first search result page. These paginated search result pages aren't visible to the user (since I'm not technically selling products, just providing different images to the user), and I've added a text link at the bottom of the first/main search result page that says "click here to load more"; once clicked, it automatically lists more images on the page (Ajax). Is this a proper strategy? Also, for a site that does sell products, would simply noindexing/following the paginated search result pages and placing the canonical tag on them pointing back to the main search result page suffice? I would love feedback on whether this is a proper method/strategy to keep Google happy. Side question: when the robots go through a page that is noindexed/followed, are they taking into consideration the text on those pages, page titles, meta tags, etc., or are they only worrying about the actual links within that page and passing link juice through them all?
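To make the described setup concrete, here is a minimal sketch of the tags such a paginated result page might carry; the URLs are hypothetical, not taken from the question.

```html
<!-- Hypothetical paginated result page: https://example.com/search/widgets/page/2/ -->
<head>
  <!-- Keep the paginated page out of the index but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <!-- Point the paginated page back at the main/first result page -->
  <link rel="canonical" href="https://example.com/search/widgets/">
</head>
```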
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Creating a duplicate site for testing purposes: can it hurt the original site?
Hello, We are soon going to upgrade the CMS to the latest version along with new functionalities; the process may take anywhere from 4 to 6 weeks. Since we need to work on a live server, what we have planned is to take an exact replica of the site and move it to a test domain, still on the live server, and block Google, Bing, and Yahoo in robots.txt (User-agent: Google Disallow: /, User-agent: Bing Disallow: /, User-agent: Yahoo Disallow: /). We will upgrade the CMS and add functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain. The upgrade process and new tools may take 1 to 1.5 months. The concern is that, despite blocking Google, Bing, and Yahoo through the user-agent disallows, the URLs could still be crawled by the search engines; if so, it may hurt the original site, since the test site will read as an entire duplicate. Or is there an alternative way around this? Many thanks
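For reference, a minimal sketch of a robots.txt that blocks all crawlers from the test copy is shown below. The question's directives use "Google", "Bing", and "Yahoo" as user-agent tokens, while the crawlers actually identify themselves as Googlebot, Bingbot, and Slurp, so a wildcard group is the simpler option. Keep in mind that robots.txt only blocks crawling; a blocked URL can still end up indexed if other sites link to it.

```
# Hypothetical robots.txt for the test copy only, not the live domain
# Blocks every well-behaved crawler from the whole site
User-agent: *
Disallow: /
```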
White Hat / Black Hat SEO | Modi
-
Homepage bombed from rankings 2
I've had some varying advice on here regarding the best way to proceed with [I'll PM the URL], which was hit by Penguin 2.0. There were previous issues with the homepage, and before the 22nd we had started creating decent new links. Some have suggested ditching the domain and starting again. There are several reasons not to, and branding is the deciding factor at this stage. I'm going down the route of initially trying to manually remove links and then following up with a disavow. I would really appreciate another pair of eyes taking a quick look to see if I'm missing anything other than a dodgy link profile.
White Hat / Black Hat SEO | MickEdwards
-
Do bad links "hurt" your ranking, or just not add any value?
Do bad links "hurt" your ranking, or do they just not add any value? By this I mean: if you have links from link farms and bad neighbourhoods, would they effectively pull you down in search engine rankings, or is it more that it's just a waste of time to get these links because they add no value to your ranking? Is Google saying avoid them because they will not have a positive effect, or avoid them because they will have a negative effect? I am of the opinion that they will not harm, but they will not help either. I think this because, at the end of the day, you are not 100% in control of your inbound links; any bad site could link to you, and if a competitor, god forbid, wanted to play some black hat games, couldn't they just add you to thousands of bad sites to pull your ranking down? Interested to hear your opinions on the matter, or any "facts" if they are out there.
White Hat / Black Hat SEO | esendex