Competitors and Duplicate Content
-
I'm curious to get people's opinion on this.
One of our clients (Company A) has a competitor that's using duplicate sites to rank. They're using "www.companyA.com" and "www.CompanyAIndustryTown.com" (actually, several variations). It's basically duplicate content, with maybe a town name inserted or changed somewhere on the page. I was always told this is not a wise idea. They started doing this in the past month or so, when they had a site redesign. So far, it's working pretty well for them. So, here are my questions:
-Would you address this directly (report to Google, etc.)?
-Would you ignore this?
-Do you think it's going to backfire soon?
There's another company (Company B) that's using a different practice: separate pages on their own domain targeting different towns, used as landing pages. It's similar, in that a lot of the content is the same with just the town names and minor details changed, but it's all on the same domain. Would the same apply to that?
Thanks for your insight!
-
The only long-lasting way to rank location-specific pages is to offer truly unique content on those pages and build unique links to them.
The two methods you mentioned here, using near-duplicate sites and pages, may work for a short time or in non-competitive niches. They may also work somewhat if a very strong link profile is backing them up... but in general these sorts of tricks usually result in a drop in rankings. If not now, then during an upcoming algorithm change.
Oftentimes, misguided webmasters think they are doing the right thing in launching these sites and pages, with no ill intent. Unless the pages are obviously spam or doorway pages, in my opinion it's probably not worth reporting them to Google, but that decision is of course best left to each individual.
Read more about doorway pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Consider how Yelp has hundreds of pages about dentists, at least one for every major city in America. Although the pages are similar, each is filled with unique content and has unique links pointing to it. Each delivers a similar message, but provides unique value based on that particular location.
Add unique value to each location-specific page, and you're doing great.
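For a concrete sketch of what that looks like in practice (the business names and figures below are invented for illustration), a Yelp-style city page can share a template across cities while the substance on each page stays unique to its location:

```html
<!-- Hypothetical city landing page. The template is shared across
     cities, but the data on the page exists only for this location. -->
<h1>Dentists in Austin, TX</h1>

<!-- Location-specific listings and reviews no other city page shares -->
<p>38 dentists serve the Austin area. Top rated this month:</p>
<ul>
  <li>Smile Austin Dental - 4.8 stars (212 reviews)</li>
  <li>South Congress Orthodontics - 4.7 stars (164 reviews)</li>
</ul>

<!-- Local detail that can't be copy-pasted between towns -->
<p>Most Austin practices cluster around downtown; average wait for a
   new-patient appointment is about two weeks.</p>
```

The template can stay the same from city to city; it's the data underneath that has to be unique.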
-
Unfortunately, this isn't a method likely to work.
Most of the time, if you insert canonical tags on near-duplicate pages, and Google interprets those canonicals correctly, it tends to index and rank only the page that the canonical points to. All of those other pages would then have little or no search engine visibility whatsoever.
Not a good technique if you're trying to rank individual pages.
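To illustrate with made-up URLs: if each near-duplicate town page carries a canonical pointing at one primary page, Google will generally consolidate indexing and ranking to that single URL, and the town pages themselves drop out of sight.

```html
<!-- Hypothetical near-duplicate town page:
     https://www.example.com/plumbing-springfield -->
<head>
  <title>Plumbing Services in Springfield</title>
  <!-- This canonical tells Google to index and rank /plumbing instead,
       so this town page gets little or no search visibility of its own -->
  <link rel="canonical" href="https://www.example.com/plumbing" />
</head>
```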
-
So ARE you suggesting that for local city pages you add a canonical tag pointing to the home page?
I guess I'm a little confused about this, as Adam is.
Can you explain your thoughts behind this?
-
So let me clarify then: if they have multiple pages (on the same domain) with near-duplicate content, mostly just changing the names of cities, but use rel="canonical", will they still get the SEO benefit of ranking for different towns without it being seen as duplicate content?
And then the multiple-domain situation... that's just a wait-and-see?
-
Pages with city-specific information but otherwise similar content are pretty much the perfect place for a canonical tag. If you feel they haven't been penalized, this is probably the method they are using for hosting the same content.
-
Here is an example of sites that have been using duplicate content with a few word changes:
http://www.seomoz.org/q/duplicate-exact-match-domains-flagged-by-google-need-help-reinclusion
-
Having multiple sites with duplicate content is a bad idea, as it hurts your search engine rankings. The company is likely using bad SEO practices, and sooner or later Google's bots will pick this up and the domains will get penalised.
You can report them to Google, but in most cases Google catches sites using bad SEO techniques on its own.
There is no harm in using separate pages on one domain to show the different towns they operate in, as this helps the site get found in local searches. But if that content is again duplicated with only a few words changed, Google will pick up on it.
Always remember: content is KING!