Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could we rank for each page within each location (for example, people in New York searching for page-a would see our web page, and people in San Francisco searching for page-a would see our web page)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you might consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
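If you do consolidate, it's worth sanity-checking the redirects afterwards. Here's a minimal Python sketch (purely my own illustration - it uses the example domains from the question and assumes, just for the example, that the New York domain is the one being kept) for confirming a secondary domain returns a proper 301 to the domain you keep:

```python
# Minimal sketch: confirm each secondary domain 301-redirects to the kept domain.
# Domains are the hypothetical examples from the question; swap in your real ones.
# Requires the third-party "requests" package.
import requests

PRIMARY = "https://examplenewyork.com"           # assumed "keeper" domain
SECONDARY = ["https://examplesanfrancisco.com"]  # domains being redirected

for domain in SECONDARY:
    # allow_redirects=False so we can inspect the redirect response itself
    resp = requests.get(domain, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith(PRIMARY):
        print(f"OK: {domain} -> {location}")
    else:
        print(f"Check: {domain} returned {resp.status_code} ({location or 'no Location header'})")
```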
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you somewhat of a different spin on it.
It's completely understandable that anyone with a website would feel that the content applicable to one city would also apply to another city, so what's the harm in just switching out the city names? There shouldn't be any, really, and in most cases there is no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do the same as you've done--so competition for keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of it and, in doing so, rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as unique entities with unique values to each of your target markets. That way, content for each page ends up being unique and Google's algorithm can prioritize all the competitors' pages uniformly according to how relevant and valuable they are to the target audience.
-
Hey Cole
The more you do change, the less risk involved. Some might tell you that if you change the content enough to pass Copyscape or other online plagiarism tools, you'll be protected from a penalty. I find that to be slightly ridiculous - why would Google judge by those external standards? The more you can change, the better in my opinion (but I can totally sympathise with the work that entails).
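To make the "how much is enough" point a bit more concrete, here's a rough Python sketch of the kind of text-similarity comparison tools like Copyscape approximate. To be clear, Google's duplicate detection isn't public, and the cut-off below is purely illustrative, not a safe threshold:

```python
# Rough gauge of how much two location pages overlap; the threshold is arbitrary.
from difflib import SequenceMatcher

page_a = "We serve customers across New York with same-day delivery and local support."
page_b = "We serve customers across San Francisco with same-day delivery and local support."

# ratio() ranges from 0.0 (completely different) to 1.0 (identical)
similarity = SequenceMatcher(None, page_a, page_b).ratio()
print(f"Similarity: {similarity:.0%}")

if similarity > 0.8:  # illustrative cut-off only, not a Google rule
    print("Still near-duplicates - rewrite more than just the city name.")
```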
-
Google will know you own the websites if you link them together, share GA code, host them together, list the same company details and so on - but my question is why would you want to do that? I think if you tried to tell Google you owned all the sites, they would come down on you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you only use one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. The downside of that is that getting Google local listings for each page/location can be a bit of a pain if the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it a matter of swapping a few words (location-based) or altering all of the content on each page? I guess my question deals with the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again in the future), but you're leaving yourself wide open to a Panda penalty and, as such, I'd highly recommend that you cater each site and each landing page to your particular audience. By doing that, not only will you make the content unique, but you'll also dramatically improve your chances of ranking by mentioning local details on each local page.
Give each page unique copy and really tailor it to your local audience.
Hope this helps.