Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could we rank each page for its own location (for example, someone in New York searching for page-a would see examplenewyork.com/page-a, while someone in San Francisco would see examplesanfrancisco.com/page-a)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you may consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
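If you do consolidate, the redirects themselves are straightforward. A minimal sketch for Apache's .htaccess, assuming hypothetical domains and that examplenewyork.com is the one being kept (adjust for your actual domains and server):

```apache
# In the .htaccess of the domain being retired (e.g. examplesanfrancisco.com):
# permanently (301) redirect every URL to its counterpart on the kept domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?examplesanfrancisco\.com$ [NC]
RewriteRule ^(.*)$ https://examplenewyork.com/$1 [R=301,L]
```

The 301 (permanent) status is what tells Google to pass the old pages' signals to the new URLs.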
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you a somewhat different spin on it.
It's completely understandable that anyone with a website would feel that the content applicable to one city would also apply to another city, so what's the harm in just switching out the city names? There shouldn't really be any, and in most cases there is no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do the same as you've done--thus competition for keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of the original, so that it can rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as unique entities with unique values to each of your target markets. That way, content for each page ends up being unique and Google's algorithm can prioritize all the competitors' pages uniformly according to how relevant and valuable they are to the target audience.
-
Hey Cole
The more you change, the less risk involved. Some might tell you that if you change the content enough to pass Copyscape or other online plagiarism tools, that would protect you from a penalty. I find that slightly ridiculous - why would Google judge by those external standards? The more you can change, the better in my opinion (but I can totally sympathise with the work that entails).
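On the "how much to change" point: there's no magic threshold, but before and after a rewrite you can at least measure how similar two pages' copy still is. A minimal sketch using Python's standard-library difflib (the texts and the ratio are purely illustrative - this is not any threshold Google uses):

```python
from difflib import SequenceMatcher

def page_similarity(text_a: str, text_b: str) -> float:
    """Return a rough 0.0-1.0 similarity ratio between two blocks of page copy."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical city-swapped copy: only the location name differs.
ny = "We offer the best widget repair in New York. Visit our New York shop today."
sf = "We offer the best widget repair in San Francisco. Visit our San Francisco shop today."

print(round(page_similarity(ny, sf), 2))  # a high ratio: the pages are near-duplicates
```

Tracking that ratio as you rewrite each local page gives you a rough sense of how much genuinely unique copy you've added.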
-
Google will know you own the websites if you link them together, share GA code, host them together, use the same company details and so on - but my question is: why would you want to do that? I think if you tried to tell Google you owned all the sites, they would come after you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you only use one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. Downside of that is getting Google local listings for each page/location can be a bit of a pain if the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
-
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it a matter of swapping a few words (location-based), or is it more a matter of rewriting the content on each page? I guess my question deals with the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
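(For reference, the markup I'm thinking of looks like the sketch below - hypothetical URLs. I realize hreflang is designed to declare language/country variants of a page rather than ownership, which may be why it doesn't fit our city-based setup.)

```html
<!-- Hypothetical: hreflang declares language/region alternates of the same page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/page-a" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page-a" />
```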
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again), but you're leaving yourself wide open to a Panda penalty and, as such, I'd highly recommend that you tailor each site and each landing page to its particular audience. By doing that, not only will you make the content unique, but you'll also dramatically improve your chances of ranking by mentioning local details on each local page.
Give each page unique copy and really tailor it to your local audience.
Hope this helps.