Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could we rank for each page in each location (for example, people in New York searching for page-a would see our page, and people in San Francisco searching for page-a would see theirs)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you may consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
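As a rough sketch of that redirect approach (assuming an Apache server with mod_rewrite enabled, and using the example domains from this thread), a permanent redirect from a retired location domain to the kept domain might look like this:

```apache
# .htaccess on the retired domain (e.g. examplesanfrancisco.com)
# 301-redirect every URL to the equivalent path on the domain you're keeping,
# so existing links and rankings are consolidated rather than lost.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?examplesanfrancisco\.com$ [NC]
RewriteRule ^(.*)$ https://examplenewyork.com/$1 [R=301,L]
```

The path-preserving `$1` assumes the two sites share a URL structure (as in the examplenewyork.com/page-a vs. examplesanfrancisco.com/page-a example); if they don't, you'd map individual URLs instead.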
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you somewhat of a different spin on it.
It's completely understandable that anyone with a website would feel that the content applicable to one city would also apply to another city, so what's the harm in just switching out the city names? There shouldn't be any, really, and in most cases there is no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do the same as you've done--thus competition for keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of it, and in doing so rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as unique entities with unique values to each of your target markets. That way, content for each page ends up being unique and Google's algorithm can prioritize all the competitors' pages uniformly according to how relevant and valuable they are to the target audience.
-
Hey Cole
-
The more you do change, the less risk involved. Some might tell you that if you change the content enough to pass "copyscape" or other online plagiarism tools, that would protect you from a penalty. I find that to be slightly ridiculous - why would Google judge by those external standards? The more you can change, the better in my opinion (but I can totally sympathise with the work that entails)
-
Google will know you own the websites if you link them together, share GA code, host them together, list the same company details and so on - but my question is why would you want to do that? I think if you tried to tell Google you owned all the sites they would come at you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you only use one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. The downside is that getting Google local listings for each page/location can be a bit of a pain when the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
-
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it a matter of swapping out a few location-based words, or more a matter of rewriting the content on each page? I guess my question deals with the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
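(For reference, hreflang annotations look roughly like the sketch below, using the example domains from this thread. Note that hreflang declares language/region variants of equivalent content - it isn't an ownership signal, so it wouldn't distinguish two US-English city sites.)

```html
<!-- In the <head> of examplenewyork.com/page-a -->
<link rel="alternate" hreflang="en-us" href="https://examplenewyork.com/page-a" />
<!-- Each variant lists all alternates, including itself; a second en-us entry
     for examplesanfrancisco.com/page-a would be invalid, because hreflang
     values must be unique language/region pairs -->
```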
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again in the future), but you're leaving yourself wide open to a Panda penalty and, as such, I'd highly recommend that you cater each site and each landing page to your particular audience. By doing that, not only will you make the content unique, but you'll dramatically improve your chances of ranking by mentioning local things on a local page.
Give each page unique copy and really tailor it to your local audience.
Hope this helps.