Duplicate Content, Same Company?
-
Hello Moz Community,
I am doing work for a company and they have multiple locations.
For example, examplenewyork.com, examplesanfrancisco.com, etc.
They also have the same content on certain pages within each website.
For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a
Does this duplicate content negatively impact us? Or could we rank for each page in each location's market (for example, people in New York searching for page-a would see the New York page, and people in San Francisco searching for page-a would see the San Francisco page)?
I hope this is clear.
Thanks,
Cole
-
Thanks all.
-
Sorry, I lost track of the fact that you were talking about dupe content on multiple domains, vs. on the same domain. The same logic basically applies. However, when you're talking about essentially duplicating entire domains registered to the same owner, there can be somewhat more of a risk that the original content gets discounted (or in such cases, penalized) along with the duplicate.
If you have a main site that seems to be doing OK in the search results, you may consider keeping that domain and its content, while eliminating/redirecting the other domains and revising their content for use on the domain you're keeping.
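If you do go that route, the redirect itself is straightforward. Here's a minimal sketch, assuming the retired domains run on Apache with mod_rewrite and that examplenewyork.com is the one you keep (swap in your real domains):

```apache
# .htaccess on a domain you're retiring (e.g. examplesanfrancisco.com)
# Sends every URL, path intact, to the same path on the domain you keep.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?examplesanfrancisco\.com$ [NC]
RewriteRule ^(.*)$ https://examplenewyork.com/$1 [R=301,L]
```

A permanent (301) redirect like this also lets whatever links the old domains have earned count toward the domain you keep.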
-
Chris makes a fantastic point here.
You almost need to detach "what's reasonable" from what Google wants sometimes. Chris is right - why shouldn't those two pages have the same content? But we're dealing with algorithms mainly, not reasoning.
-
Cole,
I'm going to say roughly the same thing as the soon-to-be-guru Tom but give you a somewhat different spin on it.
It's completely understandable that anyone with a website would feel that the content applicable to one city would apply to another city as well, so what's the harm in just switching out the city names? There shouldn't be any, really, and in most cases there is no actual harm in it.
However, while Google's search engine makes it possible for customers in multiple cities to seek out and find content you've "tailored" to them, it also makes it possible for other marketers to do exactly what you've done, so competition for those keywords increases dramatically. On a small scale, Google doesn't want to penalize a whole site for such practices, per se, but it does want to differentiate original content from duplicates of it and, in doing so, rank the original while discounting the duplicates.
To get around this "hurdle" you have to treat each of your pages as a unique entity with unique value to each of your target markets. That way, the content on each page ends up being unique, and Google's algorithm can rank all the competing pages consistently, according to how relevant and valuable they are to the target audience.
-
Hey Cole
The more you do change, the less risk involved. Some might tell you that if you change the content enough to pass Copyscape or other online plagiarism tools, that would protect you from a penalty. I find that to be slightly ridiculous - why would Google judge by those external standards? The more you can change, the better in my opinion (but I can totally sympathise with the work that entails).
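For what it's worth, if you just want a rough feel for how much copy two location pages still share (purely a sanity check, nothing like what Google actually runs), a quick sketch along these lines works; the two strings are hypothetical stand-ins for your page-a copy:

```python
# Rough gauge of how much copy two location pages still share.
# Purely illustrative - not how Google measures duplication.
from difflib import SequenceMatcher

# Hypothetical stand-ins for examplenewyork.com/page-a and
# examplesanfrancisco.com/page-a
new_york_copy = "We've served homeowners across New York for 20 years..."
san_francisco_copy = "We've served homeowners across San Francisco for 20 years..."

overlap = SequenceMatcher(None, new_york_copy, san_francisco_copy).ratio()
print(f"Shared copy: {overlap:.0%}")  # near 100% means you've only swapped the city name
```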
Google will know you own the websites if you link them together, share GA code, host them together, list the same company details and so on - but my question is why would you want to do that? I think if you tried to tell Google you owned all the sites they would come down on you even harder, as they could see it as you being manipulative.
To that point, others will recommend that you only use one domain and target different KWs or locations on different pages/subfolders/subdomains, as it'll look less like a link network. Downside of that is getting Google local listings for each page/location can be a bit of a pain if the pages all come from one domain.
It's not really my place to comment on your strategy and what you should/should not be doing, but suffice to say if you go with individual domains for each location, you should aim to make those domains (and their copy) as unique and independent as possible.
-
Hey Tom,
The keywords we are competing for aren't very competitive.
Two follow up questions:
1.) To what extent should we change the content? For example, is it a matter of changing a few words (location-based) or more a matter of rewriting the content on each page? I guess my question deals with the scope of the content change.
2.) Is there a way to let Google know we own all the websites? I had hreflang in mind here. This may not be possible; I just wanted to ask.
Tom, thanks so much for your help.
Cole
-
Hi Cole
That kind of duplication will almost certainly negatively impact your ability to rank.
It's the kind of dupe content that Google hates - the kind that's deliberately manipulative and used by sites just trying to rank for as many different KWs or locations as possible, without trying to give people a unique user experience.
Not to say that you couldn't possibly rank like this (I've seen it happen and will probably see it again in the future), but you're leaving yourself wide open to a Panda penalty, so I'd highly recommend that you cater each site and each landing page to your particular audience. By doing that, not only will you make the content unique, you'll also dramatically improve your chances of ranking by mentioning local details on each local page.
Give each page unique copy and really tailor it to your local audience.
Hope this helps.