Duplicate Content Question
-
I have a client that operates a local service-based business. They're thinking of expanding that business to another geographic area (a drive of several hours away, in an affluent summer vacation area). The name of the existing business contains the name of the city, so it wouldn't be well suited to marketing a 'City X' business in 'City Y'. My initial thought was to (for the most part) 'duplicate' the existing site onto a new site (brand new root domain). Much of the content would be exactly the same. We could reword some things so there aren't entire lengthy paragraphs of identical info, but it seems pointless to completely reinvent the wheel. We'll get as creative as possible, but certain things just wouldn't change. This seems like the most pragmatic approach given their goals, but I'm worried about duplicate content. It doesn't feel spammy, though, so I'm not sure whether there's cause for concern.
-
Most plumbing companies just can't devote the resources to rebrand. What you said is a permutation of what I said, so I can't disagree, though getting a smaller business to rebrand is terribly difficult.
We all know the benefits. But are there any tactics you know of that would help?
-
Yes, it's a pickle. Here's what we're dealing with. Either:
1. You need a new business name for the second business; or
2. You completely change the business name so that it's appropriate for both locations.
Now that I think about it, I like option #2. It's more work, but in the long run the branding benefit of having the same name recognized in both cities will likely pay off in some way.
Regardless, let's assume #2 isn't an option. I really want it to be, but let's say a name change is off the table. Then you need two business names, and presumably two websites.
In this case, duplicate content is likely going to hurt you. Each business's site should be unique. Because this is essentially local SEO, you can lean on the quality of your citation building and unique reviews.
In theory, you could get away with it if all your other SEO signals were strong, but you still have the challenge of starting all that work from scratch, and of maintaining two sites.
Still, I'd rather you change the business name for both businesses, and run the entire operation from a single site with multiple location listings.
-
I hear ya. I don't know if you're from the western NY area or not, but living not too far from there, I know that 'Seneca' means more to people than just the tiny city in NY. The first thing that comes to mind for someone in Buffalo when they hear 'Seneca' is _not_ the city.
This is turning into more of a discussion about marketing in general than a discussion about duplicate content.
This particular situation is unique. It's not a franchise. It's not a company expanding into a neighbouring city. In order to _not_ change the name while expanding into this secondary market, an awful lot of money would have to be invested in branding, which just isn't worth it.
-
There's a Seneca Plumbing ranking organically on the first page for 'buffalo plumbing'. That has to be worth something. Seneca is nearly two hours away, but they have a Buffalo address and what appears to be a local number.
If it really became an issue, they could possibly rebrand. It's expensive, I know, especially for the average plumbing company's marketing budget. But if it becomes evident that their current brand is harming their expansion, it's something they should think about now.
It wouldn't be too hard to redirect the old domain to the new branded domain. Even though exact-match domains are declining as a ranking factor, it wouldn't hurt to be XYZplumber.com/buffalo.
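If the rebrand route is ever taken, the redirect itself is mechanically simple. Here's a minimal sketch using Apache mod_rewrite, assuming hypothetical domains ('albanyplumbers.com' for the old city-branded site, 'xyzplumber.com' for the new brand) rather than anyone's actual sites:

```apache
# Hypothetical .htaccess on the old city-branded domain.
# Sends every URL to the matching path under the new brand's
# city subfolder with a permanent (301) redirect, which is the
# signal search engines treat as a site move.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?albanyplumbers\.com$ [NC]
RewriteRule ^(.*)$ https://www.xyzplumber.com/albany/$1 [R=301,L]
```

A page-to-page 301 like this (rather than redirecting everything to the new homepage) is generally what you'd want, since it carries each old URL's equity to its closest equivalent.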
-
I agree with all of that, but I feel this situation is very different. If you have a company called 'Albany Plumbers', it seems kind of weird to market 'Albany Plumbers' in Buffalo. That's basically what I'm dealing with: the client has the city name in their business name. Users are not going to think the site is relevant in this secondary market they're looking to target.
-
Look at it this way: what would be an easier and more effective use of your time and your client's budget? Would it be diverting your efforts from one site in order to support two? Whenever this particular question is asked, the prevailing answer is more or less: "Don't dilute your efforts with multiple properties when you can better achieve your goal with one property."
So in general, you would want a page dedicated to that area rather than rewriting the entire site. When it comes to your local listings, you can likely set a service area that encompasses the target cities where possible.
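On a dedicated area page, it also helps to mark the location up with structured data so search engines can tie the page to its service area. A minimal sketch of schema.org LocalBusiness JSON-LD (using the `Plumber` subtype), where every name, address, phone number, and URL is a made-up placeholder, not real business data:

```json
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "XYZ Plumbing - Buffalo",
  "url": "https://www.xyzplumber.com/buffalo",
  "telephone": "+1-716-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Buffalo",
    "addressRegion": "NY",
    "postalCode": "14201"
  },
  "areaServed": "Buffalo, NY"
}
```

This would go in a `<script type="application/ld+json">` tag on the city landing page, with the details matching that location's Google local listing.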
You should be concerned about duplicate/thin content. Why go through all that trouble to slightly rewrite a site, just to have it come up short? Plus there are other concerns: the domain would be brand new, so you would have to get over that hump as well.
Local Landing Pages: A Guide to Great Implementation should be helpful. Miriam kind of, sort of, knows a lot about Local. : )