Duplicate Content Question
-
I have a client that operates a local service-based business. They are thinking of expanding that business to another geographic area (several hours' drive away, in an affluent summer vacation area). The name of the existing business contains the name of the city, so it would not be well suited to marketing the 'City X' business in 'City Y'. My initial thought was to (for the most part) 'duplicate' the existing site onto a new site (brand new root domain). Much of the content would be exactly the same. We could reword some things so there aren't entire lengthy paragraphs of identical info, but it seems pointless to completely reinvent the wheel. We'll get as creative as possible, but certain things just wouldn't change. This seems like the most pragmatic approach given their goals, but I'm worried about duplicate content. It doesn't feel as though this is spammy, though, so I'm not sure if there's cause for concern.
-
Most plumbing companies just can't devote the resources to rebrand. What you said was a permutation of what I said, so I can't disagree. Though getting a smaller business to rebrand is terribly difficult.
We all know the benefits. But are there any tactics you know of that would help?
-
Yes, it's a pickle. Here's what we're dealing with:
1. You get a new business name for the second business, or
2. You completely change the business name so that it's appropriate for both locations.
Now that I think about it, I like option #2. It's more work, but in the long run the branding benefit of having the same name recognized in both cities will likely pay off in some way.
Regardless, let's assume #2 isn't an option. I really want it to be, but let's say a name change is off the table. Then you need 2 business names, and presumably 2 websites.
In this case, duplicate content is likely going to hurt you. Each business should be unique. Because this is essentially local SEO, you can lean on quality citation building and unique reviews to set each site apart.
In theory, you could get away with it if all your other SEO signals were strong, but you still have the challenge of starting all that work from scratch, and maintaining two sites.
Still, I'd rather you change the business name for both businesses, and run the entire operation from a single site with multiple location listings.
-
I hear ya. I don't know if you're from the western NY area or not, but not living too far from there myself, I know that 'Seneca' means more to people than just the tiny city in NY. The first thing that comes to mind for someone in Buffalo when they hear 'Seneca' is _not_ the city.
This is turning into more of a discussion about marketing in general than a discussion about duplicate content.
This particular situation is unique. It's not a franchise. It's not a company expanding into a neighbouring city. In order to _not_ change the name while expanding into this secondary market, an awful lot of money would have to be invested in branding, which just isn't worth it.
-
There's a Seneca Plumbing ranking organically on the first page for 'buffalo plumbing'. That has to be worth something. Seneca is nearly two hours away, but they have a Buffalo address and what appears to be a local number.
If it really became an issue, they could possibly rebrand. It's expensive, I know, especially for the average plumbing company's marketing budget. But if it becomes evident that their current brand is harming their expansion, it's something they should think about now.
It wouldn't be too hard to redirect the old domain to the new branded domain. Even though exact match domains are going down as a ranking factor, it wouldn't hurt to be XYZplumber.com/buffalo.
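The mechanics of that redirect are simple. A minimal sketch of what the old city-branded domain's .htaccess might contain, assuming Apache with mod_rewrite (the domain names here are placeholders, not the client's actual domains):

```apache
# Permanently redirect every URL on the old city-branded domain
# to the matching path under the new brand's location subfolder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?albanyplumbers\.com$ [NC]
RewriteRule ^(.*)$ http://www.xyzplumber.com/albany/$1 [R=301,L]
```

Page-to-page redirects like this (rather than sending everything to the new homepage) preserve the most link equity, so it's worth mapping old URLs to their closest new equivalents.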
-
I agree with all of that, but I feel this situation is very different. If you have a company called 'Albany Plumbers', it seems kind of weird to market 'Albany Plumbers' in Buffalo. That's basically what I'm dealing with. Client has the city name in their business name. Users are not going to think the site is relevant in this secondary market they're looking to target.
-
Look at it this way: what would be an easier and more effective use of your time and your client's budget? Would it be diverting your efforts from one site in order to support two? Whenever this particular question is asked, the prevailing answer is more or less: "Don't dilute your efforts with multiple properties when you can better achieve your goal with one property."
So in general, you would want a page dedicated to that area rather than rewriting the entire site. When it comes to your local listings, you can set a service area that encompasses the target cities where the platform allows it.
You should be concerned about duplicate/thin content. Why go through all that trouble to slightly rewrite a site, just to have it come up short? Plus there are other concerns: the domain would be brand new, so you would have to get over that hump as well.
Local Landing Pages: A Guide to Great Implementation should be helpful. Miriam kind of, sort of, knows a lot about Local. : )
Related Questions
-
URL slash creating duplicate content
Hi All, I currently have an issue whereby my domain name (just the homepage) resolves at both mydomain.com and mydomain.com/. The Moz crawler flags this up as duplicate content - does anyone know of a way I can fix this? Thanks! Jack
Technical SEO | | Jack11660 -
Duplicate Content on a Page Due to Responsive Version
What are the implications if a web designer codes the content of the site twice into the page in order to make the site responsive? I can't add the url I'm afraid but the H1 and the content appear twice in the code in order to produce both a responsive version and a desktop version. This is a Wordpress site. Is Google clever enough to distinguish between the 2 versions and treat them individually? Or will Google really think that the content has been repeated on the same page?
Technical SEO | | Wagada0 -
Duplicate Title and Content. How to fix?
So this is the biggest error I have, but I don't know how to fix it. I get that I have to make the duplicates redirect to the source, but I don't know how to do that. For example, this is out of our crawl diagnostics: On The Block - Page 3 http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3 and On The Block - Page 3 http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3&s=8d631e0ac09b7a462164132b60433f98 That's just one example, but I have over 1000+ like that. How would I go about fixing that? Getting rid of the "&s=8d631e0ac09b7a462164132b60433f98"? I have GoDaddy as my domain registrar and web host. Could they fix it?
Technical SEO | | taychatha0 -
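One way to collapse session-ID duplicates like the pair above is a rewrite rule that strips the parameter with a 301. A sketch for the site's .htaccess, assuming Apache with mod_rewrite and assuming the `s=` token is always a trailing 32-character hex string, as in the example shown:

```apache
RewriteEngine On
# If the query string ends in a vBulletin-style session ID (&s=<32 hex chars>),
# 301-redirect to the same URL with that parameter removed.
RewriteCond %{QUERY_STRING} ^(.+)&s=[0-9a-f]{32}$
RewriteRule ^(.*)$ /$1?%1 [R=301,L]
```

Adding a rel="canonical" tag to the forum templates pointing at the clean URL would accomplish much the same consolidation without the redirect, if editing templates is easier than editing server config.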
Duplicate content due to credit card testing
I recently launched a site - http://www.footballtriviaquestions.co.uk - and the site uses PayPal. In order to test the PayPal functionality I set up a zapto.org domain via a permanent IP service that points directly to the computer I've written the website on. It appears that Google has now indexed the zapto.org website. Will this cause problems for my main website, as the zapto.org website will pretty much contain content that is an exact duplicate of what is held on the main website? I've looked in Google Webmaster Tools for the main website and it doesn't mention any duplicate content, but I'm currently not in the top 50 rankings for "football trivia questions" on Google despite SEOmoz ranking my home page with an A rating. The page does rank at position 16 in Yahoo and Bing. This seems odd to me, although I do have very few backlinks pointing to my site. If the duplicate content is likely to be causing me problems, what would be the best way to knock the zapto.org results out of Google?
Technical SEO | | ipr1010 -
Testing for duplicate content and title tags
Hi there, I have been getting both Duplicate Page Content and Duplicate Title Content warnings on my crawl diagnostics report for one of my campaigns. I did my research and implemented the preferred domain setting in Webmaster Tools. This did not resolve the crawl diagnostics warnings, and upon further research I discovered the preferred domain would only be noted by Google and not by other bots like Roger. My only issue was that when I ran an SEOmoz crawl test on the same domain, I saw none of the duplicate content or title warnings, yet they still appear on my crawl diagnostics report. I have now implemented a fix in my .htaccess file to 301 redirect to the www. domain. I want to check if it's worked, but since the crawl test did not show the issue last time I don't think I can rely on that. Can you help please? Thanks, Claire
Technical SEO | | SEOvet0 -
Url rewrites / shortcuts - Are they considered duplicate content?
When creating a URL rewrite or shortcut, does this create duplicate content issues? Does it split your rankings/authority with Google and other search engines? Scenario 1: www.whatthehellisahoneybooboo.com/dqotd/ -> www.whatthehellisahoneybooboo.com/08/12/2012/deep-questions-of-the-day.html Scenario 2: bitly.com/hbb -> www.whatthehellisahoneybooboo.com/08/12/2012/deep-questions-of-the-day.html (or, to make it more complicated, directs to the scenario 1 rewrite URL mentioned above) www.whatthehellisahoneybooboo.com/dqotd/ *Note well: there's no server-side access, so mentions of optimizing .htaccess are useless in this situation. To be clear, I'm only referring to rewrites, not redirects... just trying to understand the implications of rewrites. Thanks!
Technical SEO | | seosquared0 -
I have a ton of "duplicated content", "duplicated titles" in my website, solutions?
hi and thanks in advance, I have a Jomsocial site with 1000 users. It is highly customized, and as a result of the customization some of the pages have 5 or more different types of URLs pointing to the same page. Google has indexed 16,000 links already and the crawl report shows a lot of duplicated content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so a big part of these links don't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links: http://anxietysocialnet.com/profile/edit-profile/salocharly http://anxietysocialnet.com/salocharly/profile http://anxietysocialnet.com/profile/preferences/salocharly http://anxietysocialnet.com/profile/salocharly http://anxietysocialnet.com/profile/privacy/salocharly http://anxietysocialnet.com/profile/edit-details/salocharly http://anxietysocialnet.com/profile/change-profile-picture/salocharly So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
Technical SEO | | Salocharly0 -
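The rel="canonical" link element that Google's quoted guidance recommends is usually the least disruptive fix for URL variants like those in the question above. A sketch of the tag, which would go in the `<head>` of each duplicate profile URL and point at the one version you want indexed (the href shown is simply taken from the example list in the question):

```html
<link rel="canonical" href="http://anxietysocialnet.com/profile/salocharly" />
```

With this in place, all the URL variants can keep working for users and for site functionality, while search engines consolidate indexing signals onto the canonical version instead of treating each variant as a separate page.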
Solution for duplicate content not working
I'm getting a duplicate content error for: http://www.website.com http://www.website.com/default.htm I searched the Q&A for the solution and found: Access the .htaccess file and add this line: redirect 301 /default.htm http://www.website.com I added the redirect to my .htaccess and then got the following error from Google when trying to access the http://www.website.com/default.htm page: "This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer." "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects." How can I correct this? Thanks
Technical SEO | | Joeuspe0
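A common way out of that loop (a sketch, assuming Apache with mod_rewrite, and assuming the loop happens because DirectoryIndex internally serves default.htm for /, which the Redirect directive then matches again) is to redirect only when the client's original request line explicitly named /default.htm:

```apache
RewriteEngine On
# Only fire when the browser's original request line names default.htm,
# not when the server internally maps / to default.htm via DirectoryIndex.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /default\.htm[\s?]
RewriteRule ^default\.htm$ http://www.website.com/ [R=301,L]
```

Checking %{THE_REQUEST} breaks the cycle because it reflects what the browser actually asked for, whereas the plain Redirect directive also matched the server's own internal mapping of / to default.htm.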