Duplicate Content - Local SEO - 250 Locations
-
Hey everyone,
I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages, all with identical content (the same 500-750 words) with the exception of unique metadata and NAP details (each respective location's name, city, state, etc.).
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side effects occur because search engines don't know which page to serve when there are duplicates.
So here's my question:
If someone searches for my client's services in Miami, and my client only has one location in that city, does duplicate content matter? Because that location isn't competing against any of my client's other locations locally, search engines shouldn't be confused about which page to serve, correct?
Of course, in other cities, like Phoenix, where they have 5 locations, then I'm sure the duplicate content is negatively affecting all 5 locations.
I really appreciate any insight!
Thank you,
-
** I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.**
If Google sees pages on your site that are substantially duplicate, it will filter all but one of them from the SERPs.
** is it even possible to re-write the same 750 word service page "uniquely" 250 times? Ha.**
Yes. The reward is enormous. Ha.
-
Hey There!
In my view, the client has two options here:

1. Spring for unique content on the 250 sites, or

2. Reconsider the decision against bringing everything into a single site. The question you've asked (can you really write about the identical service 250 times?) is exactly why this strategy is cumbersome. Ideally, you'd have a good handful of unique pages describing the benefits of the service, plus 250 semi-unique pages on the website, one for each physical location.
-
-
Hi SEO Team @ G5!
Since you are unable to create one large domain that houses all of the locations, I would attempt to make each of the websites as "unique" as possible. But keep in mind that unique content doesn't necessarily mean that you need to completely reword the content in different ways 250 times. Small changes can make a big difference.
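To get a rough sense of how "duplicate" two location pages actually look before and after those small changes, you can compare their copy programmatically. Here's an illustrative sketch using only Python's standard library; the page text is hypothetical placeholder copy, not anything from the client's sites:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0-1 similarity ratio between two blocks of page copy,
    compared word by word (case-insensitive)."""
    return SequenceMatcher(
        None, text_a.lower().split(), text_b.lower().split()
    ).ratio()

# Hypothetical service-page copy for two locations
miami = "Our Miami team installs and repairs garage doors across Miami-Dade."
phoenix = "Our Phoenix team installs and repairs garage doors across Maricopa County."

print(f"Similarity: {similarity(miami, phoenix):.2f}")
```

A ratio near 1.0 means the pages are nearly word-for-word identical; the lower you can push that number with genuinely local details, the less likely the pages are to be clustered together as duplicates.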
There's a great (and short) video in which Google's Matt Cutts talks about how Google handles duplicate content. There's also another helpful video about it here.
Matt Cutts has said, "Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, 'OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out,' and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, 'OK, I want to see every single page' and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen."
Read more from this article here: https://searchenginewatch.com/sew/news/2319706/googles-matt-cutts-a-little-duplicate-content-wont-hurt-your-rankings
With this in mind, I do think your assumption is correct. If you make sure that any locations that could be seen as competing with each other have unique content, the rest won't necessarily be dinged for duplication. Unless you're trying to rank nationally, this shouldn't be a major problem for each individual website targeting a different location.
-
Thanks for your response. We would love to move to a single-domain, but unfortunately the client won't allow us to make that change.
I agree that ideally all 250 locations would have unique content, but I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.
Also, my other concern is: is it even possible to rewrite the same 750-word service page "uniquely" 250 times? Ha.
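One practical middle ground, rather than hand-rewriting the page 250 times, is the "semi-unique" approach: keep a shared base description but merge genuinely local details into each location's page. A minimal sketch of the idea, with entirely hypothetical location data and template wording:

```python
# Minimal sketch of a "semi-unique" location-page generator.
# All names, landmarks, and phone numbers below are hypothetical; in
# practice each record would carry real local details (staff, landmarks,
# reviews, service notes) so the pages actually differ.

TEMPLATE = (
    "{name} provides garage door repair in {city}, {state}. "
    "Serving neighborhoods near {landmark}, our {city} team "
    "offers same-day service. Call {phone} to schedule."
)

locations = [
    {"name": "Acme Doors Miami", "city": "Miami", "state": "FL",
     "landmark": "Bayfront Park", "phone": "(305) 555-0100"},
    {"name": "Acme Doors Phoenix", "city": "Phoenix", "state": "AZ",
     "landmark": "Papago Park", "phone": "(602) 555-0101"},
]

for loc in locations:
    print(TEMPLATE.format(**loc))
```

Note that swapping city names into a template is exactly the thin, near-duplicate pattern to avoid on its own; this only works if the per-location data carries real, substantive local information.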
-
I would also make them into one big website.
But at the same time, I would have full unique content for each of the 250 locations. I know that sounds like a huge expense and a lot of work, but any company who has the resources to support 250 locations can support the small expense of unique content for each of them.
-
I completely understand where you are coming from, but I can only advise that you scrap all of the individual sites and make them into one big website. I know that's easier said than done, and there were most likely complications that prevented them from doing it in the first place, but it really is the best thing to do.
I do believe that the duplication will still matter, even if you only have one office/store in that location.