Duplicate Content - Local SEO - 250 Locations
-
Hey everyone,
I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages. All of them have identical content (the same 500-750 words), with the exception of unique metadata and NAP (name, address, phone) reflecting each respective location's name, city, state, etc.
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side effects occur because search engines don't know which page to serve when duplicates exist.
So here's my question:
If someone searches for my client's services in Miami, and my client has only one location in that city, does duplicate content matter? That location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct?
Of course, in other cities, like Phoenix, where they have five locations, I'm sure the duplicate content is negatively affecting all five.
I really appreciate any insight!
Thank you,
-
**I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.**
If Google sees pages on your site that are substantially duplicate, it will filter all but one of them from the SERPs.
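To put a rough number on "substantially duplicate," one quick back-of-the-envelope check is shingle overlap between two pages. Below is a minimal Python sketch of that general near-duplicate idea; the file names and the closing remark about scores are illustrative assumptions, not anything Google has published about its own scoring.

```python
# Minimal sketch: gauge how "substantially duplicate" two service pages
# are by comparing their k-word shingles with Jaccard similarity.
# File names below are hypothetical.

def shingles(text: str, k: int = 5) -> set[str]:
    """Return the set of overlapping k-word windows in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = open("miami-service-page.txt").read()
page_b = open("phoenix-service-page.txt").read()

print(f"Shingle overlap: {jaccard(page_a, page_b):.0%}")
# Pages that differ only in NAP details will score close to 100%;
# genuinely rewritten pages tend to land far lower.
```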
**Is it even possible to re-write the same 750-word service page "uniquely" 250 times? Ha.**
Yes. The reward is enormous. Ha.
-
Hey There!
In my view, the client has 2 options here:
- Spring for unique content on the 250 sites.
- Reconsider his decision and bring everything into a single site. The question you've asked (can you really write about the identical service 250 times?) is exactly why he should see that his strategy is cumbersome. Ideally, you'd have a good handful of unique pages describing the benefits of the service, and then 250 semi-unique pages on the website, one for each physical location. (A sketch of the redirect mechanics follows below.)
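On the mechanical side of that consolidation, the bulk of the work is a redirect map: each old standalone domain 301s to its location page on the single site. Here's a rough Python sketch that emits nginx-style rules; the domain, the CSV layout, and the /locations/ path scheme are all hypothetical.

```python
# Hypothetical sketch: emit one 301 redirect rule per old location site,
# pointing it at that location's page on the consolidated domain.
# locations.csv (columns: old_domain,city,state) and the URL scheme
# are assumptions for illustration.
import csv

NEW_SITE = "https://www.example-brand.com"

with open("locations.csv", newline="") as f:
    for row in csv.DictReader(f):
        slug = f"{row['city']}-{row['state']}".lower().replace(" ", "-")
        # One nginx server block per old domain, 301ing to the new page
        print(
            f"server {{\n"
            f"    server_name {row['old_domain']};\n"
            f"    return 301 {NEW_SITE}/locations/{slug}/;\n"
            f"}}\n"
        )
```

A blanket domain-level redirect like this is the blunt version; if the old service pages have meaningful inbound links, a per-URL map is worth the extra effort.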
-
Hi SEO Team @ G5!
Since you are unable to create one large domain that houses all of the locations, I would attempt to make each of the websites as "unique" as possible. Keep in mind that unique content doesn't necessarily mean you need to completely reword the content in 250 different ways; small changes can make a big difference.
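As a loose illustration of "small changes" that still amount to real differentiation, you can build each service page around slots that can only be filled with genuinely local facts, rather than spun synonyms. A minimal sketch, where every field name and all the example data are assumptions:

```python
# Illustrative only: a base service page with slots for genuinely
# location-specific details (staff, neighborhood, local specifics).
# The aim is structured local differentiation, not automated spinning.
from string import Template

SERVICE_PAGE = Template("""\
<h1>$service in $city, $state</h1>
<p>Our $city team, led by $manager, has served the $neighborhood
area since $year. $local_detail</p>
""")

miami = {
    "service": "Widget Repair",
    "city": "Miami",
    "state": "FL",
    "manager": "A. Rivera",
    "neighborhood": "Wynwood",
    "year": "2012",
    "local_detail": "Ask about our hurricane-season maintenance checks.",
}

print(SERVICE_PAGE.substitute(miami))
```

The templating itself is trivial; the value is that each slot forces a fact the other 249 pages can't share.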
There's a great (and short) video in which Google's Matt Cutts talks about how Google handles duplicate content, along with another helpful video on the same topic.
Matt Cutts has said, "Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it’s just one piece of content. So most of the time, suppose we’re starting to return a set of search results and we’ve got two pages that are actually kind of identical. Typically we would say, “OK, rather than show both of those pages since they’re duplicates, let’s just show one of those pages and we’ll crowd the other result out,” and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, “OK, I want to see every single page” and then you’d see that other page. But for the most part, duplicate content isn’t really treated as spam. It’s just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen."
Read more in this article: https://searchenginewatch.com/sew/news/2319706/googles-matt-cutts-a-little-duplicate-content-wont-hurt-your-rankings
With this in mind, I do think your assumption is correct. If you make sure that any locations that could be seen as competing in the same area have unique content, they won't necessarily be dinged for duplicated content. Unless you were trying to rank nationally, this shouldn't be a major problem for each individual website targeting a different location.
-
Thanks for your response. We would love to move to a single domain, but unfortunately the client won't allow us to make that change.
I agree that ideally all 250 locations would have unique content, but I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.
Also, my other concern is: is it even possible to re-write the same 750-word service page "uniquely" 250 times? Ha.
-
I would also make them into one big website.
But at the same time, I would have fully unique content for each of the 250 locations. I know that sounds like a huge expense and a lot of work, but any company that has the resources to support 250 locations can support the small expense of unique content for each of them.
-
I completely understand where you are coming from, but I can only advise that you scrap all of the individual sites and make them into one big website. I know that's easier said than done, and there are most likely some complications that prevented them from doing it in the first place, but it really is the best thing to do.
I do believe that the duplication will still matter, even if you only have one office/store in that location.