Duplicate Content - Local SEO - 250 Locations
-
Hey everyone,
I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages, all with identical content (the same 500-750 words) except for unique metadata and NAP (name, address, phone), which carry each respective location's name, city, state, etc.
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side effects arise because search engines don't know which page to serve when there are duplicates.
So here's my question:
If someone searches for my client's services in Miami, and my client only has one location in that city, does duplicate content matter? That location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct?
Of course, in other cities, like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 of them.
I really appreciate any insight!
Thank you,
-
**I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.**
If Google sees pages on your site that are substantially duplicate, it will filter all but one of them from the SERPs.
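Google's clustering is proprietary, but if you want a rough do-it-yourself gauge of how "substantially duplicate" two service pages are, a shingle-based Jaccard comparison is a common approximation. This is just a sketch under that assumption; the two sample strings are placeholders for real page copy:

```python
def shingles(text: str, k: int = 5) -> set:
    """Break text into overlapping k-word 'shingles' (word n-grams)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(page_a: str, page_b: str, k: int = 5) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Placeholder copy -- swap in the real service-page text from two locations.
miami = "Our Miami team offers fast, friendly service with upfront pricing across the metro area."
phoenix = "Our Phoenix team offers fast, friendly service with upfront pricing across the metro area."

score = jaccard_similarity(miami, phoenix)
print(f"Shingle overlap: {score:.0%}")  # scores near 100% suggest the pages would be clustered
```

If two location pages score very high on a check like this (say, above 80-90%), it's a safe bet Google would fold them into one cluster and show only one.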
**Is it even possible to re-write the same 750 word service page "uniquely" 250 times? Ha.**
Yes. The reward is enormous. Ha.
-
Hey There!
In my view, the client has 2 options here:
1. Spring for unique content on the 250 sites.
2. Reconsider bringing everything into a single site. The question you've asked (can you really write about the identical service 250 times?) is exactly why he should recognize that his strategy is cumbersome. Ideally, you'd have a good handful of unique pages describing the benefits of the service and would then have 250 semi-unique pages on the website, one for each physical location (see the sketch below).
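If the client ever does green-light consolidation, the mechanical core of the migration is a one-to-one map from each old location site to its new page on the single domain, each served as a 301 redirect. Here's a minimal sketch of building that map; the domains, slugs, and URL pattern are invented for illustration:

```python
import csv
import sys

# Hypothetical target: one primary domain with a page per physical location.
NEW_DOMAIN = "https://www.example-brand.com"

# Hypothetical mapping: old standalone site -> location slug on the new domain.
old_sites = {
    "https://miami.example-brand.com/": "miami-fl",
    "https://phoenix-central.example-brand.com/": "phoenix-az-central",
    "https://phoenix-north.example-brand.com/": "phoenix-az-north",
}

writer = csv.writer(sys.stdout)
writer.writerow(["old_url", "new_url", "redirect_type"])
for old_root, slug in old_sites.items():
    writer.writerow([old_root, f"{NEW_DOMAIN}/locations/{slug}/", "301"])
```

The resulting spreadsheet doubles as the implementation spec for whoever configures the redirects and as the QA checklist after launch, so link equity consolidates instead of evaporating.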
-
Hi SEO Team @ G5!
Since you are unable to create one large domain that houses all of the locations, I would try to make each of the websites as "unique" as possible. Keep in mind, though, that unique content doesn't necessarily mean you need to completely reword the content in different ways 250 times; small changes can make a big difference.
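To make the "small changes" idea concrete, here's a rough sketch of generating semi-unique service pages from one shared template plus genuinely local facts (staff, neighborhoods, local notes). Every name, number, and the "HVAC repair" service here are invented for illustration; the thread never says what the client actually does:

```python
from string import Template

# Shared service copy stays constant; the local facts vary per location.
PAGE = Template(
    "Looking for $service in $city, $state? Our $city team, led by "
    "$manager, has served the $neighborhood area since $year. "
    "$local_note Call $phone to schedule a visit."
)

locations = [
    {"city": "Miami", "state": "FL", "manager": "Ana Reyes", "year": "2009",
     "neighborhood": "Brickell", "phone": "(305) 555-0101",
     "local_note": "Ask about our hurricane-season maintenance specials."},
    {"city": "Phoenix", "state": "AZ", "manager": "Sam Ortiz", "year": "2014",
     "neighborhood": "Arcadia", "phone": "(602) 555-0188",
     "local_note": "We offer early-morning appointments to beat the heat."},
]

for loc in locations:
    print(PAGE.substitute(service="HVAC repair", **loc))
    print("---")
```

Token-swapping city names alone won't get past a duplicate filter; it's the locally sourced sentences (the local note, testimonials, area details) that push each page from "boilerplate with a city name" toward genuinely unique.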
There's a great (and short) video in which Google's Matt Cutts talks about how Google handles duplicate content, and another helpful video about it here.
Matt Cutts has said, "Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it’s just one piece of content. So most of the time, suppose we’re starting to return a set of search results and we’ve got two pages that are actually kind of identical. Typically we would say, “OK, rather than show both of those pages since they’re duplicates, let’s just show one of those pages and we’ll crowd the other result out,” and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, “OK, I want to see every single page” and then you’d see that other page. But for the most part, duplicate content isn’t really treated as spam. It’s just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen."
Read more from this article here: https://searchenginewatch.com/sew/news/2319706/googles-matt-cutts-a-little-duplicate-content-wont-hurt-your-rankings
With this in mind, I do think your assumption is correct. If you make sure that any locations that could be seen as competing in the same area have unique content, they won't necessarily be dinged for duplicated content. Unless you were trying to rank nationally, this shouldn't be a major problem for each individual website targeting a different location.
-
Thanks for your response. We would love to move to a single domain, but unfortunately the client won't allow us to make that change.
I agree that ideally all 250 locations would have unique content, but I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.
Also, my other concern is: is it even possible to re-write the same 750 word service page "uniquely" 250 times? Ha.
-
I would also make them into one big website.
But at the same time, I would have fully unique content for each of the 250 locations. I know that sounds like a huge expense and a lot of work, but any company that has the resources to support 250 locations can support the comparatively small expense of unique content for each of them.
-
I completely understand where you are coming from, but I can only advise that you scrap all of the individual sites and make them into one big website. I know that sounds easier than it really is, and there were most likely some complications that prevented them from doing it in the first place, but it really is the best thing to do.
I do believe that the duplication will still matter, even if you only have one office/store in that location.