Duplicate Content - Local SEO - 250 Locations
-
Hey everyone,
I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages with identical content (the same 500-750 words), except for unique metadata and NAP details carrying each respective location's name, city, state, etc.
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side effects occur because search engines don't know which page to serve when there are duplicates.
So here's my question:
If someone searches for my client's services in Miami, and my client has only one location in that city, does duplicate content matter? That location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct?
Of course, in other cities like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 of them.
I really appreciate any insight!
Thank you,
-
**"I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city."**
If Google sees pages on your site that are substantially duplicate, it will filter all but one of them from the SERPs.
**"Is it even possible to re-write the same 750 word service page 'uniquely' 250 times? Ha."**
Yes. And the reward is enormous. Ha.
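To make "substantially duplicate" less hand-wavy: near-duplicate detection is often approximated with word shingles and Jaccard similarity. Google's actual clustering is proprietary, so this is only an illustrative sketch (the sample page text and the `k=3` shingle size are made-up values), but it shows why swapping only the city name leaves two pages looking almost identical to an algorithm.

```python
import re

def shingles(text, k=5):
    """Return the set of k-word shingles in the text (lowercased)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def similarity(page_a, page_b, k=5):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Two hypothetical location pages that differ only in city/state:
miami = "Acme Plumbing of Miami, FL offers drain cleaning and water heater repair."
phoenix = "Acme Plumbing of Phoenix, AZ offers drain cleaning and water heater repair."
print(similarity(miami, phoenix, k=3))  # high overlap despite the city swap
```

A score near 1.0 means the pages share almost all their phrasing; identical 750-word service pages with only the NAP swapped would score very close to 1.0 across all 250 sites.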
-
Hey There!
In my view, the client has 2 options here:
-
Spring for unique content on the 250 sites
-
Reconsider his decision about bringing everything into a single site. The question you've asked (can you really write about the identical service 250 times?) is exactly why he should see that this strategy is cumbersome. Ideally, you'd have a good handful of unique pages describing the benefits of the service, and would then have 250 semi-unique pages on the website, one for each physical location.
-
-
Hi SEO Team @ G5!
Since you are unable to create one large domain that houses all of the locations, I would attempt to make each of the websites as "unique" as possible. But keep in mind that unique content doesn't necessarily mean you need to completely reword the content 250 times. Small changes can make a big difference.
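As a starting skeleton for those "small changes," location pages are often generated from one shared service template plus genuinely local facts. The business names, phone numbers, and neighborhoods below are invented for illustration. Note the caveat the thread itself raises: if the only fields that vary are the NAP, this is just find-and-replace and the pages remain near-duplicates; the value comes from fields that carry real local detail (neighborhoods served, staff, hours, landmarks).

```python
from string import Template

# Shared service-page template; $-placeholders hold the per-location facts.
SERVICE_TEMPLATE = Template(
    "$name provides drain cleaning throughout $city, $state. "
    "Our $city team serves neighborhoods such as $neighborhoods, "
    "and you can reach the $city office at $phone."
)

# Hypothetical locations -- in practice each dict would include much richer
# local detail than shown here.
locations = [
    {"name": "Acme Plumbing Miami", "city": "Miami", "state": "FL",
     "phone": "(305) 555-0100", "neighborhoods": "Brickell and Wynwood"},
    {"name": "Acme Plumbing Phoenix", "city": "Phoenix", "state": "AZ",
     "phone": "(602) 555-0100", "neighborhoods": "Arcadia and Tempe"},
]

# One semi-unique page body per location.
pages = {loc["city"]: SERVICE_TEMPLATE.substitute(loc) for loc in locations}
print(pages["Miami"])
```

Scaling this to 250 locations is mechanical; writing the distinct local facts for each location is where the real editorial work (and the SEO value) lives.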
There's a great (and short) video of Google's Matt Cutts talking about how Google handles duplicate content. There's also another helpful video about it here.
Matt Cutts has said, "Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, 'OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out,' and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, 'OK, I want to see every single page' and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen."
Read more from this article here: https://searchenginewatch.com/sew/news/2319706/googles-matt-cutts-a-little-duplicate-content-wont-hurt-your-rankings
With this in mind, I do think your assumption is correct. If you make sure that any locations that could be seen as competing in the same area have unique content, the rest won't necessarily be dinged for duplicated content. Unless you were trying to rank nationally, this shouldn't be a major problem for each individual website targeting a different location.
-
Thanks for your response. We would love to move to a single domain, but unfortunately the client won't allow us to make that change.
I agree that ideally all 250 locations would have unique content, but I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.
Also, my other concern is: is it even possible to re-write the same 750 word service page "uniquely" 250 times? Ha.
-
I would also make them into one big website.
But at the same time, I would have fully unique content for each of the 250 locations. I know that sounds like a huge expense and a lot of work, but any company that has the resources to support 250 locations can support the comparatively small expense of unique content for each of them.
-
I completely understand where you are coming from, but I can only advise that you scrap all of the individual sites and make them into one big website. I know that sounds easier than it really is, and there are most likely complications that prevented them from doing it in the first place, but it really is the best thing to do.
I do believe that the duplication will still matter, even if you only have one office/store in that location.