Duplicate Content - Local SEO - 250 Locations
-
Hey everyone,
I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages, all with identical content (the same 500-750 words) except for unique metadata and NAP (name, address, phone) reflecting each respective location's name, city, state, etc.
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side effects occur because search engines don't know which page to serve when duplicates exist.
So here's my question:
If someone searches for my client's services in Miami, and my client only has one location in that city, does duplicate content matter? That location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct?
Of course, in other cities like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 locations.
I really appreciate any insight!
Thank you,
-
**I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.**
If Google sees pages on your site that are substantially duplicate, it will filter all but one of them from the SERPs.
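To get a feel for what "substantially duplicate" means in practice, you can compare two service pages directly. A minimal sketch using Python's standard `difflib` (the page texts below are hypothetical placeholders, not the client's actual copy):

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a rough 0-1 similarity ratio between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical near-duplicate service pages that differ only in city/state.
miami = "We provide expert water damage restoration in Miami, FL. Call our Miami team today."
phoenix = "We provide expert water damage restoration in Phoenix, AZ. Call our Phoenix team today."

score = similarity(miami, phoenix)
print(f"similarity: {score:.2f}")  # near-identical pages score close to 1.0
```

This is only a rough diagnostic, not a reproduction of how Google clusters duplicates, but it gives you a defensible number when arguing that two location pages are effectively the same document.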
**Is it even possible to re-write the same 750-word service page "uniquely" 250 times? Ha.**
Yes. The reward is enormous. Ha.
-
Hey There!
In my view, the client has 2 options here:
-
Spring for unique content on the 250 sites
-
Reconsider his decision and bring everything into a single site. The question you've asked (can you really write about the identical service 250 times?) is exactly why he should see that his strategy is cumbersome. Ideally, you'd have a good handful of unique pages describing the benefits of the service, and would then have 250 semi-unique pages on the website, one for each physical location.
-
-
Hi SEO Team @ G5!
Since you are unable to create one large domain that houses all of the locations, I would attempt to make each of the websites as "unique" as possible. But keep in mind that unique content doesn't necessarily mean that you need to completely reword the content in different ways 250 times. Small changes can make a big difference.
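Along the "small changes" line, one low-effort differentiator is making sure each site's structured data is genuinely location-specific rather than copied. A minimal sketch that generates a schema.org LocalBusiness JSON-LD block per location (the business name, address, and phone below are hypothetical, and in practice you'd loop this over all 250 location records):

```python
import json

def local_business_jsonld(name: str, street: str, city: str, state: str, phone: str) -> str:
    """Build a schema.org LocalBusiness JSON-LD script block for one location."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": state,
        },
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

# Hypothetical location record
print(local_business_jsonld("Acme Restoration - Miami", "123 Main St", "Miami", "FL", "+1-305-555-0100"))
```

This doesn't make the body copy unique, but it does ensure each site's markup unambiguously describes its own location, which supports the NAP differentiation the original poster already has in place.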
There's a great (and short) video in which Google's Matt Cutts talks about how Google handles duplicate content. There's also another helpful video about it here.
Matt Cutts has said, "Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, 'OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out,' and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, 'OK, I want to see every single page' and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen."
Read more from this article here: https://searchenginewatch.com/sew/news/2319706/googles-matt-cutts-a-little-duplicate-content-wont-hurt-your-rankings
With this in mind, I do think your assumption is correct. If you make sure that any locations that could be seen as competing in the same area have unique content, they won't necessarily be dinged for duplicate content. Unless you were trying to rank nationally, this shouldn't be a major problem for individual websites that are each targeting a different location.
-
Thanks for your response. We would love to move to a single domain, but unfortunately the client won't allow us to make that change.
I agree that ideally all 250 locations would have unique content, but I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.
Also, my other concern is: is it even possible to re-write the same 750-word service page "uniquely" 250 times? Ha.
-
I would also make them into one big website.
But at the same time, I would have full unique content for each of the 250 locations. I know that sounds like a huge expense and a lot of work, but any company who has the resources to support 250 locations can support the small expense of unique content for each of them.
-
I completely understand where you are coming from, but I can only advise that you scrap all of the individual sites and make them into one big website. I know that sounds easier than it really is, and there are most likely some complications that prevented them from doing it in the first place, but it really is the best thing to do.
I do believe that the duplication will still matter, even if you only have one office/store in that location.