Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
How to Handle Franchise Duplicate Content
-
My agency handles digital marketing for about 80 Window World stores, each with a separate site. For the most part, the content across all of these sites is exactly the same, though over the past year we have slowly but surely been getting new, unique content up on some of the top pages, including resource pages and specific product pages. I'm trying to figure out the best temporary solution while we work through this process. Previously, we tried to keep known duplicate pages out of the index, but some have still slipped through the cracks during redesigns.
- Would canonicals be the route to go? (do keep in mind that there isn't necessarily one "original version," so there isn't a clear answer as to which page/site all the duplicated pages should point to)
- Should we just continue to use robots.txt/noindex for all duplicate pages for now?
- Any other recommendations?
Thanks in advance!
-
It sounds like you are already doing about as well as you can - since there's no clear canonical page, noindexing the duplicate pages is probably the way to go. Don't panic if you see some duplicate pages still sneak into the index after you've noindexed them; this usually just means Google hasn't recrawled them yet (or that robots.txt is blocking the crawler from ever seeing the noindex directive), and it's unlikely Google will treat this as a Panda-worthy problem on your part.
The one drawback to noindexing the pages is that once unique content is live and a page is ready to be indexed, it may take a while for Google to get the message that the page should now be indexed. I've seen it take anywhere from an hour to a week for a page to appear in the index. One thing you can do in the meantime is make sure each site is accruing some good links - not an easy task with 80 websites, I know - but the extra authority will help once the unique content is ready to go. Sounds like a herculean task - good luck!
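One detail worth spelling out, since robots.txt and noindex were mentioned together in the question: a noindex directive lives on the page itself, and if that URL is also blocked in robots.txt, Google can never crawl it to see the directive - a common way duplicates "slip through the cracks." A minimal sketch of the two usual options (URLs and store names are placeholders, not from the thread):

```html
<!-- Option 1: keep the duplicate page out of the index entirely.
     The page must NOT be blocked in robots.txt, or the crawler
     will never fetch it and see this directive. -->
<meta name="robots" content="noindex">

<!-- Option 2: consolidate ranking signals onto one chosen version.
     Which store's page to point to is a judgment call here,
     since there is no single "original" in this scenario. -->
<link rel="canonical" href="https://store1.example.com/windows/double-hung/">
```

The same noindex directive can also be sent as an `X-Robots-Tag: noindex` HTTP response header, which can be easier to roll out at the server level across many sites than editing page templates one by one.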
-
Solid insight, but unfortunately we have 80 websites because the store owners each manage their own site separately. Some stores offer different products or services than others and are completely separate entities. Each store owner we work with is an individual client; we do not work with corporate. Plus, since we don't do marketing for ALL stores in the franchise, just a large chunk of them, one big site simply wouldn't work. It's also not realistic to ask all of these store owners to write their own content for an entire site.
We really appreciate your thoughts on this and totally agree with your logic, but unfortunately we would not be able to implement either solution. Right now, we just need some kind of band-aid solution to use while we work through rewriting the most important pages on each site (probably either de-indexing them or some kind of canonical strategy).
Thanks!
-
Hey There!
Important question ... why does the company have 80 websites? Are they being individually managed by the owner of each store, or are they all in the control of the central company?
If the latter, what you are describing is a strong illustration supporting the typical advice that it is generally better to build 1 powerhouse website for your brand than a large number of thin, weak, duplicative sites.
If this company were my client, I would be earnestly urging them to consolidate everything into a single site. If they are currently investing in maintaining 80 websites, there's reason to hope they've got the funding to develop a strong, unique landing page for each of the 80 locations on their main corporate website, and to redirect the old sites to the central one. Check out how REI.com surfaces unique pages for all of their locations - it's inspiring how they've made each page unique. If your client could take a similar approach, they'd be on a better road for the future.
You would, of course, need to update all citations to point to the landing pages once you had developed them.
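If consolidation were the route taken, each retired store site would 301-redirect, ideally page-for-page, to its new location landing page on the central site. A hedged sketch for one old site, assuming an Apache server with mod_rewrite enabled (all domains and paths are hypothetical):

```apache
# .htaccess on a retired store site (e.g. boston-store.example.com)
RewriteEngine On
# Permanently redirect every URL to the matching path
# under the new location section of the central site
RewriteRule ^(.*)$ https://www.example.com/locations/boston/$1 [R=301,L]
```

Mapping old URLs to their closest equivalents preserves more link equity (and is a better user experience) than redirecting everything to the central site's homepage.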
If, however, the 80 websites are being controlled by 80 different franchise location managers, what needs to be developed here is a policy that prevents these managers from taking the content of the corporation. If they want to each run a separate website, they need to take on the responsibility of creating their own content. And, of course, the corporate website needs to be sure it doesn't have internal duplicate content and is not taking content from its franchise managers, either. 80 separate websites should = 80 totally separate efforts. That's a lot to have going on, pointing back to the preferred method of consolidation wherever possible.
Hope this helps!