How to Handle Franchise Duplicate Content
-
My agency handles digital marketing for about 80 Window World stores, each with a separate site. For the most part, the content across all of these sites is exactly the same, though over the past year we have slowly but surely been getting new, unique content up on some of the top pages, including resource pages and specific product pages. I'm trying to figure out the best temporary solution while we work through this process. Previously, we tried to keep the pages we knew were duplicates out of the index, but some have still slipped through the cracks during redesigns.
- Would canonicals be the route to go? (Keep in mind that there isn't necessarily one "original" version, so there's no clear answer as to which page/site all the duplicated pages should point to.)
- Should we just continue to use robots.txt/noindex for all duplicate pages for now?
- Any other recommendations?
Thanks in advance!
-
It sounds like you are already doing as well as you can - since there's no clear canonical page, noindexing the duplicate pages is probably the way to go. Don't panic if some duplicate pages still sneak into the index after you've noindexed them; this is common, and it's unlikely Google will treat it as a Panda-worthy problem on your part.
The one drawback to noindexing the pages is that once unique content is up and they are ready to be indexed, it may take a while for Google to get the message that those pages should now be indexed. I've seen it take anywhere from an hour to a week for a page to appear in the index. One thing you can do in the meantime is make sure each site is accruing some good links - not an easy task with 80 websites, I know, but the extra authority will help once the unique content is ready to go. Sounds like a herculean task - good luck!
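Since duplicate pages have slipped through the cracks during past redesigns, a cheap safeguard is a script that checks each known-duplicate URL and confirms the noindex directive is still in place. Here's a minimal sketch using only the Python standard library (the class and function names are my own, and you'd pair this with your list of duplicate URLs and an HTTP fetcher):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""

    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        if (attr_map.get("name") or "").lower() == "robots":
            self.robots_directives.append((attr_map.get("content") or "").lower())


def is_noindexed(html):
    """Return True if the page's HTML carries a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.robots_directives)
```

Run something like this against every page that's supposed to be noindexed after each redesign or template change; any page where it returns False is one that's about to slip into the index. (Note it only checks the meta tag, not an X-Robots-Tag HTTP header, which would need a separate check.)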
-
Solid insight, but unfortunately we have the 80 websites because each store's owner manages their site separately. Some stores offer different products or services than others and are completely separate entities. Each store owner we work with is an individual client; we do not work with corporate. Plus, since we don't do marketing for ALL stores in the franchise, just a large chunk of them, one big site simply wouldn't work. It's also not realistic to ask all of these store owners to write their own content for an entire site.
We really appreciate your thoughts on this and totally agree with your logic, but unfortunately we would not be able to implement either solution. Right now, we just need some kind of band-aid solution to use while we work through rewriting the most important pages on each site (probably either de-indexing them or some kind of canonical strategy).
Thanks!
-
Hey There!
Important question ... why does the company have 80 websites? Are they being individually managed by the owner of each store, or are they all in the control of the central company?
If the latter, what you are describing is a strong illustration supporting the typical advice that it is generally better to build 1 powerhouse website for your brand than a large number of thin, weak, duplicative sites.
If this company were my client, I would be earnestly urging them to consolidate everything into a single site. If they are currently investing in maintaining 80 websites, there's reason to hope they've got the funding to develop a strong, unique landing page for each of the 80 locations on the main corporate website, and to redirect the old sites to the central one. Check out how REI.com surfaces unique pages for all of their locations. It's inspiring how they've made each page unique. If your client could take a similar approach, they'd be on a better road for the future.
You would, of course, need to update all citations to point to the landing pages once you had developed them.
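If consolidation is the road taken, each old location site should 301 to its specific landing page on the central domain rather than everything pointing at the homepage. A sketch of the mapping logic, assuming a hand-maintained table of old domains to location slugs (all domain names and slugs here are made up for illustration):

```python
from urllib.parse import urlsplit

# Hypothetical mapping from each old location domain to its slug on the
# central site; in practice this table would cover all 80 locations.
LOCATION_SLUGS = {
    "windowworldseattle.example": "seattle",
    "windowworlddenver.example": "denver",
}


def redirect_target(old_url, central_domain="example.com"):
    """Map a URL on an old location site to its 301 target on the central site.

    Returns None for domains with no mapping, so those can be flagged
    for manual review instead of being silently redirected.
    """
    parts = urlsplit(old_url)
    slug = LOCATION_SLUGS.get(parts.hostname or "")
    if slug is None:
        return None
    return "https://{}/locations/{}/".format(central_domain, slug)
```

The None-instead-of-guess behavior matters here: with 80 sites, a few domains will inevitably be missing from the table, and you'd rather catch those than redirect them somewhere wrong. The actual 301s would then be implemented in each old site's server config.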
If, however, the 80 websites are controlled by 80 different franchise location managers, what needs to be developed here is a policy that prevents those managers from reusing the corporation's content. If they each want to run a separate website, they need to take on the responsibility of creating their own content. And, of course, the corporate website needs to make sure it has no internal duplicate content and isn't taking content from its franchise managers, either. 80 separate websites should equal 80 totally separate efforts. That's a lot to have going on, which points back to consolidation as the preferred approach wherever possible.
Hope this helps!