How to Handle Franchise Duplicate Content
-
My agency handles digital marketing for about 80 Window World stores, each with its own separate site. For the most part, the content across all of these sites is exactly the same, though over the past year we have slowly but surely been getting new, unique content up on some of the top pages, including resource pages and specific product pages. I'm trying to figure out the best temporary solution while we work through this process. Previously, we tried to keep the pages we knew were duplicates out of the index, but some have still slipped through the cracks during redesigns.
- Would canonicals be the route to go? (Keep in mind there isn't necessarily one "original" version, so there's no clear answer as to which page/site all the duplicated pages should point to.)
- Should we just continue to use robots.txt/noindex for all duplicate pages for now?
- Any other recommendations?
Thanks in advance!
-
It sounds like you're already doing about as well as you can - since there's no clear canonical page, noindexing the duplicate pages is probably the way to go. One caveat: noindex and robots.txt don't mix well. If robots.txt blocks a page, Google can't crawl it to see the noindex tag, so make sure the duplicate pages stay crawlable and carry a meta robots noindex instead. Don't panic if you see some duplicate pages still sneak into the index after you've noindexed them; this is common, and it's unlikely that Google will see it as a Panda-worthy problem on your part.
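With 80 sites, scripting the check beats eyeballing it. Below is a minimal, stdlib-only sketch (not from this thread - the class and function names are my own invention) that reports whether a page's HTML carries a robots "noindex" meta tag and/or a rel="canonical" link, which is the kind of audit that catches pages slipping through during redesigns:

```python
# Illustrative audit helper: parse a page's HTML and report its
# robots meta directive and canonical link, using only the stdlib.
from html.parser import HTMLParser

class RobotsAuditParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Matches e.g. <meta name="robots" content="noindex, follow">
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True
        # Matches e.g. <link rel="canonical" href="...">
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def audit(html):
    """Return the indexing signals found in a page's HTML source."""
    p = RobotsAuditParser()
    p.feed(html)
    return {"noindex": p.noindex, "canonical": p.canonical}
```

Fetching each duplicate URL and running it through `audit()` would give you a quick spreadsheet of which pages actually have the noindex in place.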
The one drawback to noindexing is that once unique content is live on a page and it's ready to be indexed, it may take a while for Google to get the message that the page should now be indexed - I've seen it take anywhere from an hour to a week for a page to appear in the index. One thing you can do in the meantime is make sure each site is accruing some good links - not an easy task with 80 websites, I know, but the extra authority will help once the unique content is ready to go. Sounds like a herculean task - good luck!
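One legitimate way to speed that re-discovery along is to resubmit an updated XML sitemap with a fresh lastmod date each time a rewritten page goes live. As a hedged sketch (the URLs are placeholders and the helper name is mine), building that file is a few lines of stdlib Python:

```python
# Illustrative sitemap builder: emit a sitemaps.org-format urlset with
# a <lastmod> for each rewritten page, to encourage recrawling.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls, lastmod=None):
    """urls: iterable of absolute URLs. Returns sitemap XML as a string."""
    lastmod = lastmod or date.today().isoformat()
    entries = "\n".join(
        "  <url><loc>{}</loc><lastmod>{}</lastmod></url>".format(escape(u), lastmod)
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )
```

You would still submit the result through each site's Search Console property; the script only automates producing the file for 80 sites.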
-
Solid insight, but unfortunately we have the 80 websites because the store owners each manage their sites separately. Some stores offer different products or services than others and are completely separate entities. Each store owner we work with is an individual client; we do not work with corporate. Plus, since we don't do marketing for ALL the stores in the franchise, just a large chunk of them, one big site just wouldn't work. It's also really not feasible to have all these store owners write their own content for an entire site.
We really appreciate your thoughts on this and totally agree with your logic, but unfortunately we would not be able to implement either solution. Right now, we just need some kind of band-aid solution to use while we work through rewriting the most important pages on each site (probably either de-indexing them or some kind of canonical strategy).
Thanks!
-
Hey There!
Important question ... why does the company have 80 websites? Are they being individually managed by the owner of each store, or are they all in the control of the central company?
If the latter, what you are describing is a strong illustration supporting the typical advice that it is generally better to build 1 powerhouse website for your brand than a large number of thin, weak, duplicative sites.
If this company were my client, I would be earnestly urging them to consolidate everything into a single site. If they are currently investing in maintaining 80 websites, there's reason to hope they've got the funding to develop a strong, unique landing page for each of the 80 locations on one main corporate website, and redirect the old sites to the central one. Check out how REI.com surfaces unique pages for each of their locations; it's inspiring how they've made every page unique. If your client could take a similar approach, they'd be on a better road for the future.
You would, of course, need to update all citations to point to the landing pages once you had developed them.
If, however, the 80 websites are controlled by 80 different franchise location managers, what needs to be developed here is a policy that prevents those managers from copying the corporation's content. If they each want to run a separate website, they need to take on the responsibility of creating their own content. And, of course, the corporate website needs to make sure it has no internal duplicate content and is not taking content from its franchise managers, either. 80 separate websites should = 80 totally separate efforts. That's a lot to have going on, which points back to consolidation as the preferred approach wherever possible.
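Enforcing a "no copying" policy across that many sites is easier if you can measure overlap. As a rough, hedged sketch (the 0.9 threshold is an arbitrary assumption, and a real audit would compare extracted page text rather than short strings), stdlib difflib can flag near-duplicate pages across sites:

```python
# Illustrative near-duplicate flagger: compare page texts pairwise and
# report pairs whose similarity ratio exceeds a chosen threshold.
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Similarity ratio in [0, 1] between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

def flag_duplicates(pages, threshold=0.9):
    """pages: dict of url -> page text. Returns URL pairs above threshold."""
    urls = sorted(pages)
    flagged = []
    for i, u in enumerate(urls):
        for v in urls[i + 1:]:
            if similarity(pages[u], pages[v]) >= threshold:
                flagged.append((u, v))
    return flagged
```

Pairwise comparison is O(n²) in the number of pages, which is fine for a periodic audit across 80 sites but would want a shingling or hashing approach at a much larger scale.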
Hope this helps!