Best practice for franchise sites with duplicated content
-
I know that duplicate content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site, even though almost all of the sites use the same content. I want to make sure that Google sees each one of these sites as a unique site and does not penalize them for the following issues:
All sites are hosted on the same server, and therefore share the same IP address.
All sites use generally the same content across their product pages (which are very important pages) - the templated content is approved by corporate.
Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated.
Thanks Again
Aaron
-
I fully agree. We have notified them all and let them know it's in their best interest to modify the content throughout the site. Unfortunately, most of them don't, and the copy remains templated.
Thanks for your answers
-
If the search is for the company's product or service, you can gain a small advantage by setting up a local listing for each franchisee. Beyond that, rewriting the content is the only option as far as I know.
-
Maybe part of the literature describing your program can include the point that, to be really effective, the franchisees will have to write their own content. It all depends on your business model and whether you want to make them aware that they have 100-5,000 competitors from your company alone.
-
I fully agree with you, EGOL: "There is another problem - maybe bigger than Google's desire for unique content."
We give each franchisee the opportunity to expand on the content and make it their own; however, I would say 90% of them don't make any changes.
I don't think that either the franchisee or corporate would want to pay what it would cost to have our copywriters write unique copy for each site (50-100+ products/services per site or franchisee).
-
I wish we could redo the strategy, but we aren't talking about small franchises here. We are talking about franchises with anywhere from 100 stores all the way up to 5,000 stores.
The products/services they offer are described very well, and unfortunately the only things we add to each product page are a few location identifiers and the company name.
I don't want to use the canonical solution because each site has to be seen as a standalone site.
-
Each franchise has its own domain.
Each product/service has a single description, and each franchisee has to use the same corporate-approved logo.
All images are named the same thing, so that could matter.
I like your suggestions, though... you are going the same route we have gone in the past.
-
Information about Google using OCR... use this link to see an example of how Google extracted and highlighted "wrigley swim" from this newspaper scan.
Google can determine the color of an image... image files are just data, and Google can extract the colors from them. If you go into image search there is an option to limit the results by color. Some of that is done via context (such as words in the file name or words near the image); however, some is done by extracting data from the image file itself.
-
Here we are all giving advice based on our own knowledge. I personally think Google cannot read images or tell what a specific image relates to. If I'm wrong - and I hope I'm not - can I get more details, EGOL?
Thanks.
-
...Google cannot read images or colors...
Are you willing to bet a month's pay on that?
-
I want to make sure that Google sees each one of these sites as unique sites...
I don't think that there is an inexpensive way to get this done and have high-quality results. If you want unique content you have to pay the price... but you could consider the following:
Hire several writers to re-author the content - that will cost a lot less than starting from scratch.
Get an article spinner program - that will be cheap, but you will probably not like the results.
Make an enthusiastic sales pitch to each franchisee, with incentives to write their own content.
...templated content approved by corporate...
There is another problem - maybe bigger than Google's desire for unique content.
Good luck.
-
You may want to re-think your strategy of franchising the product and the content. If the content is the same, the only way to eliminate the duplicate content problem is to point to one of the sites as the canonical version, and that would very much impact the performance of all the other sites.
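For reference, this is roughly what that approach looks like in markup - a minimal sketch only, with made-up domains and paths, placed in the head of each duplicate franchise page to nominate the corporate page as the preferred version:

<!-- Hypothetical franchisee product page (placeholder URLs, not real sites) -->
<head>
  <title>Widget Cleaning Service | Example Franchisee of Springfield</title>
  <!-- Tells search engines to consolidate ranking signals on the corporate page -->
  <link rel="canonical" href="https://www.corporate-example.com/services/widget-cleaning" />
</head>

The trade-off is exactly what is described above: the franchise pages carrying this tag would effectively cede their rankings to whichever URL is declared canonical.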
-
I suggest you give each franchisee their own branding alongside your franchise logo:
1. Their own domain.
2. Their own product description, even if it's the same product (maybe add your logo to make sure people recognize the brand).
3. Design does not matter (URLs, titles, descriptions, content, etc. are what count), as Google cannot read images or colors.
Hope it helps.
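If full rewrites aren't realistic, a partial step is to localize the elements search engines read as text - the title, meta description, and structured data - even when the body copy stays templated. This is only a sketch, and the business name, city, and phone number below are invented for illustration:

<!-- Hypothetical franchisee page: templated body copy, but localized head and structured data -->
<head>
  <title>Gutter Cleaning in Boise, ID | Acme Home Services of Boise</title>
  <meta name="description"
        content="Acme Home Services of Boise provides gutter cleaning across Boise and Meridian, Idaho. Visit our Fairview Ave location or call for a free quote." />
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Home Services of Boise",
    "telephone": "+1-208-555-0100",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Boise",
      "addressRegion": "ID"
    }
  }
  </script>
</head>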
Related Questions
-
Thoughts on archiving content on an event site?
I have a few sites that are used exclusively to promote live events (e.g. tradeshows, conferences, etc.). In most cases these sites contain fewer than 100 pages and include information for the upcoming event with links to register. Some time after the event has ended, we redesign the site and start promoting next year's event... essentially starting over with a new site (same domain). We understand the value that many of these past event pages have for users who are looking for info from the past event, and we're looking for advice on how best to archive this content to preserve it for SEO. We tend to use concise URLs for pages on these sites, e.g. www.event.com/agenda or www.event.com/speakers. What are your thoughts on archiving the content from these pages so we can reuse the URL with content for the new event? My first thought is to put these pages into an archive, like www.event.com/2015/speakers. Is there a better way to do this to preserve the SEO value of this content?
On-Page Optimization | accessintel
-
Duplicate content on partner site
I have a trade partner who will be using some of our content on their site. What's the best way to prevent any duplicate content issues? Their plan is to attribute the content to us using rel=author tagging. Would this be sufficient or should I request that they do something else too? Thanks
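For what it's worth, rel=author on its own is only an attribution hint; the stronger request, if the partner will agree to it, is usually a cross-domain canonical on their copy pointing back to the original. A rough sketch with placeholder URLs:

<!-- In the head of the partner's copy of the page (placeholder URLs) -->
<!-- Attribution only - what the partner is proposing -->
<link rel="author" href="https://www.your-site-example.com/about-us" />
<!-- Stronger signal - asks search engines to treat your page as the original -->
<link rel="canonical" href="https://www.your-site-example.com/products/original-page" />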
On-Page Optimization | ShearingsGroup
-
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting that you noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is no-indexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, therefore removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
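To make the comparison concrete, here is a rough sketch (placeholder values only) of the two mechanisms the question weighs against each other - marking up an archive entry so it points at the main article's URL, versus keeping the archive page out of the index entirely:

<!-- Option A: structured data on the archive page, pointing each entry at the main article (placeholder values) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Example Post Title",
  "url": "https://www.example.com/blog/example-post/"
}
</script>

<!-- Option B: noindex the archive page itself, while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />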
On-Page Optimization | MarkCA
-
What's the best practice for handling duplicate content of product descriptions with a drop-shipper?
We write our own product descriptions for merchandise we sell on our website. However, we also work with drop-shippers, and some of them simply take our content and post it on their site (same photos, exact ad copy, etc.). I'm concerned that we'll lose the value of our content because Google will consider it duplicated. We don't want the value of our content undermined... What's the best practice for avoiding any problems with Google? Thanks, Adam
On-Page Optimization | Adam-Perlman
-
Best Domain Name for Life Coaching Site
Hello, I am an NLP health coach. I am starting to work with both life-threatening illnesses and minor diagnoses. NLP is a type of personal development. I'm wondering what your opinion of the best domain would be, keeping in mind branding, SEO, and usability/memorability. The term "NLP" is not well known. I will be doing both phone coaching and in-person coaching. My other website (BobWeikel.com) is not very strong because of the lack of keywords in the domain, but it's easy to remember. Options are: NLPTrained.com, BobWeikelHealthCoach.com, BoiseHealthCoach.com (I'm in Boise, Idaho), RobertWeikel.com, or whatever you suggest.
On-Page Optimization | BobGW
-
How long should anchor text be? Best practice for anchor text length?
site: http://www.cerritosnissan.com/index.htm At the bottom of this homepage there is an SEO content area, basically right under where it says "orange county nissan" welcomes you. The internal links in this area are very long, and I'm wondering why they would do this - is there any benefit to making anchor text longer? The longer the anchor text, the less each part of that anchor text passes link juice. For example, for a page about their reviews, the anchor text of the link is "See what Cerritos Nissan customers have to say about their experience at this great Orange County Nissan Dealership.". If I had done this, the anchor text would have been "Cerritos Nissan Reviews" or just plain "reviews". Why would they be using such long phrases as anchor text?
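For illustration only (the /reviews.htm path is a guess, not taken from their site), the contrast being described is roughly:

<!-- What the dealership site does: a long, descriptive anchor -->
<a href="/reviews.htm">See what Cerritos Nissan customers have to say about their experience at this great Orange County Nissan Dealership.</a>
<!-- What the poster would have used: a short, keyword-focused anchor -->
<a href="/reviews.htm">Cerritos Nissan reviews</a>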
On-Page Optimization | qlkasdjfw
-
Best Practice for Deleting Pages
What is the best SEO practice for deleting pages? We have a section on our website with employee bios, and when an employee leaves we need to remove their page. How should we do this?
On-Page Optimization | Trupanion
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag
Adding these URL variables to Google Webmasters to tell Google to ignore them
Changing the title tag in the head dynamically based on what URL variables are present
However, I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie
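As a point of comparison only for the recipe-search question above - not a definitive answer, and with the parameter names taken from the question while the domain is a placeholder - the canonical-tag option usually means pointing every filtered and paginated variation at one preferred version of the search page, while the title is still generated dynamically:

<!-- Rendered in the head of a filtered result such as
     find-a-recipe.php?course=main&preperation-time=30min+to+1+hour&start=30 -->
<title>Main course recipes, 30 min to 1 hour - Find a Recipe</title>
<link rel="canonical" href="https://www.example.com/find-a-recipe.php" />
<!-- Alternative approach: keep filtered results out of the index but let crawlers follow links to the recipes -->
<meta name="robots" content="noindex, follow" />

Whether consolidating everything onto the unfiltered page is acceptable depends on whether any of those filtered combinations are worth ranking in their own right.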