What is the best strategy for franchise companies when building local sites?
-
Hi, I've noticed that national franchises like Domino's do NOT use local websites for local SEO. Instead, they use their own mammoth sites with a store locator, plus a few non-keyword-rich pages with very basic information for each location. However, for local SEO I have been thinking that using e.g. hyperfranchise.com as the main domain and then e.g. buckhead.hyperfranchise.com (or a separate domain like buckheadhyperfranchise.com) for each location would be better, including for Yelp, Foursquare, and similar sites. It will take time for all the local sites to rank, but isn't that better in the end than having, say, six pages of "local" content on the main site?
However, I have not seen any of the big franchises do this. Could that simply be because they are so entrenched in their own old systems, which may be ranking well for their local franchisees anyway?
Any comments, ideas, suggestions?
-
Hi Yvonne,
I believe the main reason most franchises go the way you've summarized is manageability. The alternative is to entrust all the work to individual franchise owners, including not only their own websites and content, but also all of their local SEO. Can individual franchise owners be counted on to know what they are doing when writing optimized copy and creating violation-free Place Pages? Maybe they can, but more often than not, I'd bet the answer would be no.
By maintaining control over everything that happens, you control quality. You can do just fine with Local if it's all managed from the top across the board.
On the other hand, I empathize with your wish to try it a different way. Good stuff can happen if you decide to be different, but if you see something working for tons of similar business models, reinventing the wheel may not be the safest or most successful route to go.
-
Agreed: go with subdomains or dedicated pages, depending on the amount of content you have per location. With a franchise, you're ultimately trying to rank the brand, so keeping all that content on one root domain certainly has its advantages.
For local SEO, set up separate social pages that point back to your location pages (or subdomains) on your main site.
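One concrete way to tie each location page (or subdomain) to its citations on Yelp, Foursquare, and the like is LocalBusiness structured data. Here's a minimal sketch in Python that builds schema.org JSON-LD for one location; the "Hyperfranchise Buckhead" name, address, and phone number are all made-up placeholders, not details from this thread:

```python
import json

def local_business_jsonld(name, url, street, city, region, postal, phone):
    """Build schema.org LocalBusiness JSON-LD for one franchise location."""
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal,
        },
    }

# Hypothetical location details for illustration only.
markup = local_business_jsonld(
    "Hyperfranchise Buckhead",
    "https://buckhead.hyperfranchise.com",
    "123 Example St", "Atlanta", "GA", "30305", "+1-404-555-0100",
)
# Embed the output on the location page in a <script type="application/ld+json"> tag.
print(json.dumps(markup, indent=2))
```

Keeping the name, address, and phone in this markup identical to what's on the citation sites helps search engines connect the listings to the right location page.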
If it makes sense, offer up a feed of content from your local social sites on your local site pages so it stays nice and fresh too.
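If those social accounts publish RSS, pulling the latest posts onto each location page can be as simple as parsing the feed. A rough sketch using only the Python standard library; the sample feed and the feed URL in the comment are assumptions for illustration, not anything from this thread:

```python
import xml.etree.ElementTree as ET

def latest_items(feed_xml, limit=3):
    """Return (title, link) pairs for the newest items in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter("item"):
        items.append((item.findtext("title"), item.findtext("link")))
        if len(items) == limit:
            break
    return items

# In production you'd fetch the location's real feed with urllib.request,
# e.g. a hypothetical https://buckhead.hyperfranchise.com/blog/rss
sample = """<rss version="2.0"><channel>
  <item><title>Grand opening</title><link>https://example.com/1</link></item>
  <item><title>New menu</title><link>https://example.com/2</link></item>
</channel></rss>"""

for title, link in latest_items(sample):
    print(title, "->", link)
```

Rendering these server-side (rather than in a JavaScript widget) means the fresh content is actually in the page HTML that crawlers see.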
-
In terms of scalability and maintenance, I would think they'd need to be on subdomains. It would require a lot more resources and cost if each franchise had its own separate domain.
I think Facebook should be the first call for a local franchise. The instant interaction you get with your customers really gets the word out and is hard to match any other way. Run a competition or giveaway (or, in the case of my local burrito company, an "all you can eat" challenge) and everyone's talking about you!