Need sitemap opinion on large franchise network with thousands of subdomains
-
Working on a large franchise network with thousands of subdomains. There is a primary corporate domain that essentially directs traffic to a store locator and then on to the individual locations. The stores sell essentially the same products with some variation in pricing, so there are lots of pages with the same product descriptions.
Different content
- All the subdomains include their location and address information in the header, footer, and geo meta tags on every page.
- Page titles are customized with franchise store ID numbers.
Duplicate content
- Product description blocks.
Franchisee domains will likely get the ability to add their own content in the future, but as of right now most of the content, aside from the blocks noted above, is duplicated.
Likely limitations: adding the city to page titles will likely be problematic, as there could be multiple franchises in the same city.
Ideally, users could search for a store or product and get back the locations closest to them.
We can turn on sitemaps for all the subdomains and try to submit them to the search engines. Looking for insight on whether to submit all of these sites or just focus on the main domain, which has far less content.
-
Ideally yes, you would have a separate XML Sitemap file and Google Search Console profile, as well as a separate (and combined) Google Analytics view for each of the thousands of subdomains.
If that is not possible, at the very least you should have a sitemap that includes every indexable page on the primary site, plus an XML sitemap on each subdomain that is linked from that subdomain's own robots.txt file.
According to the sitemaps protocol, an XML sitemap file cannot contain URLs from different hosts, and that includes subdomains. You have to keep each XML sitemap to URLs from a single host. See the sitemaps.org protocol documentation for more info.
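For illustration, here's a minimal sketch of that per-subdomain setup (the store123.example.com hostname is just a placeholder):

```
# robots.txt at https://store123.example.com/robots.txt
# The Sitemap directive points crawlers to this subdomain's own sitemap.
Sitemap: https://store123.example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml at https://store123.example.com/sitemap.xml -->
<!-- Per the protocol, every <loc> must be on this same host. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://store123.example.com/</loc>
  </url>
  <url>
    <loc>https://store123.example.com/products/example-product</loc>
  </url>
</urlset>
```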
As for the title tag issue, I don't see why you couldn't have a street name AND city (or neighborhood and city) so that the titles would all be unique while still including the city.
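For example, a pattern along these lines (the store name, street, and store number here are all hypothetical) stays unique even with several franchises in one city:

```html
<!-- Hypothetical title pattern: brand + street + city + store ID -->
<title>Acme Supply - 42 Elm St, Springfield | Store #1187</title>
```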
-
Would I have to verify all xx thousand subdomains in Webmaster Tools first?
-
Sounds like a very tricky one!
You could use a sitemap index file (e.g. sitemap-index.xml) that lists all of your separate subdomain sitemaps; that way you'll only need to submit one single file. Keep in mind that search engines generally honor cross-host references like this only once ownership of each subdomain has been verified (e.g. in Search Console).
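As a sketch, such an index file might look like this (the hostnames are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-index.xml on the main domain, pointing at each subdomain's sitemap -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://store123.example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://store456.example.com/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```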
I would recommend looking at Screaming Frog to help you do this.
Good luck!
-
There is a main site with the store locator that points to the individual franchise locations; however, nothing can be purchased from that site.
The franchise is very locally oriented, which is why every location has its own subdomain. Every franchisee can set their own pricing and product options.
-
Hey There!
This does sound like a complex scenario, even a bit of a messy one. Ideally, what the brand would have done here is build a single website with a single store locator taking users to an appropriate landing page based on the city or ZIP they type in. The single website would feature a single product menu, accessible to all users regardless of city, removing any risk of creating duplicate product description pages. Something along the lines of how REI.com handles their web presence (you might like to show that to the client).
Instead of taking this approach, am I right in understanding that your client got into this thousands-of-subdomains predicament in order to give each franchisee access to their own specific site in a specific city, without allowing them access to the entire website? Or was it for some other reason?