Need sitemap opinion on large franchise network with thousands of subdomains
-
Working on a large franchise network with thousands of subdomains. There is a primary corporate domain that basically directs traffic to a store locator and then on to individual locations. The stores sell essentially the same products with some variation in pricing, so there are lots of pages with the same product descriptions.
Different content
- All the subdomains include their location and address info in the header, footer, and geo meta tags on every page.
- Page titles are customized with franchise store ID numbers.
Duplicate content
- Product description blocks.
Franchisee subdomains will likely gain the ability to add their own content in the future, but as of right now most of the content, aside from those blocks, is duplicated.
Likely limitations -- Adding the city to page titles will likely be problematic, as there could be multiple franchises in the same city.
Ideally, users could search for a store or product and have the closest locations returned.
We can turn on sitemaps on all the subdomains and try to submit them to the search engines. Looking for insight on whether to submit all these subdomains or just focus on the main domain, which has a lot less content.
-
Ideally, yes: you would have a separate XML sitemap file and Google Search Console profile for each of the thousands of subdomains, plus a Google Analytics view for each subdomain and one combined view across all of them.
If that is not possible, at the very least you should have a sitemap that includes every indexable page on the primary site, as well as an XML sitemap on each subdomain that is linked from that subdomain's own robots.txt file.
According to the XML sitemap protocol, sitemap files cannot contain URLs from different domains, and that includes subdomains: every URL in a given sitemap must belong to a single host. See the sitemaps protocol documentation at sitemaps.org for more info.
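To make that concrete, here's a minimal sketch of what each subdomain would serve (the hostnames are placeholders, not the client's actual domains):

```
# https://store1234.example-franchise.com/robots.txt
User-agent: *
Allow: /

# The referenced sitemap lives on, and only lists URLs from, this same host
Sitemap: https://store1234.example-franchise.com/sitemap.xml
```

And the sitemap it points to would contain only that host's URLs:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://store1234.example-franchise.com/</loc></url>
  <url><loc>https://store1234.example-franchise.com/products/widget-a</loc></url>
</urlset>
```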
As for the title tag issue, I don't see why you couldn't include a street name AND city -- or a neighborhood and city -- so that the titles would all be unique while still featuring the city.
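For instance, a pattern along these lines (store numbers and street names invented for illustration) stays unique even when several stores share a city:

```
Store #1234 - 815 Main St, Springfield | Brand Name
Store #1287 - 42 Oak Ave, Springfield | Brand Name
```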
-
Would I have to verify all xx thousand subdomains in Webmaster Tools first?
-
Sounds like a very tricky one!
You could look at using a sitemap index file (an index-sitemap.xml) to list all of your separate subdomain sitemaps. That way you'll only need to submit one single file.
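A minimal sketch of such an index file, again with placeholder hostnames:

```
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://store0001.example-franchise.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://store0002.example-franchise.com/sitemap.xml</loc>
  </sitemap>
  <!-- ...up to 50,000 <sitemap> entries per index file... -->
</sitemapindex>
```

One caveat: the strict sitemaps.org protocol applies the same single-host rule to index files, so Google will generally only honor cross-host entries like these once every referenced host is verified in the same Search Console account.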
I would recommend looking at Screaming Frog to help you do this.
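If the subdomain list already lives in a database or a flat file, a short script can also emit that index. A minimal sketch in Python (the input file name and hostname pattern are assumptions):

```python
# generate_sitemap_index.py -- build a sitemap index from a
# newline-delimited list of subdomain hostnames.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_index(hostnames):
    """Return sitemap-index XML referencing each host's own sitemap.xml."""
    entries = "\n".join(
        f"  <sitemap>\n    <loc>{escape(f'https://{host}/sitemap.xml')}</loc>\n  </sitemap>"
        for host in hostnames
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{entries}\n</sitemapindex>\n'
    )

if __name__ == "__main__":
    # subdomains.txt is a placeholder: one hostname per line
    with open("subdomains.txt") as f:
        hosts = [line.strip() for line in f if line.strip()]
    with open("index-sitemap.xml", "w", encoding="utf-8") as out:
        out.write(build_index(hosts))
```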
Good luck!
-
There is a main site with the store locator that points to the individual franchise locations; however, nothing can be purchased from that site.
The franchise is very locally oriented, which is why every location has its own subdomain. Each franchisee can set their own pricing and product options.
-
Hey There!
This does sound like a complex scenario - even a bit of a messy one. Ideally, what the brand would have done here is build a single website with a single store locator taking users to an appropriate landing page based on the city or ZIP they type in. The single website would feature a single product menu, accessible to all users regardless of city, removing any risk of creating duplicate product description pages. Something along the lines of how REI.com handles its web presence (you might like to show that to the client).
Instead of taking this approach, am I right in understanding that your client got into this thousands-of-subdomains predicament in order to give each franchisee access to their own site in a specific city, without allowing them access to the entire website? Or was it for some other reason?