Need sitemap opinion on large franchise network with thousands of subdomains
-
Working on a large franchise network with thousands of subdomains. There is a primary corporate domain that basically directs traffic to store locators and then on to the individual locations. The stores sell essentially the same products with some variation in pricing, so there are lots of pages with the same product descriptions.
Different content
- All the subdomains have their location's address information in the header, footer, and geo meta tags on every page.
- Page titles are customized with franchise store ID numbers.
Duplicate content
- Product description blocks.
Franchisee domains will likely have the ability to add their own content in the future, but as of right now most of the content on the pages, apart from those blocks, is duplicated.
Likely limitations -- adding the city to page titles will likely be problematic, as there could be multiple franchises in the same city.
Ideally, users could search for a store or product and have the locations closest to them returned.
We can turn on sitemaps on all the subdomains and try to submit them to the search engines. Looking for insight on whether to submit all of these sites or to just focus on the main domain, which has a lot less content on it.
-
Ideally yes, you would have a separate XML Sitemap file and Google Search Console profile, as well as a separate (and combined) Google Analytics view for each of the thousands of subdomains.
If that is not possible, at the very least you should have a sitemap that includes every indexable page on the primary site, as well as an XML sitemap on each subdomain, linked from that subdomain's own robots.txt file.
According to the sitemaps protocol, an XML sitemap file cannot contain URLs from different domains, and this includes subdomains. You have to keep all the URLs in a given XML sitemap on a single host. See the sitemaps protocol documentation for more info.
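As a minimal sketch of that setup (the hostnames and paths here are hypothetical), each subdomain's robots.txt would point at that subdomain's own sitemap, and the sitemap itself would list only URLs on the same host:

```
# https://store123.example.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://store123.example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- https://store123.example.com/sitemap.xml -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://store123.example.com/</loc></url>
  <url><loc>https://store123.example.com/products/widget-pro</loc></url>
</urlset>
```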
As for the Title Tag thing, I don't see why you couldn't have a street name AND city -- or neighborhood and city -- so that they would all be unique, while still including the city.
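For example (store name, streets, and neighborhoods hypothetical), patterns like these stay unique even with multiple franchises in one city:

```html
<title>Acme Tools | 123 Main St, Springfield, IL | Store #4521</title>
<title>Acme Tools | North Springfield, IL | Store #4522</title>
```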
-
Would I have to validate all xx thousand subdomains in Webmaster Tools first?
-
Sounds like a very tricky one!
You could look at using a sitemap index file (e.g. index-sitemap.xml) in which you list all of your separate subdomain sitemaps. That way you'll only need to submit one single sitemap.
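A minimal sketch of such an index file (hosts and filenames hypothetical). One caveat: under the strict sitemaps protocol, an index is only supposed to reference sitemaps on its own host, so a cross-subdomain index like this is generally only honored by Google once ownership of every subdomain is verified in Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://store001.example.com/sitemap.xml</loc></sitemap>
  <sitemap><loc>https://store002.example.com/sitemap.xml</loc></sitemap>
  <!-- ...one entry per subdomain, up to 50,000 per index file... -->
</sitemapindex>
```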
I would recommend looking at Screaming Frog to help you do this.
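At thousands of subdomains you would more likely generate that index with a script than by hand; here's a minimal Python sketch (the subdomain list and sitemap path are placeholder assumptions; in practice you'd pull the real hostnames from the franchise database):

```python
from xml.sax.saxutils import escape

def build_sitemap_index(subdomains, sitemap_path="/sitemap.xml"):
    """Return a sitemap index XML string, one <sitemap> entry per hostname."""
    entries = "\n".join(
        f"  <sitemap><loc>https://{escape(host)}{sitemap_path}</loc></sitemap>"
        for host in subdomains
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>\n"
    )

# Hypothetical run: the hostnames would come from your store database.
if __name__ == "__main__":
    stores = [f"store{n:03d}.example.com" for n in range(1, 4)]
    print(build_sitemap_index(stores))
```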
Good luck!
-
There is a main site with the store locator that points to the individual franchise locations; however, nothing can be purchased from that site.
The franchise is very locally oriented, which is why every location has its own subdomain. Every franchisee can set their own pricing and product options.
-
Hey There!
This does sound like a complex scenario - even a bit of a messy one. Ideally, what the brand would have done here is build a single website with a single store locator that takes users to an appropriate landing page based on the city or ZIP they type in. The single website would feature a single product menu, accessible to all users regardless of city, removing any risk of creating duplicate product description pages. Something along the lines of how REI.com handles their web presence (you might like to show that to the client).
Instead of taking this approach, am I right in understanding that your client got into this thousands-of-subdomains predicament in order to give each franchisee access to only their own city-specific site, rather than to the entire web presence? Or was it for some other reason?