Creating 20+ websites with links back to central site
-
Hey guys,
A client of ours owns an IT company with 20+ locations across the UK. He is looking for a way to give each of those locations a page or website that they can manage themselves and that links back to the main site.
His idea is to create 20+ one- or two-page websites that all link back to the central site, improving the chances of ranking well for locally based search terms.
At the moment, we have a page for each of the 20+ locations on the main site. However, the client wants to give his franchisees complete control over their web presence.
Would a setup like this work? Is it sensible to have 20+ websites (likely to follow a very similar format) all pointing to one central website? Would we have to nofollow the links back to the main site to show we aren't trying to manipulate PageRank?
Would creating sub-folders on the main site, one for each of the 20+ locations, be a better option?
Any feedback appreciated!
-
I agree as well: creating sub-folders is the best way to go.
I would add just a few quick points:
1. Add schema.org GeoCoordinates markup to your pages: http://schema.org/GeoCoordinates
2. Sign up for Yelp if it is applicable for your client's business locations.
3. Make sure your Meta Description for each page mentions a city/town name.
4. A simple one, but more related to the site overall: set geographic targeting in Google Webmaster Tools.
5. Sign up for Google+ Local (mentioned above), Bing Places and Yahoo Local
6. If your client's business is a fit for Yelp, it can most likely be listed on other review sites too. Submit the business to review sites to generate more exposure for the brand. Good reviews can also appear in Google results and increase CTR. Good luck!
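As a rough sketch of point 1, each location page could carry LocalBusiness markup with nested GeoCoordinates. The business name, address, and coordinates below are invented placeholders, not the client's real details:

```python
import json

# Hypothetical franchise location; name, address and coordinates are examples only.
location = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme IT Services - Manchester",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Manchester",
        "addressCountry": "GB",
    },
    "geo": {
        "@type": "GeoCoordinates",
        "latitude": 53.4808,
        "longitude": -2.2426,
    },
}

# Emit the JSON-LD block to paste into the location page's <head>.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(location, indent=2)
print(snippet)
```

Generating the block from structured data like this makes it easy to stamp out one snippet per location from a single source of truth.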
-
Agreed. Building out sub-folders is the better strategy in the long run, because your client will be gaining more authority on their main domain as a result. A tip for those landing pages: embed a Google Map with the complete business address, along with links to local resources.
-
Creating sub-folders is better because you don't have to worry about your 20 websites looking like some sort of link farm. Plus, you build the overall brand with the main website. That said, you don't want 20 identical pages, one per location, on the main site. Each location page should have unique, original information about that location: who works there, what services they provide, and so on.
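The sub-folder layout described above can be sketched in a few lines. The domain and location names here are invented for illustration:

```python
import re

def slugify(name):
    """Lower-case a location name and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

# Hypothetical locations; each gets its own sub-folder on the one main domain,
# rather than its own standalone website.
locations = ["Manchester", "Newcastle upon Tyne", "Milton Keynes"]
urls = ["https://www.example.com/locations/%s/" % slugify(n) for n in locations]

for url in urls:
    print(url)
```

Every location page then inherits (and feeds back into) the authority of the single main domain.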
If you want to give the client more control, why not set up each of the location pages so that a location can log in and update its own information, much like updating a Google+ Local profile?
That said, if you give the client control of the listing/page/website, you run into the issue that clients often do a poor job of providing good information and/or undermine your SEO if you are trying to get those pages ranked.
I would suggest a hybrid solution: set up the pages for each location yourself, even interviewing each location to gather the information needed to make those pages genuinely information-rich. You can then take input from each location and build the pages from that material. If a location needs small edits or updates, you can make them (or not), since you still maintain editorial and SEO control.
I have managed a site with thousands of locations, and we found that the folder approach worked really well. We actually gave users access to update their location profiles, but the information they added was often, frankly, poorly written. SEO aside, some of these "self-edited" location profiles did not make me want to visit that location, the copy was so poor. It was not until we took more control of the content on location pages that we struck a good balance between original content from the locations and well-written pages with an eye to SEO.
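The hybrid workflow above, where franchisees submit copy but the central team keeps editorial control, could be sketched as a simple review gate. The thresholds and checks below are invented examples, not a prescription:

```python
def review_submission(copy, town):
    """Return a list of problems with franchisee-submitted page copy.

    An empty list means the copy can go to an editor for final polish;
    otherwise it is bounced back to the location with the reasons.
    """
    problems = []
    if len(copy.split()) < 150:           # too thin to be useful or to rank
        problems.append("copy is under 150 words")
    if town.lower() not in copy.lower():  # the page should mention its own town
        problems.append("copy never mentions %s" % town)
    if copy.isupper():                    # shouting is a common self-edit sin
        problems.append("copy is all caps")
    return problems

# A deliberately bad submission fails all three checks.
issues = review_submission("WE FIX COMPUTERS FAST", "Leeds")
for issue in issues:
    print(issue)
```

The point is simply that automated checks catch the worst self-edited copy before it is published, while a human editor still makes the final call.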