Creating 20+ websites with links back to central site
-
Hey guys,
A client of ours owns an IT company with 20+ locations across the UK. He is looking for a solution to provide each of those locations with a page or website that they can manage themselves and that links directly back to the main site.
His idea is to create 20+ one- or two-page websites that all link back to the main central site, with the aim of improving the chances of ranking well for locally-based search terms.
At the moment, we have a page for each of the 20+ locations on the main site. However, the client wants to give his franchisees complete control over their web presence.
Would a setup like this work? Would it be logical to have 20+ websites (likely to follow a very similar format) all pointing to one central website? Would we have to nofollow the links back to the main site to show we aren't trying to manipulate PageRank?
Would creating sub-folders on the main site be a better option for each of the 20+ locations?
Any feedback appreciated!
-
I would agree. Creating sub-folders is the best way to go.
I would add just a few quick points:
1. Add schema.org GeoCoordinates markup to your pages: http://schema.org/GeoCoordinates
2. Sign up for Yelp if it is applicable to your client's business locations.
3. Make sure the meta description for each page mentions the city/town name.
4. A simple one, more related to the site overall: set geographic targeting in Google Webmaster Tools.
5. Sign up for Google+ Local, Bing Places and Yahoo Local.
6. If the business is a fit for Yelp, it can most likely be listed on other review sites too. Submitting the business to review sites will generate more exposure for the brand, and good reviews can also surface in Google results, increasing CTR.
Good luck!
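As a minimal sketch of point 1, the geocoordinates can be wrapped in a schema.org LocalBusiness block in JSON-LD. The business name, address, and coordinates below are made-up placeholders; each location page would carry its own real values:

```html
<!-- Hypothetical example values; swap in each location's real details -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example IT Services - Manchester",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 High Street",
    "addressLocality": "Manchester",
    "addressCountry": "GB"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 53.4808,
    "longitude": -2.2426
  }
}
</script>
```

Keeping one block like this per location page gives each sub-folder its own unambiguous geographic signal.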
-
Agreed. Building out sub-folders is the better strategy in the long run, because your client will gain more authority on the main domain as a result. A tip for those landing pages: embed a Google Map with the complete business address, along with links to local resources.
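For the map embed, the safest route is Google Maps' own "Share > Embed a map" option, which generates a ready-made iframe for a specific business listing. A sketch of what that looks like, with a made-up address (the exact URL format Google generates may differ):

```html
<!-- Hypothetical address; use the iframe Google Maps generates for the real listing -->
<iframe
  src="https://www.google.com/maps?q=123+High+Street,+Manchester,+UK&output=embed"
  width="600" height="400" style="border:0"
  loading="lazy"></iframe>
```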
-
Creating sub-folders is better, as you do not have to worry about your 20 websites looking like some sort of link farm. Plus, you build the overall brand with the main website. That said, you do not want 20 identical pages, one per location, on the main website. You want unique, original information on each location page: details about that location, who works there, what services they provide, and so on.
If you want to give the client more control, why not set up each of the location pages so that someone at that location can log in and update the information? Much like updating a Google+ Local profile, you could set up a login for each location.
That said, if you give the client control of the listing/page/website, you run into the issue that clients often do a poor job of providing good information, and/or mess up your SEO if you are trying to get those pages ranked.
I would suggest a hybrid solution: set up the pages for each location yourself, even interviewing each location to gather the information needed to make those pages truly information-rich. You can then take input from each location and build the pages with that information. If a location needs small edits or updates, you can make them (or not), so you still maintain editorial and SEO control.
I have managed a site with thousands of locations, and we found that the folder approach worked really well. We actually gave users access to update their location profiles, but the information they added was often, frankly, poorly written. SEO aside, some of these "self-edited" location profiles were written so badly that they did not make me want to visit the location. It was not until we took more control of the content on location pages that we struck a good balance between original content from the locations and well-written pages with an eye to SEO.
Related Questions
-
Removing Toxic Back Links Targeting Obscure URL or Image
There are 2 or 3 URLs and one image file that dozens of toxic domains are linking to on our website. Some of these pages have hundreds of links from 4-5 domains. Rather than disavowing these links, would it make sense to simply break them: change the URLs that they link to and not create a redirect? It seems like this would be a surefire way to get rid of these links. Any downside to this approach? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
The images on our site are not found/indexed; it's been recommended we change how they are presented to Googlebot. Could this create a cloaking issue?
Hi,
We have an issue with images on our site not being found or indexed by Google. We have an image sitemap, but the images are served on the Sitecore-powered site within divs, which Google can't read. The developers have suggested the below solution.
For Googlebot:
    <img class="header-banner__image"
         src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"/>
For non-Googlebot:
    <noscript class="noscript-image">
      <div role="img"
           aria-label="Arctic Safari Camp, Arctic Canada"
           title="Arctic Safari Camp, Arctic Canada"
           class="header-banner__image"
           style="background-image: url('/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx?mw=1024&hash=D65B0DE9B311166B0FB767201DAADA9A4ADA4AC4');"></div>
    </noscript>
    <div aria-label="Arctic Safari Camp, Arctic Canada"
         title="Arctic Safari Camp, Arctic Canada"
         class="header-banner__image image"
         data-src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"
         data-max-width="1919" data-viewport="0.80"
         data-aspect="1.78" data-aspect-target="1.00"></div>
Is this something that could be flagged as potential cloaking, though, as we are effectively showing different code purely based on the Googlebot user agent? The devs have said that, via their contacts, Google advised them that the original way we set up the site is the most efficient and considered approach for the end user; however, they have acknowledged that Googlebot is not sophisticated enough to recognise this. Is the above solution the most suitable? Many thanks, Kate
Intermediate & Advanced SEO | KateWaite
-
B2B site targeting 20,000 companies with 20,000 dedicated "target company pages" on own website.
An energy company I'm working with has decided to target 20,000-odd companies via their own B2B website, by producing a new dedicated page per target company, each page including unique copy and a sales proposition (20,000-odd new pages to optimize! Yikes!). I've never come across such an approach before... what might the SEO pitfalls be (other than that it's a helluva number of pages to optimize)? Any thoughts would be very welcome.
Intermediate & Advanced SEO | McTaggart
-
What things that we might overlook help retain link juice on the site?
Since subscribing to Moz, I have been focusing a lot on some of the more technical aspects of SEO. The current thing I am finding interesting is stopping link juice leaks. Here is a selection of some of the things I have done:
1. Cloaked my affiliate links - see http://yoast.com/cloak-affiliate-links/
2. Removed some HTML-coded social share links within the theme, and replaced them with a JavaScript plugin (http://wordpress.org/plugins/flare/)
3. Used the Moz toolbar to view the site as Google sees it.
4. Removed some meta links at the bottom of blog posts (author etc.) that were duplicated.
Now, I don't intend to go over the top with this, as links to social accounts on each page are there to encourage engagement, but are there any tips people may have come across that are often overlooked? As examples of things that might be interesting to discuss: Are too many tags and categories bad? Do you index your tag and date archive pages? Does it matter?
Intermediate & Advanced SEO | Jonathan1979
-
Will an inbound follow link on a site be devalued by an inbound affiliate link on the same site?
Hey guys, quick question I didn't find an answer to online.
Scenario:
1. Site A links to Site B. It's a natural, regular, followed link.
2. Site A joins Site B's affiliate program and adds an affiliate link.
Question: Does the first, regular followed link get devalued by the second, affiliate link? Cheers!
Intermediate & Advanced SEO | ipancake
-
How to create an XML sitemap for a large website?
We need to create an XML sitemap for a website that has more than 2 million pages. Please suggest the best software for generating an XML sitemap at that scale. Since larger websites use different strategies to submit sitemaps, let me know the best way to submit a sitemap for a website of this size. Or is there any tool provided by SEOmoz for XML sitemap generation for larger websites?
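At this size the sitemap protocol itself dictates the structure: each sitemap file is capped at 50,000 URLs (and 50MB uncompressed), so 2 million pages need roughly 40 child sitemaps plus one sitemap index file that points at them. A minimal sketch of that split, with made-up filenames and base URL (a real site would stream URLs from a database rather than hold a Python list):

```python
import math
import xml.etree.ElementTree as ET

SITEMAP_LIMIT = 50000  # sitemap protocol cap: 50,000 URLs (and 50MB uncompressed) per file

def build_sitemaps(urls, base_url):
    """Split a URL list into child sitemaps of <= 50,000 URLs each,
    plus a sitemap index that points at them.
    Returns a dict mapping filename -> XML string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    files = {}
    index = ET.Element("sitemapindex", xmlns=ns)
    n_files = max(1, math.ceil(len(urls) / SITEMAP_LIMIT))
    for i in range(n_files):
        # One <urlset> file per chunk of up to 50,000 URLs
        urlset = ET.Element("urlset", xmlns=ns)
        for u in urls[i * SITEMAP_LIMIT:(i + 1) * SITEMAP_LIMIT]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        name = f"sitemap-{i + 1}.xml"
        files[name] = ET.tostring(urlset, encoding="unicode")
        # Register the child file in the index
        entry = ET.SubElement(index, "sitemap")
        ET.SubElement(entry, "loc").text = f"{base_url}/{name}"
    files["sitemap-index.xml"] = ET.tostring(index, encoding="unicode")
    return files
```

You would then submit only sitemap-index.xml in Webmaster Tools; the index references all the child files.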
Intermediate & Advanced SEO | DCISEO
-
Site revamp for neglected site - modifying site structure, URLs and content - is there an optimal approach?
A site I'm involved with, www.organicguide.com, was at one stage (long ago) performing reasonably well in the search engines. It was ranking highly for several keywords. The site has been neglected for a considerable period of time. A new group of people are interested in revamping the site: updating content, removing some of the existing content, and generally refreshing the site entirely. In order to go forward with the site, significant changes need to be made, likely including moving the entire site across to WordPress. The directory software currently being used (edirectory.com) was not designed with SEO in mind, and as a result numerous similar pages of directory listings (all with similar titles and descriptions) are in Google's results, albeit with very weak PA. After reading many of the articles/blog posts here, I realize that a significant revamp and some serious SEO work are needed. So, I've joined this community to learn from those more experienced. Apart from doing 301 redirects for pages that we need to retain, is there any optimal way of removing/repairing the current URL structure as the site gets updated? Also, is it better to make changes all at once, or is an iterative approach preferred? Many thanks in advance for any responses/advice offered. Cheers, MacRobbo
Intermediate & Advanced SEO | macrobbo
-
Should I link my similar sites together?
Hi. I currently have two sites in exactly the same market, and I've just purchased a third website from someone. Should I link these sites together? (i.e. should I cross-link them in the page header, or point two of them at the third?) If I do this, will it harm them if they are on the same C-Class IP block? Is using private domains and different hosting companies considered dodgy in any way? Basically I'm a big wimp and don't want to do anything that might potentially hurt my rankings ;)
Intermediate & Advanced SEO | Blendfish