I have a general site for my insurance agency. Should I create niche sites too?
-
I work with several insurance agencies, and I get this question several times each month. Most agencies offer personal and business insurance within a certain geographic area.
I recommend creating a quality general agency site, but would they have more success creating niche sites as well? For example, a niche site about home insurance and another about auto insurance.
What would your recommendation be?
-
I would have to agree. If you keep to a single domain, you don't have to spread your budget and effort across multiple domains, and any piece of content you create can then potentially benefit your entire domain.
-
I highly agree with Matt! Creating several other websites will divide your efforts, and you might not be able to achieve what you can within a single website.
When all the linking is under one domain, the domain authority as a whole will increase, which will help the subpages rank for the desired key phrases.
When you work on promotion and branding, it will help you get the word out and earn natural links from a variety of websites to a single domain, and your brand will grow (online and offline).
Speaking from my experience, insurance is not at all an easy industry, so even separate websites will require lots and lots of work. My suggestion is to keep subpages under one domain so that all your efforts point to one domain and the subpages can win the business accordingly.
-
I agree with Matt. Build great content and authority on one domain name and focus your strength and efforts there; don't divide them too much. That way, all the effort you put into this site helps and complements each and every page on the website. It works better long term.
-
I would probably recommend landing pages rather than separate sites.
yourinsurancesite.com/business
This way you keep the bulk of the SEO on one domain (as opposed to subdomains or niche-site domains). You also stay with one login for all edits, which helps streamline things. Then you can easily run campaigns to these main subfolders and track analytics per type of insurance more accurately.
I would say subfolders per agency would be the easiest and most logical SEO solution. There will be situations where this isn't necessarily the best approach (very big companies can usually afford to do proper SEO on more than one domain, and then having more domains can benefit you in the long run), but for most insurance agencies and this type of sub-agency, I would think subfolders would be best. One of my best friends runs his own State Farm agency, and they run it similarly.
http://www.statefarm.com/agent/US/STATE/TOWN/AGENT-NAME-UID
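To make the subfolder approach concrete, here is a minimal XML sitemap sketch for that kind of structure; the URLs are hypothetical and simply extend the yourinsurancesite.com example above:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap: one domain with one subfolder per line of insurance -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.yourinsurancesite.com/</loc></url>
  <url><loc>http://www.yourinsurancesite.com/business</loc></url>
  <url><loc>http://www.yourinsurancesite.com/home-insurance</loc></url>
  <url><loc>http://www.yourinsurancesite.com/auto-insurance</loc></url>
</urlset>
Each line of business then accumulates links and authority for the same domain rather than splitting it across separate sites.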
Related Questions
-
Links from a penalised site.
Hey Mozzers, Recently we have had a series of agencies in to pitch for work, and one group mentioned that, due to our association with a possibly penalised product review website, any links and activity associated with that brand would hinder our SEO. We currently have a good rating, but we are no longer pushing our customers to the site as we move to a new platform. The current link back from this website is also no-followed. Any thoughts on how this could impact us, and how the agencies determined that the site was penalised and causing us problems? Cheers, Tim
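For reference, the no-followed link described here would simply carry rel="nofollow" on the anchor tag; a minimal sketch with a hypothetical URL:
<!-- Hypothetical example of a no-followed link back to the brand site -->
<a href="http://www.example-brand.com/" rel="nofollow">Example Brand</a>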
Intermediate & Advanced SEO | TimHolmes
-
The images on our site are not found/indexed; it's been recommended we change how they are presented to Googlebot - could this create a cloaking issue?
Hi, We have an issue with images on our site not being found or indexed by Google. We have an image sitemap, but the images are served on the Sitecore-powered site as background images within <div> elements, which Google can't read. The developers have suggested the following solution. For Googlebot: markup that exposes the image file directly, with class="header-banner__image" and src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx". For non-Googlebot user agents: a <noscript class="noscript-image"> fallback containing <div role="img" aria-label="Arctic Safari Camp, Arctic Canada" title="Arctic Safari Camp, Arctic Canada" class="header-banner__image" style="background-image: url('/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx?mw=1024&hash=D65B0DE9B311166B0FB767201DAADA9A4ADA4AC4');"></div></noscript>, alongside the existing lazy-loading element with aria-label="Arctic Safari Camp, Arctic Canada", title="Arctic Safari Camp, Arctic Canada", class="header-banner__image image", data-src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx", data-max-width="1919", data-viewport="0.80", data-aspect="1.78" and data-aspect-target="1.00". Is this something that could be flagged as potential cloaking, though, as we are then effectively serving code that looks just for the Googlebot user agent? The devs have said that, via their contacts, Google has advised them that the original way we set up the site is the most efficient and considered way for the end user. However, they have acknowledged that Googlebot is not sophisticated enough to recognise this. Is the above solution the most suitable? Many thanks, Kate
Intermediate & Advanced SEO | KateWaite
-
Site: inurl: Search
I have a site that allows for multiple filter options, and some of these URLs have been indexed. I am in the process of adding the noindex, nofollow meta tag to these pages, but I want to get an idea of how many of these URLs have been indexed so I can monitor when they have been recrawled and dropped. The structure for these URLs is: http://www.example.co.uk/category/women/shopby/brand1--brand2.html The unique identifier for the multiple-filtered URLs is --, but I've tried using site:example.co.uk inurl:-- and this doesn't seem to work. I have also tried using regex, but still no success. Is there a way around this so I can get a rough idea of how many of these URLs have been indexed? Thanks
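For reference, the noindex, nofollow meta tag described above would go in the <head> of each filtered page; a minimal sketch:
<!-- Placed in the <head> of a filtered URL such as /category/women/shopby/brand1--brand2.html -->
<meta name="robots" content="noindex, nofollow">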
Intermediate & Advanced SEO | GrappleAgency
-
Sitemap for SmartPhone site
Hello, I have a smartphone site (e.g. m.abc.com). To my understanding, we do not need a mobile sitemap as it's not a traditional mobile site. Should I add the mobile site links to my regular www XML sitemap, or not bother adding them, since we already have rel=canonical (on m.abc.com) and rel=alternate (on the www site) in place pointing to the respective pages? Please suggest a solution. I really look forward to an answer, as I haven't found the "official" answer to this question anywhere.
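For reference, a minimal sketch of the bidirectional annotations described in the question, using the m.abc.com example and a hypothetical page path:
<!-- On the desktop page, e.g. http://www.abc.com/page.html -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.abc.com/page.html">
<!-- On the corresponding mobile page, http://m.abc.com/page.html -->
<link rel="canonical" href="http://www.abc.com/page.html">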
Intermediate & Advanced SEO | AdobeVAS
-
Merging Sites: Will redirecting the old homepage to an internal page on the new site cause issues?
I've ended up with two sites which have similar (but not duplicate) content and target similar keywords. Rather than trying to maintain two sites, I would like to merge them. The old site is more of a traditional niche site and targets a particular set of keywords on its homepage; the new site is more of an authority site with a magazine-type homepage and targets the same set of keywords from an internal page. My question is: should I redirect the old site's homepage to the relevant internal page on the new website, or should I redirect the old site's homepage to the new site's homepage? (The old site's homepage backlinks are a mixture of partial-match keyword anchor text, naked URLs and branded anchor text.) I am in two minds (a & b!): (a) Redirecting to the internal page would be great for ranking, as there are some decent backlinks and the content is similar. (b) But usually when you do a 301 redirect, the old homepage points to the new homepage, and some of the old site's links relate to the domain rather than the keyword (e.g. http://www.site.com), and some people will be looking for the site's homepage. What do you think? Your help is much appreciated (and I hope this makes sense...!)
Intermediate & Advanced SEO | lara_dar
-
Depth of Links on Ecommerce Site
Hi, In my sitemap, I have the preferred entrance pages and the URLs of categories and subcategories. But I would like to know more about how Googlebot and other spiders see a site - e.g. what is classed as a deep link? I am using the Screaming Frog SEO Spider, and it has a metric called "level", which represents how deep, or how many clicks away, a piece of content is. But I don't know if that is how Googlebot would see it. From what the Screaming Frog SEO Spider says, each move horizontally across from the navigation is another level, which visually doesn't make sense to me. Also, in my sitemap I list the URLs of all the products, and there are no levels within the sitemap. Should I be concerned about this? Thanks, B
Intermediate & Advanced SEO | bjs2010
-
Site Transfer and Downtime
If I want to transfer my website from Yahoo to another web host without having any downtime, how would I do that?
Intermediate & Advanced SEO | bronxpad
-
Changing Site URLs
I am working on a new client that hasn't implemented any SEO previously. The site has terrible URL nomenclature, and I am wondering if it is worth it to try and change it. Will I lose rankings? What is the best URL naming structure? Here's the website: http://www.formica.com/en/home/TradeLanding.aspx. (I am only working on the North America site.) Thanks!
Intermediate & Advanced SEO | AlightAnalytics