Categories in Places vs. Local
-
Say you are listed with both Google Places and Google+ Local. Places still allows custom categories, while Local limits you to preset categories. Which is the better strategy: to build service pages following the custom categories available in Places, or to build out service pages following the (allowed) preset categories in Local?
-
Hi Miriam,
I think budget issues are always a consideration. I'm with you: if the owner doesn't get that this is a long term effort to increase their business, then I tell them I can't help them.
I tell them that my job is to increase their business and that I need a commitment of time and money to do so. I generally try to get them to commit for a one year period and a budget large enough to actually accomplish something.
These kinds of customers are harder to find, but once I do find them, I have the time and money to get a result, and they tend to become more or less permanent customers.
My biggest challenge is explaining what I'm going to do and how this will result in more business.
-
Hi Wayne,
So glad to help! Thanks for suggesting a post on this topic - I'll think this over!
What I have experienced with customer resistance has typically revolved around budget. I've worked with a lot of very small local businesses over the years and they are concerned about being able to afford content development (particularly when they don't have the skills/resources to create it themselves). Sometimes, I have been able to offer an austerity plan to a local business owner - something like just one new page per month once the basic website has been built. Typically they can afford this. In a year, they've added 12 new pages.
This is nothing like some of the Local SEO projects going on out there where people are building hundreds of pages, of course, but it is at least some progress. The hope is that if the company starts getting more business from even a small web presence, they will eventually be able to afford to invest more in things like content development. But I do think there is a threshold for every business. If the owner can't make even a modest investment, he's not going to be able to utilize the web to promote his company. I've had to turn business owners away who just don't get this.
How about you, Wayne? Do you have clients with budget issues or does resistance to growth stem from something else?
-
Hi Miriam,
Great answer. I especially like your differentiation on "is" and "does" and agree 100% about the importance of mapping content to research and trying to cover as many bases as you can.
I've long been involved with national campaigns, but this is my first local job. I knew the categories would be crucial, but wasn't sure about the best approach. With your roadmap, I know exactly what to do.
You should consider writing a post on this. It's a crucial issue for local, and I bet a lot of people aren't getting it.
How do you deal with customer resistance to expanded/ongoing content?
-
Hi Wayne,
That's what I thought. Here's my take on this. There are two distinct types of keywords you can optimize a local business website for. I think of these as 'is' and 'does'.
Examples of 'Is' Keywords:
Plumber
Arborist
Dog Walker
Examples of 'Does' Keywords:
Plumbing
Tree Trimming
Dog Walking
In other words, one set says what a business 'is', the other set says what a business 'does'. Google has always wanted categories, whether custom created or chosen from their pre-set taxonomy, to reflect what a business is - never what it does. In optimizing a local business' website, it is certainly important to include these kinds of 'is' keywords in your work, but you shouldn't limit yourself to this. You'll want to include relevant 'does' keywords, too, because people certainly search both ways.
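To make the pairing concrete, here's a minimal sketch of how you might map 'is' keywords to their 'does' counterparts when planning service pages. The keyword pairs and the `plan_service_pages` function are hypothetical, purely for illustration; they are not any real Google taxonomy.

```python
# Hypothetical mapping of "is" keywords (what a business is) to
# "does" keywords (what it does). Illustrative pairs only.
IS_TO_DOES = {
    "plumber": "plumbing",
    "arborist": "tree trimming",
    "dog walker": "dog walking",
}

def plan_service_pages(is_keywords):
    """Build a simple page plan that targets both keyword types per service."""
    plan = []
    for is_kw in is_keywords:
        does_kw = IS_TO_DOES.get(is_kw)
        # Each service page should cover the category ("is") term plus
        # the activity ("does") term, since searchers use both.
        plan.append({"category": is_kw, "service": does_kw})
    return plan

pages = plan_service_pages(["plumber", "dog walker"])
```

The point of the sketch is simply that each service page gets both terms in its plan, so neither search pattern is left uncovered.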
Keyword research will, of course, be vital to determining which kinds of 'is' keywords are searched most often for a given business. I consider that Google's pre-set category taxonomy (the only one that will be available to any business once all listings have transitioned to the new Places for Business dashboard) gives us certain clues about how Google understands and organizes types of business. Since we care so much about Google rankings, we need to pay attention to what Google is signalling to us and incorporate these very big hints into our optimization. But we also need to build additional content that reflects all the other findings of our keyword research; there will typically be lots of 'does' terms in there.
This line of reasoning is why I have to be very frank with low-budget clients who come to me saying that they only want a one-page website, or a five-page website. There are so many keywords we should ideally be optimizing for, and there's no way to cram them all into a tiny site. Pretty much every local business, even if they have to start small, should have a plan for content development that will grow the size of their website to eventually include lots of different keywords. When you start small, though, Google's pre-set categories definitely need to be given consideration, in addition to the most searched terms discovered in your keyword research.
Long answer! But, you've asked a very good question, Wayne, and I wanted to give you a big-picture view of this, based on what I've experienced. Hope this helps!
-
Linda, I thought your answer did a good job of explaining certain facets of the two dashboards, even if it wasn't quite what Wayne was asking about.
-
Hi Miriam,
That's exactly what I'm asking: should I build out my website service pages per the custom categories available in Places, or per the preset categories available in Google+ Local?
-
So sorry, I was rushing to prep for a webinar. I think I totally misread the question, so please ignore my answer.
Somebody give me a big thumbs down. ;-(
-
Hi Wayne,
You've received some thoughtful replies here, but I'm not sure your question has been answered yet. You write:
"Which is the better strategy: to build service pages following custom services available in Places, or build out service pages following the (allowed) preset categories in Local."
I believe you may be talking about building out service pages on a website, not about altering a Google+ Local page. Is this correct? Are you asking whether your website-based service landing pages should be optimized to reflect the old custom category system or the new Google+ Local pre-set category taxonomy? Could you please clarify? I want to be certain you receive a helpful answer.
-
If by Local you mean you've merged with G+, then FYI: after you merge with G+, you're not supposed to edit anything in the old Places dashboard at all anymore, so there's really no "which is better" strategy. You're limited by which dashboard you're in and whether you've merged or not (and if you haven't merged, don't, BTW).
Plus, custom categories are going away soon. Google is upgrading everyone to the new Places dashboard, which, like Google+, does not allow custom categories. THEN they are auto-upgrading everyone to G+, so everyone will have a merged page, which again does not allow custom categories.
So both the old dashboard and custom categories will be gone soon. Sad...
-
I notice that competitors who set up Places accounts before Local (which I did not have the opportunity to do) do better in search for subcategories of specialties that aren't listed in the preset categories. I'm sure Google will eventually deal with this, but I would say that if you have the opportunity to include categories that aren't on the preset list, it may give you a leg up.