How to get local search volumes?
-
Hi Guys,
I want to get search volumes for "carpet cleaning" for certain areas in Sydney, Australia.
I'm using this process:
- Choose ‘Search for new keyword and ad group ideas’.
- Enter the main keywords for your product/service.
- Remove any default country targeting.
- Specify your chosen location(s) by targeting specific cities/regions.
- Click ‘Get ideas’.
The problem is that none of the areas, even popular ones (like North Sydney, Surry Hills, Newtown, Manly), show up in the Google keyword tool; it returns no matches.
Are there any other tools or data sources I can use to get accurate search volumes for these areas?
Any recommendations would be very much appreciated.
Cheers
-
Not every single area is going to be listed, so the best approach is to take the volume for the closest major city (in this case Sydney) and adjust by population.
So if Sydney had 10,000 searches for carpet cleaning and 20x the population of Manly, you could estimate Manly's volume at around 500 (10,000 / 20). The assumption is that search behaviour won't differ greatly between towns and suburbs in close proximity; a rough sketch of the arithmetic is below.
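If you want to apply the same estimate across several suburbs at once, here is a minimal Python sketch of that arithmetic (every volume and population figure below is an illustrative placeholder, not real data):

```python
# Population-scaling estimate sketch. All figures are placeholders.
city_volume = 10_000          # monthly searches for "carpet cleaning" in Sydney
city_population = 5_000_000   # placeholder Sydney population

suburb_populations = {        # placeholder suburb populations
    "North Sydney": 70_000,
    "Surry Hills": 16_000,
    "Newtown": 15_000,
    "Manly": 250_000,
}

for suburb, population in suburb_populations.items():
    estimate = city_volume * population / city_population
    print(f"{suburb}: ~{estimate:.0f} searches/month")
```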
I hope that helps
Regards
Nigel
Related Questions
-
How do internal search results get indexed by Google?
Hi all,

Most of the URLs created by the internal search function of a website/web shop shouldn't be indexed, since they create duplicate content or waste crawl budget. The standard approach is to 'noindex, follow' these pages, or sometimes to use robots.txt to disallow crawling of them.

The first question I have is how these pages would actually get indexed in the first place if you didn't use one of the options above. Crawlers follow links to index a website's pages. If a random visitor comes to your site and uses the search function, this creates a URL. There are no links leading to this URL, it is not in a sitemap, and it can't be found by navigating the website... so how can search engines index these URLs that were generated by an internal search function?

Second question: let's say somebody embeds a link on their website pointing to a URL on your website that was created by an internal search. Now let's assume you used robots.txt to make sure these URLs weren't indexed, which means Google won't even crawl those pages. Is it possible then that the link used on the other website will show an empty page after a while, since Google doesn't even crawl this page?

Thanks for your thoughts guys.
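One practical aside: you can check what a robots.txt file actually blocks with Python's standard-library robotparser. A minimal sketch, with a hypothetical domain and a hypothetical /search path:

```python
# Check whether internal-search URLs are blocked by robots.txt.
# The domain and paths are hypothetical examples.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in (
    "https://www.example.com/search?q=blue+widgets",  # internal search result
    "https://www.example.com/products/widgets",       # normal page
):
    ok = rp.can_fetch("Googlebot", url)
    print(url, "-> crawlable" if ok else "-> blocked")
```

Worth keeping in mind for the second question: a robots.txt disallow stops crawling, not indexing, so a blocked URL that picks up external links can still appear in Google's index as a bare URL without a snippet.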
Intermediate & Advanced SEO | Mat_C
-
404 vs 410 Across Search Engines
We are removing a large number of URLs permanently. We care about rankings in search engines other than Google, such as Yahoo/Bing, whose documentation doesn't even list the HTTP 410 status code as an option: https://docs.microsoft.com/en-us/bingmaps/spatial-data-services/status-codes-and-error-handling

Does anyone know how search engines other than Google handle 410 vs 404 status? For pages being permanently removed, John Mueller at Google has stated: "From our point of view, in the mid term/long term, a 404 is the same as a 410 for us. So in both of these cases, we drop those URLs from our index. We generally reduce crawling a little bit of those URLs so that we don't spend too much time crawling things that we know don't exist. The subtle difference here is that a 410 will sometimes fall out a little bit faster than a 404. But usually, we're talking on the order of a couple days or so. So if you're just removing content naturally, then that's perfectly fine to use either one."

Any information or thoughts? Thanks
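Whatever each engine does with the signal, serving a 410 is mechanically trivial in most stacks. Here is a minimal sketch using only Python's standard library, with hypothetical path prefixes standing in for the removed URLs:

```python
# Return 410 Gone for permanently removed URL prefixes, 404 otherwise.
# The prefixes are hypothetical examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

REMOVED_PREFIXES = ("/old-catalog/", "/discontinued/")

class GoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith(REMOVED_PREFIXES):
            self.send_error(410, "Gone")       # deliberate, permanent removal
        else:
            self.send_error(404, "Not Found")  # everything else in this stub

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), GoneHandler).serve_forever()
```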
Intermediate & Advanced SEO | sb1030
-
Mapping ALL search data for a broad topic
Hi All

As our company becomes a bigger and bigger entity, I'm trying to figure out how I can create more autonomy. One of the key areas that needs fixing is briefing the writers on articles based on keywords. We're not just trying to go after the low-hanging fruit or the big-money keywords; we want to comprehensively cover every topic and provide genuinely good, up-to-date info (surprisingly rare in a competitive niche) and eventually cover pretty much every topic there is. We generally work on a 3-tier system at the folder level: folders, then topics, then sub-topics.

The challenge is getting an agency to:

a) be able to pull all of the data without being knowledgeable in our specific industry. We're specialists and, thus, target people that need specialist expertise as well as more mainstream stuff (the stuff that run-of-the-mill people wouldn't know about).

b) know where it all fits topically, as we organise the content on a hierarchy basis, and we generally cover multiple smaller topics within articles.

Am I asking for the impossible here? It's the one area of the business I feel most nervous about creating autonomy with. Can we be as extensive and comprehensive as a wiki-type website without somebody within the business who knows the subject providing the keyword research?

I did a search for all data using the main two seed keywords for this subject on ahrefs and it came up with 168,000 lines of spreadsheet data. Obviously this went way beyond the maximum I was allowed to export.

Interested in feedback and, if any agencies are up for the challenge, do let me know! I've been using Moz Pro for a long time but have never posted, and I apologise if I'm explaining this badly.

Requirements:
- Keywords to cover all (broad niche) related queries in the UK; no relevant UK (broad niche) keywords will be missed.
- Organised in a way that can be interpreted as article briefs and folder-structure instructions.

Questions:
- How would you ensure you cover every single keyword?
- Assuming no specialist X knowledge, how will you be able to map content and know which search queries belong in which topics and in what order? Also (where there is keyword leakage from other regions), how will you know which are UK terms and which aren't?
- With minimal X knowledge, how will you know whether you've missed an opportunity or not (what you don't know you don't know)?
- What specific resources will you require from us in order for this to work?
- What format will the data be provided in, and how will you present the finished work so that it can be turned into article briefs?
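As a starting point for the mapping problem (not a replacement for specialist review), a seed-term bucketing pass can sort most of a large export automatically and surface the rows that genuinely need an expert. A sketch, where the topic map and CSV column names are hypothetical:

```python
# Bucket a large keyword export into a folder > topic hierarchy by
# seed-term matching. Topic map and CSV columns are hypothetical.
import csv
from collections import defaultdict

TOPIC_MAP = {  # hypothetical folder -> topic -> trigger terms
    "folder-a": {
        "topic-1": ["seed term", "another seed"],
        "topic-2": ["different term"],
    },
}

buckets = defaultdict(list)
unmatched = []  # rows that need a specialist's eye

with open("keyword_export.csv", newline="") as f:
    for row in csv.DictReader(f):       # assumes a "Keyword" column
        kw = row["Keyword"].lower()
        for folder, topics in TOPIC_MAP.items():
            topic = next(
                (t for t, terms in topics.items()
                 if any(term in kw for term in terms)),
                None,
            )
            if topic:
                buckets[(folder, topic)].append(kw)
                break
        else:  # no folder matched this keyword
            unmatched.append(kw)

for (folder, topic), kws in buckets.items():
    print(f"{folder}/{topic}: {len(kws)} keywords")
print(f"unmatched: {len(unmatched)}")
```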
Intermediate & Advanced SEO | d.bird
-
How do you make short-tail keyword determinations, combined with long-tail, when there is not enough search volume to provide info?
Confusing question; allow me to elaborate.

We have a few pages that target a particular doctor, for example. One of those pages is about his background. His short tail is his name, "Dr Irving Weiss" for example: low competition, of course, and already too low in search volume to show in the Google keywords tool (which I know isn't the best tool). One of his tab pages addresses his medical background (credentials, schools, awards, sanctions). If you search for those single keywords alone you get something like this:

- hospitals: 74,000
- background: 673,000
- credentials: 49,000

But that doesn't necessarily mean more people will search for "dr irving weiss background" than "dr irving weiss credentials" just because "background" has more searches. Both "dr irving weiss background" and "dr irving weiss credentials" are way too low in search volume to have any data. So how can you come to a proper keyword-targeting conclusion when the data is not there?

THANKS IN ADVANCE for any insight!
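One workable proxy when exact long-tail volumes are all zero: count how often each modifier appears across a wider export of comparable long-tail queries (for many doctors, not just this one) and use the relative frequencies as a rough demand signal. A sketch with made-up data:

```python
# Rank modifiers by how often they appear across comparable long-tail
# queries. The keyword list is made-up illustrative data.
from collections import Counter

MODIFIERS = ["background", "credentials", "reviews", "hospital"]

keywords = [  # hypothetical rows from a keyword export for many doctors
    "dr jane smith reviews",
    "dr john doe credentials",
    "dr jane smith background",
    "dr john doe reviews",
]

counts = Counter(
    modifier
    for kw in keywords
    for modifier in MODIFIERS
    if modifier in kw
)

for modifier, n in counts.most_common():
    print(f"{modifier}: appears in {n} queries")
```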
Intermediate & Advanced SEO | irvingw
-
I need help with a local tax lawyer website that just doesn't get traffic
We've been doing a little bit of link building and content development for this site on and off for the last year or so: http://www.olsonirstaxattorney.com/

We're trying to rank her for "Denver tax attorney," but in all honesty we just don't have the budget to hit the first page for that term, so it doesn't surprise me that we're invisible. However, my problem is that the site gets almost NO traffic. There are days when Google doesn't send more than 2-3 visitors (yikes). Every site in our portfolio gets at least a few hundred visits a month, so I'm thinking that I'm missing something really obvious on this site.

I would expect that we'd get some type of traffic considering the amount of content the site has (about 100 pages of unique content, give or take) and some of the basic link building work we've done (we just got an infographic published to a few decent-quality sites, including a nice placement on the lawyer.com blog). However, we're still getting almost no organic traffic from Google or Bing. Any ideas as to why? GWMT doesn't show a penalty, doesn't identify any site health issues, etc.

Other notes:

- Unbeknownst to me, the client had cut and pasted IRS newsletters as blog posts. I found out about all this duplicate content last November, and we added "noindex" tags to all of those duplicated pages.
- The site has never been carefully maintained by the client. She's very busy, so adding content has never been a priority, and we don't have a lot of budget to justify blogging on a regular basis AND doing some of the link building work we've done (guest posts and infographic).
Intermediate & Advanced SEO | JasonLancaster
-
Local SEO for a community.
How would one best go about local SEO for a townhome community? It seems to fall in between traditional informational SEO and the brick-and-mortar, G+ page model. There seems to be no way to apply the NAP, directory, and traditional citation model to the region they build in. Any thoughts?
Intermediate & Advanced SEO | AESEO
-
Soft Hyphenation: Influence on Search Engines
Does anyone have experience with soft hyphenation and its effects on rankings? We are planning to use it in our company blog to improve the layout. Currently, every word above 4 syllables will be soft hyphenated. This seems to render okay in all browsers, but it might be a problem in IE9...

In HTML5, the "&shy;" soft hyphen seems to be replaced by the <wbr> tag (http://www.w3schools.com/html5/tag_wbr.asp), and I can't find anything else about soft hyphenation in the specs. Any experiences or opinions about this? Do you think it affects rankings if there are a lot of soft hyphens in the text? Does it still make sense to use "&shy;", or would you switch to <wbr> already?

Intermediate & Advanced SEO | zeepartner
-
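A side note on generating the markup for the soft-hyphenation question above: the &shy; entity is just the character U+00AD, and a hyphenation library like pyphen (pip install pyphen) can insert it. A minimal sketch; it approximates the 4-syllable rule with a simple word-length cutoff, which is an assumption for illustration:

```python
# Insert U+00AD soft hyphens (the character behind &shy;) with pyphen.
# The 4-syllable rule is approximated by a word-length cutoff here.
import pyphen

dic = pyphen.Pyphen(lang="en_US")

def soft_hyphenate(text: str, min_length: int = 10) -> str:
    return " ".join(
        dic.inserted(word, hyphen="\u00ad") if len(word) >= min_length else word
        for word in text.split()
    )

print(soft_hyphenate("incomprehensibilities are unnecessarily long"))
```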
How Fast Is Too Fast to Increase Page Volume of Your Site
I am working on a project that is basically a site listing apartments for rent (similar to apartments.com or rent.com). We want to add a bunch of amenity pages, price pages, etc., increasing the page count on the site and helping users find more pages relevant to their searches and long-tail phrases. An example page would be "Denver apartments with a pool", and "Seattle apartments under 900" would be another, etc. By doing this we will take the site from about 14,000 pages to over 2 million by the time we add a list of amenities to every city in the US; the sketch below shows how quickly those combinations multiply.

My question is: should I worry about releasing them over time? Meaning, do you think we would get penalized for launching that many pages overnight, or over the course of a week? How fast is too fast to increase the content on your site? The site is about a year old, and we are not trying to scam anything, just looking to improve site functionality and page volume. Any advice?
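To give a sense of the combinatorics being described, here is a tiny Python sketch of how city x modifier landing pages multiply; all lists and slug patterns are placeholders:

```python
# Generate city x modifier landing-page slugs. Lists are placeholders.
from itertools import product

cities = ["denver", "seattle", "austin"]
modifiers = ["with-a-pool", "pet-friendly", "under-900", "under-1200"]

pages = [f"/{city}-apartments-{mod}" for city, mod in product(cities, modifiers)]

print(len(pages), "pages, e.g.", pages[0])
# 3 cities x 4 modifiers = 12 URLs here; with every US city and a full
# amenity list, the count multiplies into the millions very quickly.
```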
Intermediate & Advanced SEO | ioV