Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Wordtracker vs Google Keyword Tool
-
When I find keyword opportunities in Wordtracker, I'll sometimes run them through the AdWords Keyword Tool, only to find that Google says these keywords have 0 search volume. Would you use these keywords even though Google says users aren't searching for them?
-
To specifically answer your question about the differences between WordTracker and the AdWords keyword tool, I examined the WordTracker site. I performed a keyword search for the phrase "depression and bipolar link". It showed 34 searches. To better understand what that result meant, I searched the site and located the following explanation:
"For the Wordtracker data, the Search count is the number of times each keyword appears in our database of searches over the past 365 days. This constitutes just under 1% of all US search, and the data is gathered from metacrawler.com and dogpile.com."
There are two key differences between the AdWords and WordTracker data. AdWords clearly has a much larger data source, so it should be more accurate. Also, AdWords presents monthly searches, whereas WordTracker reports searches over the past year. Dividing WordTracker's 34 yearly searches by 12, the AdWords result for "depression and bipolar link" works out to about 3 monthly searches. Since the result is less than 100, Google rounds it to 0.
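A quick sketch of that arithmetic, using the figures quoted above:

```python
# WordTracker reports yearly counts; AdWords reports monthly ones.
wordtracker_yearly = 34
monthly = wordtracker_yearly / 12  # roughly 2.8 searches per month

# Per the explanation above, volumes under 100 are rounded down to 0.
displayed = monthly if monthly >= 100 else 0
print(f"estimated monthly searches: {monthly:.1f}, AdWords displays: {displayed}")
```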
You are reaching for very long-tail phrases, and you will capture other keywords and shorter phrases in the process.
For example, while AdWords shows no traffic for "depression and bipolar link", the phrase "depression and bipolar" shows 165k monthly searches with medium competition. If I were to create a page, I would focus the article on "depression and bipolar". If you really wish to keep the focus on "depression and bipolar link", you can do so, knowing you will capture traffic from other versions of the phrase.
-
Here are a couple that show fairly decent search volume on Wordtracker and 0 on the AdWords Keyword Tool:
multiple sclerosis links with bipolar disorder
ank3 and bipolar disorder
depression and bipolar link
Thanks!
-
Can you share an example?
Related Questions
-
M.ExampleSite vs mobile.ExampleSite vs ExampleSite.com
Hi, I have a call with a potential client tomorrow where all I know is that they are wigged-out about canonicalization, indexing, and architecture for their three sites: m.ExampleSite.com, mobile.ExampleSite.com, and ExampleSite.com. The sites are pretty large: 350k pages for the mobiles and 5 million for the main site. They're a retailer with endless products. Their main site is not mobile-responsive, which is evidently why they have the m and mobile sites. Why two, I don't know. This is how they currently handle this:

What would you suggest they do about this? The most comprehensive fix would be making the main site mobile-responsive and 301ing the old mobile subdomains to the main site. That's probably too much work for them. So, what more would you suggest, and why? Your thoughts?

Best... Mike

P.S. Beneath my hand-drawn portrait avatar above it says "Staff" at this moment, which I am not. Some kind of bug, I guess.

Intermediate & Advanced SEO | 94501
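The consolidation decision aside, here is a hedged sketch of the separate-URLs annotation Google documents for desktop/mobile page pairs, using the question's placeholder domains and assuming a single mobile host (the page path is illustrative):

```html
<!-- On the desktop page: point to the mobile equivalent. -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.examplesite.com/product-page">

<!-- On the mobile page: canonicalize back to the desktop URL. -->
<link rel="canonical" href="http://www.examplesite.com/product-page">
```

Under this pattern, the second mobile subdomain would normally be 301-redirected into whichever mobile host is kept.

-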
Hreflang in <head> vs. sitemap?
Hi all, I decided to identify alternate language pages of my site via the sitemap to save our development team some time. I also like the idea of having leaner markup. However, my site has many alternate language and country page variations, so after creating a sitemap that includes mostly tier 1 and tier 2 level URLs, I now have a sitemap file that's 17 MB.

I did a couple of Google searches to see if sitemap file size can ever be an issue, and found a discussion or two that suggested keeping the size small, plus a really old article that recommended keeping it under 10 MB. Does the sitemap file size matter? GWT has verified the sitemap and appears to be indexing the URLs fine. Are there any particular benefits to specifying alternate versions of a URL in <head> vs. the sitemap?

Thanks, -Eugene

Intermediate & Advanced SEO | eugene_bgb
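For reference, the sitemap form of hreflang looks like the sketch below (domains and language codes assumed for illustration). Each URL entry must list every variant, including itself, which is why these sitemaps balloon in size; splitting the file and listing the pieces in a sitemap index is the usual workaround, and Google currently documents a limit of 50,000 URLs or 50 MB uncompressed per sitemap file.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/page.html</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page.html"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page.html"/>
  </url>
  <url>
    <loc>https://www.example.com/de/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page.html"/>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page.html"/>
  </url>
</urlset>
```

-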
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.

We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.

We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)

Noindex advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)

Noindex disadvantages:
- Difficult to implement: vehicle details pages are served using Ajax, so they have no <head> tag. The solution would have to involve an X-Robots-Tag HTTP header and Apache, sending a noindex header based on query-string variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.

Hash (#) URL advantages:
- By using hash (#) URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff

Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?

Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.

If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these.

Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.

Intermediate & Advanced SEO | browndoginteractive
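One of the noindex disadvantages above mentions an X-Robots-Tag header sent from Apache based on query-string variables. A hedged sketch of what that could look like (the vehicle_id parameter name is assumed for illustration, not taken from the actual plugin):

```apache
# Hypothetical .htaccess sketch: send "noindex" for Ajax-served
# vehicle details requests, identified by an assumed query parameter.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Flag requests whose query string contains vehicle_id=...
    RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
    RewriteRule .* - [E=NOINDEX_PAGE:1]
</IfModule>

<IfModule mod_headers.c>
    # Emit the header only for flagged requests.
    Header set X-Robots-Tag "noindex, follow" env=NOINDEX_PAGE
</IfModule>
```

-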
Google Ranking Generally in Germany - Keywords & Umlauts
Hi Mozzers, I was hoping I could get some advice/opinions on a website ranking problem I have been working on, in particular one of the pages. This is our German language website, which is hosted from Germany, and a fluent German-speaking member of staff from our German office moderates the text content of the website for us. Our website seems to get good traffic, visitor navigation, and conversions.

One of the keywords I focus on building around is Schallpegelmessgerät, which is one way of basically saying "sound level meter" in German. The keyword uses an umlaut, which I cannot use in the URL, but Google is picking it up and putting it into the snippets; apart from that, our on-page optimization is good according to the Moz tool.

I have been trying to improve our content and we post many blog articles around the topic/keyword, but google.de seems to choose not to even display this on the first couple of pages, and sometimes ranks our blog articles around the third page. We are even being outranked by some low-quality cheap online-shop websites, some of which have low-quality content and low page and domain authorities. I had accepted this, but after looking at bing.de and doing a search, I find our page in the top 5 results. I understand that Google's and Bing's algorithms are different, but I'm just struggling to get my head around it all.

Here is our website & page: http://www.cirrusresearch.de/produkte/schallpegelmessgerat/

Any advice on this situation would be greatly appreciated. Thank you very much for reading this. James

Intermediate & Advanced SEO | Antony_Towle
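On the mechanical umlaut point only: the page above drops the umlaut mark in its URL (schallpegelmessgerat), while the conventional German transliteration is ae/oe/ue/ss, matching how umlauts are typed on ASCII keyboards. A minimal sketch of that convention, assuming nothing about the site's CMS:

```python
# Conventional German transliteration for URL slugs (hypothetical
# helper, not taken from the questioner's site).
GERMAN_CHARS = {"ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss"}

def german_slug(keyword: str) -> str:
    slug = keyword.lower()
    for char, replacement in GERMAN_CHARS.items():
        slug = slug.replace(char, replacement)
    return slug.replace(" ", "-")

print(german_slug("Schallpegelmessgerät"))  # schallpegelmessgeraet
```

-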
Google places keyword variations
Hi all, I have a site that is ranking #1 in Google Places for its main [city] + [keyword] search, but it does not rank for any of its basic keyword variations, which I find very confusing. For example:

- Chicago Caterer (ranked #1 in Google Places)
- Chicago Caterers (not ranked in Google Places)
- Chicago Catering (not ranked in Google Places)
- Chicago Catering Company (not ranked in Google Places)
- Chicago Catering Companies (etc.)

How can I secure a Google Places ranking for these simple keyword variations? Do I build links to the Google Plus page using that anchor text? Do I get citations that contain that keyword somewhere on the page? Do I optimize for these keyword variations on the actual website itself (not the Places listing)? Obviously I don't stuff these keywords into the Google Places listing. Any help would be much appreciated!

Intermediate & Advanced SEO | x2264983x

-
Help - .ie vs .co.uk in google uk
We have a website that for years has attracted a high level of organic searches and had a very high level of links. It has the .ie extension (Ireland) and did very well when competing in its niche market on google.co.uk. We have the same domain name but in .co.uk format, and basically redirected traffic to it when people typed in .co.uk instead.

Since the latest Panda update, we have noticed that the number of organic visits has dropped to a quarter of what it was, and this is continuing to go down. We have also noticed that the .ie version is no longer listed in Google and has been replaced by .co.uk. As we've never exchanged or submitted links for the .co.uk domain, this means the only links indexed in Google point to the .ie domain. Is there any way I can get Google to re-index the site using the .ie domain rather than the .co.uk domain? I am hemorrhaging sales now and becoming a much more withdrawn person by the day!!!

P.S. The .co.uk domain is set up as a domain alias in Plesk, with both the .ie and .co.uk domain DNS pointing to the same IP address.

Kind Regards
Steve

Intermediate & Advanced SEO | rufo
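A common fix for the alias setup described above is a host-level 301 so that only one domain is ever served. A minimal Apache sketch, with example.ie / example.co.uk standing in for the real domains:

```apache
# Hypothetical sketch: consolidate on the .ie domain by 301-redirecting
# every request that arrives on the .co.uk alias to the same path on .ie.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.ie/$1 [R=301,L]
```

-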
Google and keywords with and without accents. How to approach optimization for both?
This is more of a problem for people optimizing for keywords in Spanish, French, German, and such. It is well known that SERPs for keywords with and without accents are different. However, I haven't been able to discover how to make the unaccented, technically misspelled keywords rank without messing up the site's content. Another fact to take into account is that more than half the searches made in these languages are done without accents because, let's face it, it's just too much work.

An example of my specific problem: the misspelled keyword "cursos de ingles" is currently ranking higher than the correctly spelled keyword "cursos de inglés". However, the misspelled keyword "clases de ingles" is not ranking at all, and the correctly spelled keyword "clases de inglés" is on the first page. How is this possible? Now, how can I optimize for the misspelled keywords to rank higher without misspelling the content on my site? Thank you!

Intermediate & Advanced SEO | 7decode
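As a research aid for tracking both spellings (not an on-page tactic), a minimal sketch for generating the unaccented variant of any keyword list:

```python
import unicodedata

def strip_accents(text: str) -> str:
    # Decompose each character (é -> e + combining accent),
    # then drop the combining marks.
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

for kw in ["cursos de inglés", "clases de inglés"]:
    print(kw, "->", strip_accents(kw))
# cursos de inglés -> cursos de ingles
# clases de inglés -> clases de ingles
```

-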
How to Target Keyword Permutations
I have a client that wants to rank for a keyword phrase that has many permutations, e.g. "Alaska Hill Country Resort", "Hill Country Resort Alaska", "Hill Country Alaska Resort". But I'm wondering if I should target these all on the same page or not. I'm assuming all of these permutations are actually valid searches, because I did my keyword research for 'exact match' keywords and got results like this (let me know if I'm missing something here, or if this sounds right):

- [Alaska Hill Country Resort] - 230 local searches
- [Hill Country Resort Alaska] - 140 local searches
- [Hill Country Alaska Resort] - 30 local searches

The phrase we're targeting is their main keyword phrase, so I've chosen their home page as the page to rank for this phrase. My thought is to optimize for the most popular phrase (e.g. "Alaska Hill Country Resort") and sprinkle in the other phrases throughout the copy. Next I would run a link-building campaign targeting the main phrase first, then the next phrase, and so on, so that my anchor text is more heavily focused on the more popular terms, but I would also make sure to include the less popular terms.

Do you think this is the best way to go about this? Do I really need to make individual pages for each of the permutations, or is it okay to target them all on one page since they are essentially the same keyword?

Intermediate & Advanced SEO | ATMOSMarketing56
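A trivial sketch for enumerating the orderings to research, treating "Hill Country" as one unit (an assumption based on the examples above):

```python
from itertools import permutations

# "Hill Country" is kept as a single token, per the question's examples.
terms = ["Alaska", "Hill Country", "Resort"]

for ordering in permutations(terms):
    print(" ".join(ordering))
# Prints all 6 orderings, including "Alaska Hill Country Resort",
# "Hill Country Alaska Resort", and "Hill Country Resort Alaska".
```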