How do Moz tools handle operators for "PHRASE" and [EXACT MATCH] KW queries?
-
Hello,
As some of my projects are in competitive niche markets, I often work with exact match keywords.
However, when using 'Moz >> Keyword Difficulty Report' I get the SAME search volume result regardless of whether I use broad match, phrase, or exact match keywords. Have I missed something?
Ranking results, however, are different and seem to correspond to the type of keyword search used.
Please shed some light on this.
-
The reply to the question is in reference to the 'Keyword Difficulty Tool'. For my own clarification, does the statement "...our system automatically assumes whatever you enter as exact match by default." apply to all the Moz tools, specifically the 'Keyword Ranking Reports'?
Searching on Google for the following, phase1tech.com comes up in the positions indicated:
CameraLink Cameras (#12)
"CameraLink Cameras" (#5)
[CameraLink Cameras] (#12) (it is my understanding that you can't do an exact match search on Google)
MOZ's 'Keyword Ranking Tool/Report' indicates they are at position #5. As such, I assume the results are based on "exact match".
If that is the case, I am wondering how many users actually use quotes when performing their searches.
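As a minimal illustration (a sketch, not anything built into Moz), the snippet below just constructs the three query forms compared above and the corresponding Google search URLs; the bracketed form is AdWords-style exact-match notation rather than a Google web search operator, so it effectively behaves like the plain broad query.

```python
from urllib.parse import quote_plus

keyword = "CameraLink Cameras"

# The three forms compared above. Quotes ask Google for a phrase match;
# square brackets are AdWords-style exact-match notation, not a Google
# web search operator, so that form behaves like the plain broad query.
variants = {
    "broad": keyword,
    "phrase": f'"{keyword}"',
    "exact (AdWords notation)": f"[{keyword}]",
}

for label, query in variants.items():
    print(f"{label:>26}: https://www.google.com/search?q={quote_plus(query)}")
```

Opening each URL in a browser is simply the manual version of the #12 / #5 / #12 comparison described above.
-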
Hi there!
Thanks for reaching out to us! That's a great question. What you are seeing in the Keyword Difficulty search is our tool passing an exact match query through our funnel to pull your data, which is how our tool queries the search engines. Since we don't do broad matching on our end, that's the reason you are seeing identical search volumes. So while you are correct in using exact match quotation marks, our system automatically assumes whatever you enter is exact match by default.
That is, of course, what I am seeing in some of the ranking results in your account, where small differences such as "woman" vs. "women" will pull up different results because they are different words. Since our tools use exact match as a default, there is no need to use quotation marks or brackets to direct us to pull exact matches.
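To make that behavior concrete, here is a minimal sketch of what "treating every input as exact match" can mean in practice, assuming a hypothetical normalize_keyword helper rather than Moz's actual implementation: match-type operators are stripped on the way in, while genuinely different words stay distinct.

```python
import re

def normalize_keyword(raw: str) -> str:
    """Hypothetical illustration (not Moz's code): treat every input as an
    exact match keyword by stripping match-type operators."""
    keyword = raw.strip()
    # Drop surrounding phrase-match quotes or AdWords-style exact-match brackets.
    keyword = re.sub(r'^["\[]+|["\]]+$', "", keyword).strip()
    # Collapse whitespace; the words themselves stay exactly as entered,
    # which is why "woman" and "women" still report differently.
    return re.sub(r"\s+", " ", keyword).lower()

# The plain, quoted, and bracketed forms all resolve to the same exact-match keyword...
assert (normalize_keyword('"CameraLink Cameras"')
        == normalize_keyword("[CameraLink Cameras]")
        == normalize_keyword("CameraLink Cameras"))
# ...while genuinely different words remain different keywords.
assert normalize_keyword("woman") != normalize_keyword("women")
```

The point is only that adding quotes or brackets changes nothing because they are removed before the lookup, not that Moz rewrites your keywords in exactly this way.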
Best,
Peter
Moz Help Team
Related Questions
-
Bing SEM Traffic showing up as Organic in Moz?
Hi there, everyone! First time poster. 🙂 One of my clients has an ads string showing up in organic search. It is: s=bing_search&p=none . If I block it in robots.txt, then I won't see any traffic results.
Moz Pro | TopGrowthHacker.com
-
My Moz campaign is not updating completely
I created a new account and campaign last week, but so far my Moz campaign has not updated things like keyword rankings, competitor website links, etc. Why is that? Please update me.
Moz Pro | Surabhi_Dewra
-
Why is Moz Reporting These as Duplicate Page Titles?
Our most recent Moz crawl campaign is reporting 931 duplicate page title errors, most of which are "Product Review" pages like the following. Although there is only one review on this page, http://www.audiobooksonline.com/Cell_Stephen_King_unabridged_compact_discs.html, Moz is reporting 15 duplicate page titles, four of which I present below.
http://www.audiobooksonline.com/reviews/review.php/full/0743554337/0/name/desc
http://www.audiobooksonline.com/reviews/review.php/full/0743554337/0/rating/asc
http://www.audiobooksonline.com/reviews/review.php/full/0743554337/0/rating/desc
http://www.audiobooksonline.com/reviews/review.php/full/0743554337/0/state/asc
Why is Moz reporting these "pages" as duplicate page title errors? Are these errors hurting our SEO? How to fix?
Moz Pro | lbohen
-
Pro newbie - What is a "campaign"?
I signed up three weeks ago and the first email to me begins with: "The first step in PRO is to set up your campaign. Once we start tracking your site and social media accounts, you'll start receiving handy reports of your data. You can also surface site issues right in the app to help you prioritize which fixes you can make for immediate results." At this point I don't know what kind of "campaign" this is or what the "app" is. I have also lodged a ticket because something seems to be broken about my Pro status. I can obviously participate here, so I have a Pro status, but at OSE it says "Social metrics only available to paid Moz subscribers. Learn more", and I hit my Advanced Report limit for the day without seeing any such report.
Moz Pro | trainSEM
-
Crawl test from tools
Hi, I notice that the Crawl Test in the Research Tools doesn't really run a new crawl, even though there are two crawls per day. It only provides the data that was already acquired from the crawl diagnostics in my Pro account. There is no point in me getting the same data I already get from my crawl diagnostics, is there? Even if SEOmoz provided more than two crawls per day, it would be useless in this case. The whole thing doesn't make sense, as the crawl diagnostics only perform a full crawl once every week, but even the crawl test isn't helping me out at all.
Moz Pro | hanzoz
-
Tool for tracking actions taken on problem urls
I am looking for tool suggestions that assist in keeping track of problem URLs, the actions taken on those URLs, and the tracking and testing of a large number of errors gathered from many sources.
What I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and import them into a centralized DB that shows all of the actions that need to be taken on each URL while removing duplicates, since each tool finds a significant amount of the same issues.
Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.
I would also like to see historical information on each URL, such as whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. 4XX or 5XX error) and is now fixed.
Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow with updating their issues summary, I would like not to import duplicate issues (so the tool should recognize that the URL is already in the DB and that the issue has been resolved).
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Bonus bonus for any tool that is smart enough to check and mark issues as resolved as they come in (for instance, if a URL has a 403 error, it would check on import whether it still resolves as a 403; if it does, it would add it to the issue queue, and if not, it would be marked as fixed).
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created from using multiple tools? Thanks!
Moz Pro | prima-253509
-
My tools have stopped working.
I just started my 30 day trial a week ago. Yesterday, the toolbar stopped showing PA or DA, and none of the tools will load for a page. I've tried logging back in several times and it hasn't helped.
Moz Pro | HealingCrystal
-
Is there a basic overview of the SEOmoz Pro tools?
I'm surprised there is no simple overview of all the features anywhere, or a guide on how to use the tools. There's a nice intro video and a good beginners guide, then BAM! - straight into the Pro tools (it's a bit daunting for someone learning!). Or at least none that I can find... Any help or pointers would be great 🙂
Moz Pro | seanuk