Looking For URL Anchor Text Metrics Definitions
-
Running some keyword difficulty reports that are showing some interesting data around URL Anchor Text Metrics. But to fully understand them, I need some definitions, which I cannot find anywhere.
So can someone point me to definitions of these terms:
Exact Anchor Text Links
% Links w/ Exact Anchor Text
Linking Root Domains w/ Exact Anchor Text
% Linking Root Domains w/ Exact Anchor Text
Partial Anchor Text Links
% Links w/ Partial Anchor Text
Partial Anchor Text Root Doms.
% Linking Root Domains w/ Partial Anchor Text
Also, if, say, Exact Anchor Text Links is bolded purple, does that mean that URL has more Exact Anchor Text Links than any other URL in the report?
Thanx
David
-
David,
I'm going to answer this question first part last and last part first.
If you see what appears to be a highlighted purplish glow in the comparisons, regardless of whether it's about anchor text or not, it symbolizes that the URL has a better position in the search engines, or is the preferred choice, more optimized in other words: the winner of the comparison.
Are you having trouble because you're seeing cut-off anchor text? Use Open Site Explorer and look for the tab that says Anchor Text. There are many other ways to find what you're asking for; please use the links below, and see the sketch after them for how these metrics are commonly calculated.
http://pro.seomoz.org/tools/crawl-test
http://www.opensiteexplorer.org/
http://www.seomoz.org/beginners-guide-to-seo
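As for the definitions themselves, I haven't found an official glossary either, but these metrics are commonly understood as follows: "exact" means a link's anchor text is identical to the target keyword, "partial" means the keyword appears somewhere within the anchor text, the "root domains" variants count unique linking root domains rather than individual links, and the percentages are taken against total links and total linking root domains respectively. Here is a rough sketch of that interpretation; the data shape and function are made up for illustration, and whether exact matches also count toward partial varies by tool:

```python
# Illustrative only: one common reading of the eight metrics listed above.
def anchor_text_metrics(backlinks, keyword):
    """backlinks: list of (linking_root_domain, anchor_text) tuples for one URL."""
    kw = keyword.strip().lower()

    total_links = len(backlinks)
    total_domains = len({domain for domain, _ in backlinks})

    exact = [(d, a) for d, a in backlinks if a.strip().lower() == kw]
    partial = [(d, a) for d, a in backlinks if kw in a.strip().lower()]

    exact_domains = {d for d, _ in exact}
    partial_domains = {d for d, _ in partial}

    def pct(part, whole):
        return 100.0 * part / whole if whole else 0.0

    return {
        "Exact Anchor Text Links": len(exact),
        "% Links w/ Exact Anchor Text": pct(len(exact), total_links),
        "Linking Root Domains w/ Exact Anchor Text": len(exact_domains),
        "% Linking Root Domains w/ Exact Anchor Text": pct(len(exact_domains), total_domains),
        "Partial Anchor Text Links": len(partial),
        "% Links w/ Partial Anchor Text": pct(len(partial), total_links),
        "Partial Anchor Text Root Doms.": len(partial_domains),
        "% Linking Root Domains w/ Partial Anchor Text": pct(len(partial_domains), total_domains),
    }

# Example: three links from two root domains, target keyword "blue widgets".
links = [
    ("example.com", "blue widgets"),        # exact (also counts as partial here)
    ("example.com", "cheap blue widgets"),  # partial only
    ("another.org", "click here"),          # neither
]
print(anchor_text_metrics(links, "blue widgets"))
```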
I hope I was of help
Tom
Related Questions
-
Solving URL Too Long Issues
Moz.com is reporting that many URLs are too long. This particularly affects product URLs, where the URL is typically https://www.domainname.com/collections/category-name/products/product-name (you guessed it, we're using Shopify). However, we use canonicals that ignore most of the URL and are just structured https://www.domainname.com/product-name, so Google should be reading the canonical and not the long-winded version. Moz, however, cannot seem to spot this... does anyone else have this problem, and how do you solve it so that we can satisfy the Moz.com crawl engine?
Moz Pro | | Tildenet0 -
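Not an official fix for the Moz flag above, but if you want to confirm that the canonicals on the long Shopify URLs are actually in place (and the warning is therefore safe to ignore), a quick spot-check along these lines is one option. It assumes the third-party requests and BeautifulSoup libraries, and the URL is the placeholder from the question:

```python
# Sketch: fetch a long product URL and print its rel="canonical" target, to
# confirm that Google-facing canonicalization is in place even though the
# Moz crawler flags the URL as too long. The URL below is a placeholder.
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

long_url = "https://www.domainname.com/collections/category-name/products/product-name"
print(long_url, "->", canonical_of(long_url))
```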
Magento creating odd URLs, no idea why. GWT reporting 404 errors
Hi Mozzers! Problem 1: GWT and Moz are both reporting approximately one hundred 404 errors for certain URLs. Examples are shown below. We have no idea why or how these URLs are being created in Magento; any hypothesis on the matter would be appreciated. The domain name in question is http://www.artorca.com/. These are valid URLs if /privacy is removed. The first URL is for a product, the second for an artist profile, and the third for a CMS page:
1. semi-abstract-landscape/privacy
2. jose-de-la-barra/privacy
3. seller-guide/privacy
What may be the source of these URLs? What solution should we implement to fix the existing 404s? Would 301 redirects be fine? Problem 2: Website pages also seem to be accessible with index.php in the domain name. Example: Artorca.com/index.php/URL. Will this cause a duplicate content issue? Should we implement 301s, canonicals, or just leave it as is? Cheers! MozAddict
Moz Pro | | MozAddict0 -
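One possible way to deal with the existing 404s above, assuming the right fix is indeed a 301 from each stray /privacy URL back to its parent page, is to build a redirect map from the crawl-error export and feed it into Magento's URL rewrites or the web server config. A rough sketch, using the three example paths from the question:

```python
# Sketch: turn stray "/privacy"-suffixed paths (e.g. exported from GWT or a
# Moz crawl) into 301 source -> target pairs pointing at the parent URL.
# The base domain and output format are illustrative.
BASE = "http://www.artorca.com/"

def redirect_pairs(paths):
    for path in paths:
        path = path.strip().strip("/")
        if path.endswith("/privacy"):
            yield BASE + path, BASE + path[: -len("/privacy")]

broken_paths = [
    "semi-abstract-landscape/privacy",  # product
    "jose-de-la-barra/privacy",         # artist profile
    "seller-guide/privacy",             # CMS page
]

for source, target in redirect_pairs(broken_paths):
    print(f"301: {source} -> {target}")
```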
Getting a "URL unaccessible" error on the page grader
I'm optimizing a site for a financial advisor; here is the site: http://www.mattkeenancfp.com I am getting the message "that URL is unaccessible" when I try to use the on-page grader. This is an emerald website too; I'm not sure if that has any effect on anything, though.
Moz Pro | | ryanbilak0 -
Why does it keep displaying br tags and claiming 404 errors on about 4 of my URLs for all my WordPress sites?
Is anyone else having the same issue? These errors don't actually exist, and I think it has something to do with WordPress. How can I fix this?
Moz Pro | | MillerPR0 -
DotNetNuke generating long URLs showing up as crawl errors!
Since early July, a DotNetNuke site has been generating long URLs that are showing up in campaigns as crawl errors: long URL, duplicate content, duplicate page title. URL: http://www.wakefieldpetvet.com/Home/tabid/223/ctl/SendPassword/Default.aspx?returnurl=%2F Is this a problem with DNN or a nuance to be ignored? Can it be controlled? Google Webmaster Tools shows no crawl errors like this.
Moz Pro | | EricSchmidt0 -
Tool for tracking actions taken on problem URLs
I am looking for tool suggestions that assist in keeping track of problem URLs and the actions taken on them, and that help deal with tracking and testing a large number of errors gathered from many sources. What I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and input them into a type of centralized DB that lets me see all of the actions that need to be taken on each URL, while at the same time removing duplicates, as each tool finds a significant number of the same issues.
Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found. I would also like to see historical information on the URL, such as whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed. Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow with updating their issues summary, I would like to not import duplicate issues (so the tool should recognize that the URL is already in the DB and that it has been resolved).
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Bonus bonus for any tool that is smart enough to check issues and mark them as resolved as they come in (for instance, if a URL has a 403 error, it would check on import whether it still resolves as a 403; if it did, it would add it to the issue queue, and if not, it would be marked as fixed).
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs, their problems, and the duplicates created from using multiple tools? Thanks!
Moz Pro | | prima-2535090 -
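I'm not aware of an off-the-shelf tool that does all of this, but the core of what the question above describes, merging exports from several tools, deduplicating by URL plus issue, and optionally re-checking status-code issues on import, is small enough to sketch. The field names, sample data, and use of the requests library are illustrative:

```python
# Sketch: issues from multiple tools are keyed by (url, issue) so the same
# problem reported by SEOmoz, Google WM, Bing WM, or Screaming Frog collapses
# into a single record that also remembers which tools reported it. Status-code
# issues (e.g. "403") can optionally be re-tested on import so pages that are
# already fixed get marked resolved instead of re-entering the queue.
import requests

def still_broken(url, issue):
    """For issues named by an HTTP status code, return True if it still reproduces."""
    code = issue[:3]
    if not code.isdigit():
        return True                     # non-status issues are not re-tested here
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        return True                     # unreachable: keep the issue open
    return str(status) == code

def import_report(db, report, verify=False):
    """report: iterable of (url, issue, source_tool) rows from one export."""
    for url, issue, source in report:
        # "actions" is a placeholder for the fix history the question asks about.
        record = db.setdefault((url, issue), {"sources": set(), "actions": [], "resolved": False})
        record["sources"].add(source)   # deduplicates across tools automatically
        if verify and not still_broken(url, issue):
            record["resolved"] = True

db = {}
import_report(db, [
    ("http://example.com/url1", "duplicate title", "SEOmoz"),
    ("http://example.com/url1", "duplicate title", "Google WM"),
    ("http://example.com/url1", "broken outbound link", "Screaming Frog"),
])

for (url, issue), record in db.items():
    state = "resolved" if record["resolved"] else "open"
    print(f"{url} | {issue} | {state} | reported by: {', '.join(sorted(record['sources']))}")
```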
Dead links/URLs
What is the quickest way to get Google to clean up dead links? I have 74,000 dead links reported back. I added a robots.txt disallow and added them to Google's removal list in my Webmaster Tools 4 months ago. The same dead links also show in Open Site Explorer. Thanks
Moz Pro | | 1step2heaven0 -
Impact of 301-redirected domains on OSE Metrics
When looking at the OSE metrics (DA, PA, number of linking RDs, etc.), are they based purely on OSE's evaluation of the specific domain, or do they take into account links that have been 301-redirected to the domain?
Moz Pro | | bjalc20110