Hi Alan, it's an odd response. The traffic numbers are pulled from Google Analytics. My belief was that GA is capitalization agnostic - meaning it doesn't matter if the keyword is capitalized or not. Let's see if we can get an SEOmoz engineer to weigh in.
Posts made by Cyrus-Shepard
-
RE: Capitalisation of campaign keywords - why does this affect traffic but not rankings?
-
RE: Over 90% of anchor text tends to be brand-name on OSE link profiles. Why?
Manism is one word and will most likely be interpreted as such by the search engines. Since it is thematically related to "man," it might still pass some benefit if contained in anchor text.
Whereas "askmen" is interpreted as 2 separate words. This might pass more value, but the effect is most likely minimal as Google becomes more sophisticated.
-
RE: Over 90% of anchor text tends to be brand-name on OSE link profiles. Why?
1. Ideally, but that's not always the case with smaller sites, local sites, and folks who can't get their own branded domain. But you are correct: there's a great deal of difference between a keyword-rich domain and a non-keyword domain.
Unfortunately, there's no "correct" answer to the ideal ratio, although 90% branded seems incredibly high to me. In fact, I wish fewer of SEOmoz's inbound links were branded, as we could benefit from more variety.
Here's a video I made on anchor text, along with some related resources that address the topic:
-
RE: Over 90% of anchor text tends to be brand-name on OSE link profiles. Why?
Hi Zachary,
I apologize because I misunderstood your question the first time.
Going back to your original question....
often I explore these links and find that the pages include both a brand-name link AND a regular keyword link, but for some reason OSE was only reporting the brand-name link...
OSE will report the first link found in a page's HTML, and only the first link. This is consistent with testing done on how Google counts anchor text. (If you find something different, it would indeed be a bug.)
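If you want to see that rule in action, here's a minimal sketch of first-link extraction using only the Python standard library (the sample HTML is made up):

```python
from html.parser import HTMLParser

class FirstLinkParser(HTMLParser):
    """Record only the first anchor in the HTML, ignoring any that follow."""
    def __init__(self):
        super().__init__()
        self.first_href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a" and self.first_href is None:
            self.first_href = dict(attrs).get("href")

html = '<a href="/brandname">BrandName</a> then <a href="/keyword">yellow shoes</a>'
parser = FirstLinkParser()
parser.feed(html)
print(parser.first_href)  # /brandname -- the keyword link never gets counted
```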
As for the prevalence of branded anchor text, I think there are a couple of factors at work here:
- The domains are exact matches for their brands, which accounts for the high proportion of branded anchors. See the stats for SEOmoz
- Large online publishers like the examples you cited tend to belong to publishing groups that own a large network of sites. The sites tend to interlink to each other using the same anchors.
Having a keyword rich brand name can really help you in the ranking department. So can developing search volume for your branded terms.
In a vacuum, I'm not sure branded anchors pass greater or lesser value than others. In the end, you want anchors closely associated with the keyword topics you are trying to rank for. If these are your brand keywords, all the better.
-
RE: Campaigns - domains / sub domains
You can set up each subdomain on a campaign, or you can set them all up on a root domain campaign. If you choose to set up subdomain campaigns, each will be crawled up to 10,000 pages.
Remember, each campaign will use a slot. For most PRO members, you are limited to 5 campaigns.
Setup is simple. You can choose either root domain or subdomain on the setup screen, and enter your appropriate URL. More information in the help documentation here.
Hope this helps!
-
RE: On ranking reports
Really great question that boils down to economics, scale and accuracy more than anything else.
The good news is that independent testing has shown that Moz keyword tracking, especially in the PRO campaigns (which use slightly more sophisticated tech than the stand-alone Rank Tracker), provides a greater degree of accuracy than most other providers.
The biggest hurdle to daily tracking is processing power. When you track as many keywords as SEOmoz does, across search engines around the globe, it requires more processing power than most companies can muster. At scale, this causes multiple challenges, including cost and server resources.
Tracking also means checking with Google, so Moz has to be careful not to violate Google's TOS (terms of service) by making large volumes of requests. Other companies play fast and loose with Google's rules - and that's something Moz won't risk, as the data is too valuable to our customers.
And because keywords naturally fluctuate daily, we've found that tracking keywords once per week provides more actionable metrics for most folks.
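To illustrate with made-up numbers, here's a quick sketch of how a week of bouncing daily positions condenses into one more stable weekly figure:

```python
from statistics import median

# Hypothetical daily positions for one keyword over a week.
daily_positions = [8, 11, 7, 12, 9, 8, 10]

print(f"Daily swing: positions {min(daily_positions)} to {max(daily_positions)}")
print(f"Weekly median: {median(daily_positions)}")  # 9 -- one actionable number
```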
-
RE: Over 90% of anchor text tends to be brand-name on OSE link profiles. Why?
In addition to Robert's answer, I'd like to add that although OSE doesn't have a built-in bias towards "name-brand" links, it does favor links with high Page and Domain Authority, and URLs that have good inbound link profiles themselves, because these are the most useful to crawl and index.
Of course, OSE doesn't report all links, especially those buried beneath layers of navigation and those on pages with very few inbound links themselves.
Part of the issue may be temporary as well. Each index changes from the last. The most recent index was an improvement in that it reported far more domains, but significantly fewer URLs. This is a trend we may see reversed in future updates.
-
RE: Why is the SEOmoz crawler crawling the old version of our website?
The only way that SEOmoz will crawl your old URLs is if that is what your web server is delivering to the SEOmoz crawlers. In this case, you might want to make sure all your redirects are in place.
I did a quick check of the URL listed in your public profile using the tool URI Valet, and found at least 4 redirects from the homepage, including a 302, which passes no link juice. This is definitely something you want to look into.
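You can also trace a redirect chain yourself. Here's a minimal sketch using the third-party "requests" package (the URL is a placeholder):

```python
import requests

resp = requests.get("http://www.example.com/", timeout=10)  # follows redirects
for hop in resp.history:
    print(hop.status_code, hop.url)  # watch for 302s, which pass no link juice
print(resp.status_code, resp.url)    # the final destination
```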
Generally the SEOmoz robots will crawl your site once a week, so 99% of the time you will see new information in your campaign within 7-10 days. If you suspect an error, be sure to contact the help team (help@seomoz.org).
Also, here's my favorite article on site migration. Although you didn't migrate a complete site, many of the principles still apply.
https://seogadget.co.uk/surviving-seo-site-migration/
Best of luck with your SEO!
-
RE: Where does this keyword come from?
Hi Denis,
If I understand correctly, you have a PRO campaign that is grading a URL/Keyword combination that you didn't intend.
The On-Page tool automatically generates reports for any of your tracked keywords that rank in the top 50, and it grades them against the URL that is ranking. So generally, it can only get these keyword phrases from your tracked keywords.
Sometimes these are not the keyword/URL combinations you want to grade. Fortunately, there's an easy way to grade any keyword you want. Here are some instructions from the Help Guide I wrote...
For example, your "Contact Us" page ranks #5 for the keyword phrase "Yellow Shoes" and the Web App grades it a B. What if you really want to target your homepage for this keyword phrase? Here's how:
1. Hit **Report Card** at the top of the On-page summary.
2. Choose the keyword you want to grade. The keyword must already be included in your campaign. Select Manage Keywords if you need to add keywords.
3. Enter the URL of the page you would like to grade.
4. Hit Grade My On-Page Optimization to generate your report.
Hope this helps!
-
RE: Grading issues on my weekly report
Hi Diane,
A few possibilities here.
1. Small changes can have a big impact. Even changing a single paragraph can cause your grade to go from an A to an F, or vice versa.
Fortunately, the complete report will tell you exactly how it scored your page. If you drill down into the report, it will even tell you what areas you need to fix.
You can see the full drill down by clicking on the keyword in the report overview. Here's a screenshot.
2. Check to make sure it's the same URL that is ranking. The On-Page tool generates a report for the URL that is ranking in the current week, which can change from time to time. If a different URL is ranking this week than last, the On-Page grader will grade the new URL, which might earn a completely different grade.
3. Unfortunately, without knowing the URL or keyword it's hard to tell you exactly what's going on. Feel free to ask a private question, or ping the help team if you want further help and you are not comfortable sharing this information in public.
Good luck!
-
RE: Why is Followed Linking Root Domains higher than External Followed Links?
There are some strange outlier instances, but in this case a domain that links to itself counts as a linking domain. (Since it's entirely possible for a domain not to link to itself, we have to count it when it does.)
So, a domain that links only to itself, with no other links, would have 1 linking root domain and 0 external links.
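Here's a minimal sketch of that counting rule with a hypothetical one-link dataset:

```python
# Each link is (source domain, target domain); this single self-link mirrors
# the example above.
links = [("example.com", "example.com")]
target = "example.com"

linking_root_domains = {src for src, dst in links if dst == target}
external_links = [l for l in links if l[1] == target and l[0] != target]

print(len(linking_root_domains))  # 1 -- the self-link counts as a linking domain
print(len(external_links))        # 0 -- but not as an external link
```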
-
RE: Y!SE and OSE
The truth is, you can trust them all, but every index will always report different numbers.
For example, it was estimated that Google's index was roughly twice the size of Yahoo's.
The web is full of links. Billions and billions of them. The problem is that 80% of them are junk. Indexes maintained by companies like Google, Bing, Yahoo, and even SEOmoz and Majestic can't afford to waste resources crawling, processing, and storing all of those junk links. So the question isn't so much what you keep; what's more important is what you throw away.
Yahoo tended to report a lot of links. It was kind of fun because they reported many links right away, even if they weren't important and didn't help you to rank.
Linkscape (the name of SEOmoz's index) tries very hard to produce a quality listing of links - links that actually have an effect on crawling, indexing and ranking. The huge advantage of Linkscape over Yahoo is that OSE will actually tell you the value of a link through metrics like Domain Authority, Page Authority and MozTrust. You couldn't get these metrics from Yahoo.
Our friends at Majestic wrote a post that we featured on the SEOmoz blog highlighting this difficulty.
http://www.seomoz.org/blog/why-counting-links-is-not-so-easy
No index will report every link, but OSE will represent the majority of links that matter, and provide actionable metrics to go along with them.
-
RE: Link Diagnosis and Open Site Explorer
SEOmoz provides an API of its Linkscape data that other companies, and even individuals, can use for their own purposes. Oftentimes companies buy this data and use it in their own reports. Sometimes they use all of the data, sometimes just a portion of it.
Unfortunately, there's no good way for me to tell how Link Diagnosis uses the SEOmoz data. To save money and resources, some companies "cache" the data, which works fine as long as they update the cache regularly. I'm not saying this to disparage Link Diagnosis; I simply don't know enough about them.
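For illustration, here's a minimal sketch of the kind of time-to-live caching a third party might use around an API call. The fetch function and TTL are hypothetical, and this is not a claim about how Link Diagnosis actually works:

```python
import time

CACHE = {}          # url -> (timestamp, data)
TTL = 24 * 60 * 60  # hypothetical: refresh daily

def cached_link_data(url, fetch):
    """Return cached data for url, calling fetch(url) only when the entry is stale."""
    now = time.time()
    if url in CACHE and now - CACHE[url][0] < TTL:
        return CACHE[url][1]      # still fresh: serve the cached copy
    data = fetch(url)             # stale or missing: hit the (hypothetical) API
    CACHE[url] = (now, data)
    return data
```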
That said, you can be assured that Open Site Explorer always has the latest, freshest Linkscape data. Every time you use OSE, fresh data is served with no long-term caching, which makes it a very reliable tool for webmasters.
-
RE: 302 (Temporary Redirect) up and growing, how to fix?
Hi Éber,
In my opinion, it's totally legitimate to nofollow those links to your login page, assuming they are showing the same 302 redirect and login page to Google. This will cut down on excessive crawling and possibly help you out with your Google Crawl "allowance."
If, on the other hand, you wanted to direct some link juice towards your login page, you could turn the 302 into a 301.
There might be other solutions, such as blocking the page with robots.txt or using clever JavaScript. That said, nofollow is probably the easiest.
-
RE: Open Site Explorer (last 1-3 updates) shows new and weird results
Hi Frederik,
Domain Authority is highly correlated with both mozRank and mozTrust, and both of these metrics factor into the final Domain Authority score. Most of the time, a site with both superior Domain mozRank and superior mozTrust will also have a higher Domain Authority, but not all the time.
In this instance, I think we are seeing an edge case where Linkscape was reporting funny numbers. In the time since you posted your original question, not only has Linkscape been updated, but the Domain Authority calculation has also been updated for the first time in a couple of years. The old Linkscape calculations were accidentally including certain types of binary and junk files. If Duft og Natur had a large number of these files in the index, it could certainly have thrown the numbers off.
The end result is more accuracy today.
In fact, it looks like Netspiren now scores a DA of 45, whereas Duft og Natur scores a 40. Nice!
-
RE: Block all but one URL in a directory using robots.txt?
Robots.txt files are sequential, which means they follow directives in the order they appear. So if two directives conflict, they will follow the last one.
So the simple way to do this is to disallow all files first, then allow the directory you want next. It would look something like this:
User-agent: *
Disallow: /

User-agent: *
Allow: /test

Caveat: This is NOT the way robots.txt is supposed to work. By design, robots.txt is meant for disallowing, and technically you shouldn't ever have to use it for allowing. That said, this should work pretty well.
You can check your work in Google Webmaster Tools, which has a robots.txt checker under Site Configuration > Crawler Access. Just type in your proposed robots.txt, then a test URL, and you should be good to go.
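You can also sanity-check the rules locally with Python's built-in robots.txt parser. One caveat worth knowing: Python's parser applies the first matching rule, so the Allow line goes first in this sketch, while Googlebot applies the most specific rule, so either order works for it:

```python
from urllib import robotparser

rules = [
    "User-agent: *",
    "Allow: /test",   # first, so Python's first-match parser honors it
    "Disallow: /",
]
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://www.example.com/test/page.html"))  # True
print(rp.can_fetch("*", "http://www.example.com/other-page.html")) # False
```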
Hope this helps!
-
RE: Why "title missing or empty" when title tag exists?
Hi Loren,
I took a peek at your website, and checked some things behind the scenes using my super-awesome administrative powers here at SEOmoz. It looks like one of two things happened.
- Rogerbot encountered an error when crawling your site
- Your site had trouble with rogerbot.
In either case, you probably want to contact the help team (help@seomoz.org), especially if the problem persists in the next crawl report.
On another note..
Those extra-long title tags might cause some crawlers a little confusion. I'm not saying they're bad for you, but I doubt they are helping you much from a search engine point of view. I'd say with near certainty that Google is not indexing the entirety of your title tags. Paginated lists like this are tough to get indexed properly. If folks are actually searching for these obscure part numbers, perhaps this is the only way to scale it. That said, I would encourage you to experiment.
-
RE: Ranking Tracking Internal Folders
Hi Ilse,
If you are a PRO member, there's a super simple way to do this within the campaign platform.
1. Start a new campaign, and at the very beginning you'll see a screen like this:
Which parts of your domain would you like to track for this campaign?
- Subdomain: Track one subdomain. Examples: www.seomoz.org, guides.seomoz.org, pro.seomoz.org
- Root Domain: Track at the root domain level, including all the different subdomains within this root domain. Example: the root domain seomoz.org has www.seomoz.org, guides.seomoz.org, and pro.seomoz.org all as subdomains within the root domain. If we discover pages on any of the subdomains during our crawl, they'll be included in the data we display.
- Subfolder: Track a specific subfolder. Note that the subfolder path must resolve or we will not be able to crawl the content of that folder. Examples: www.seomoz.org/tools, guides.seomoz.org/beginners-guide-to-search-engine-optimization
2. Make sure to pick Subfolder, the last choice. Then enter the specific folder path for your campaign, like www.domain.com/za
This will track rankings for this specific subfolder only. Easy as pie!
-
RE: Campaigns: How to Decide What to Track?
Hi Suzanne,
The way most SEOs work is to move from low-hanging fruit to more challenging opportunities.
1. For most new users of SEOmoz who run their first PRO campaign, this means addressing the site audit in your crawl report, starting with the errors first.
You said you have a large site with 1000s of pages, so it's likely the crawl found 100s or even 1000s of errors on your site. This can make it a little overwhelming to know where to start.
Take advantage of the Help Guides. Here's the guide for the Crawl Diagnostics tab. Read it top to bottom. If you don't know what something means, this is an excellent time to start researching. If you are faced with an overwhelming number of errors, there are a couple of things you can do to prioritize.
- Sort the URLs of the errors by Page Authority. This will help you address your most important pages first.
- Export your crawl results to CSV. Once they are in spreadsheet form, start to look for patterns (see the short sketch after this list). Multiple errors are often caused by the same problem, so making a small fix can sometimes lead to big results.
2. Add more keywords to your campaign. You can get these from Google Analytics; this Help Guide will walk you through it. Read articles on keyword research and find out what keywords you would like to rank higher for. Start to devise a plan.
3. Have you connected a Google Analytics account? This is extremely helpful. Yep, the Help Guides assist with that, too.
4. Check your On-Page reports. These are generated for any keyword/URL ranking in the top 50. Again, can you spot any patterns? Any changes you can make across your site to improve the keyword relevance of your pages? Try improving a handful of pages that already rank well for high volume keywords and see what happens. Experiment.
5. Analyze your links. Use Open Site Explorer's Top Pages tab to make sure links to your site aren't being lost to broken URLs. Find out where your best links are coming from and figure out how to earn more like them. Look at your incoming anchor text. Does it mirror the keywords you are ranking for?
(Really, this is as much a learning process as anything, but each step takes you further.)
Examine the links of your competitors. Where did those links come from, and can you earn those same links? Where are your competitors stronger than you? How can you catch up? The Keyword Difficulty tool is great for this, too.
6. Install the MozBar, if you haven't already. Start surfing everywhere with it. Use the tools to analyze pages and figure out why they rank where they do.
7. Even though your site has been around for years, there may have been some things overlooked. So pretend it's brand new. Watch this video from Rand about launching a site.
8. When you get stuck, browse old blog posts, search Google, or ask a question here. The web is full of SEOs willing to help.
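As promised in step 1, here's a small sketch of the export-and-find-patterns idea: tally crawl errors by type so the biggest shared problems surface first. The file and column names are hypothetical; match them to your actual CSV export:

```python
import csv
from collections import Counter

error_counts = Counter()
with open("crawl_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        error_counts[row["Error Type"]] += 1  # hypothetical column name

# Most common errors first; one template-level fix often clears a whole bucket.
for error, count in error_counts.most_common():
    print(f"{count:5d}  {error}")
```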
Hope this helps. Best of luck with your SEO!
-
RE: Http and Https Update
Hi Matthew,
You definitely want your pages to resolve to one version or another, either http or https. Don't leave it for the search engines to sort it out.
For instance, take a look at PayPal, which redirects every single URL to https.
Google and all major search crawlers can now handle https with ease, but if you place your content on two different URLs, it can count as duplicate content.
If the https pages actually redirect (via 301) to http, there is no issue of cloaking.
Does that help?
-
RE: Set up of SEOMoz campaign for specific country within a global website
Additionally, don't forget to set the correct search engines. Do you want to track your progress using Google US, or Google UK?
You can change your search engines under the "campaign settings" tab underneath the Campaign Overview. You can select from over 150 search engines operated by Google, Bing, and Yahoo around the world.
-
RE: Traffic falling
First of all, you want to check your Google Analytics account directly. The SEOmoz PRO platform only tracks organic (non-paid) traffic, and you also want to make sure your PRO campaign is attached to the right profile.
You can double check your rankings for your top referring keywords (from Google Analytics). Make sure they haven't fallen. If they haven't, then it's unlikely Google is penalizing you for anything.
Sometimes it's a matter of seasonal traffic fluctuations. This especially happens around the holidays. You can check Google Insights to see if your major keywords are seasonal.
Also check your PRO campaign for any increase in errors, to make sure all your content can be indexed properly.
Finally, sometimes your rankings, and your traffic, actually do drop. Freshen up your content, build good links and you should be on the road to recovery.
-
RE: What effect do previous page visits have on SERPs?
Really interesting question about user engagement metrics. We don't have a clear answer, but we've received hints from the engines that they track this sort of thing through toolbars, logged-in searches, and other methods.
Bill Slawski recently wrote a post on a Google patent that would adjust rankings based on exactly this type of behavior. Quoting directly from his article, the patent described user signals such as:
- The percentage of searches in which the user selected the first result (or one of the top results) in the list of search results
- The average first click position (i.e., the numerical position within the list of results)
- The percentage of searches that had long clicks (i.e., the percentage of times that a user selects a link to go to a result page and stays on that page for a long time, such as more than 3 minutes)
- The percentage of searches that did not have another search within a short period of time
- The percentage of searches that did not have a reformulated search (i.e., a search where one or more search terms in the original search are added, deleted, or changed) within a short period of time
- A combination of different metrics, and/or the like
So a single click, in the ocean of web results, probably will never make a large difference. Or even multiple visits by a single user.
But if you have a large number of users click on a result and then click the back button to choose another search result, this is almost certainly going to have an impact on rankings. Or, another example: if a large number of searchers consistently choose the URL in the 3rd position of a given SERP, and stay on that site, that particular domain might have a good chance of rising.
This past April, Google announced "we are beginning to incorporate data about the sites that users block into our algorithms." Again, a single person blocking a site from their search results probably isn't going to have an impact, but a large number of such actions probably will.
Engagement metrics are becoming increasingly important, but all indications are they must be taken in aggregate.
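To make the "in aggregate" point concrete, here's a toy sketch computing two of the patent-style signals from a handful of hypothetical click records:

```python
# Each record is (position clicked, seconds on the result before returning).
clicks = [(1, 240), (1, 15), (3, 400), (1, 380), (2, 10)]

long_clicks = sum(1 for _, dwell in clicks if dwell > 180)   # patent's ~3 minutes
avg_position = sum(pos for pos, _ in clicks) / len(clicks)

print(f"Long-click rate: {long_clicks / len(clicks):.0%}")  # 60%
print(f"Average first click position: {avg_position:.1f}")  # 1.6
```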
-
RE: Can Open Site Explorer Do This?
I've endorsed Ryan's answer but just a quick note about PageRank.
We encourage folks not to use PageRank as an indicator of ranking strength, as "PageRank has a relatively low correlation to how well things actually rank." - Rand Fishkin
PageRank is good for a couple of things, such as a raw indication of link popularity, or as Ryan mentioned to determine a penalty.
PageRank data is historically inaccurate, Google doesn't update it regularly enough for it to be useful, and Google has politely asked us not to scrape PageRank data - a request SEOmoz honors. Those are a few of the reasons we don't include it in the MozBar (SEO Toolbar) and other places.
Rand discusses the subject in depth in this WBF
http://www.seomoz.org/blog/what-is-googles-pagerank-good-for-whiteboard-friday
-
RE: Open Site Explorer - not accurate for me
Hi Paul,
Linkscape is updated every few weeks, and with each new index old links are cleaned out. Occasionally you can find links that no longer exist, but Linkscape will look for these links during its next crawl of the web, and only include them if found. So any non-existent links tend to disappear within a few weeks.
This is actually a very interesting problem for web crawlers, because 50% of the entire web disappears every year, and 80% disappears every two years! Keeping up with this churn is a monumental task. In fact, over half the time it takes to publish a new Linkscape index is spent on processing to ensure its accuracy.
As Ari mentioned, you can try Majestic SEO. You can also find some (highly inconsistent) backlink information in Webmaster Tools. No two link reports from different sources will ever be the same, so you will definitely see different results. That said, if I had to choose one source for backlink information, I prefer the numerous advantages offered by OSE.
Occasionally OSE reports links that are "hidden" or otherwise hard to find. In extreme cases it's best to check the source code of the website to ensure the links are actually missing.
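Here's a quick way to do that source-code check without hunting by eye. This sketch uses the third-party "requests" package, and both URLs are placeholders:

```python
import requests

# Fetch the linking page and print every source line mentioning your domain.
page = requests.get("http://www.example.com/linking-page/", timeout=10).text
hits = [line.strip() for line in page.splitlines() if "yoursite.com" in line]
print("\n".join(hits) or "yoursite.com does not appear in the HTML source")
```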
Also, feel free to report any possible OSE errors to the SEOmoz help team (help@seomoz.org). Any information can help make the tool better.
Best of Luck with your SEO!
-
RE: Is SEOMoz better than Traffic Travis ?
I can't speak much about the qualities of Traffic Travis, in part because I know of no serious SEO who uses it. This isn't a judgment of the software, but has more to do with experience level. SEOmoz is geared toward intermediate and advanced SEOs (and beginners who are willing to learn - all the resources are here), whereas Traffic Travis seems more aimed at beginner SEOs or those unable to invest as much time and/or resources into the process. (Just guessing here.)
The last thing I want to do is "sell" SEOmoz or disparage a competitor, but there are a lot of things that SEOmoz offers that I don't think can be matched by a one-time desktop software purchase. These include:
- SEOmoz Metrics - the gold standard here. Moz metrics are used (and resold) by more companies than any other SEO metrics on the market, and metrics like Domain Authority correlate with rankings more closely than anything else. An SEOmoz membership gives you full access to Open Site Explorer and complete backlink analysis - the same data that other companies buy directly from SEOmoz.
- Weekly crawls and reporting by rogerbot, the SEOmoz crawler. This is one of the most complete and easy-to-use SEO site audits around.
- Learning resources. This includes monthly webinars, full access to the PRO Q&A and more.
- Special resources like full access to the PRO-only Directory List
- A full suite of SEO tools, including complete access to the MozBar.
- And a kick-ass community.
Over 50% of SEOmoz members stay 18 months or longer, and many have stayed for years.
At the end of the day, I always encourage folks to try the Free Trial (now it sounds like I'm selling, dammit). But it's a no-risk affair, and you'll probably learn more in one week than you have in the past few months.
-
RE: How come the external on-page report tool is better than the campaign one?
Hi Steve,
Thanks for the question, but I think I'm missing something. What features are you referring to? Both the External and Campaign-based On-Page tools grade:
- 4 Critical Factors
- 7 High Importance Factors
- 9 Moderate Factors
- 11 Low Importance Factors
- 5 Optional Factors
I thought they were exactly the same, but I'm only a casual user of this tool, so I honestly might be missing some features here. Please let me know!