Agreed - that must be a mistake on our end. Can you send an email to help@seomoz.org with the URLs in question (just a sample of 4 or 5 would help) and we'll figure out the problem. We shouldn't be counting those for any warnings at all!
Posts made by randfish
-
RE: How does SEOmoz work with noindex meta tags?
-
RE: Correlation between PageRank and MozRank
Bing's linkfromdomain: command might be useful just to get a sense of what the site links to generally. But, I'd probably only do that if you have a sense that you should be worried (e.g. there's weird sitewides or they run a spammy reciprocal links directory or sell links somewhere, etc).
-
RE: Correlation between PageRank and MozRank
Linking Root Domains is pretty good as a raw metric. I actually did a whole whiteboard Friday on uses for link metrics specifically for link analysis - http://www.seomoz.org/blog/which-link-metrics-should-i-use-part-2-of-2-whiteboard-friday
Honestly, in addition to Moz metrics, I'd be looking at where and whether the site/page links out (if they link to crappy spots, it could be trouble) and whether the page ranks well for the keywords it targets (pages that don't rank in the top 5 for obvious title match KW searches might be trouble).
-
RE: Correlation between PageRank and MozRank
We've found this to be very close as well! I believe the correlation coefficient was around 0.85 last time we ran it.
In terms of the value of mozRank - our intent was always to mimic Google's PageRank algo, not to build a metric that best represents how a page/site might rank. You're totally spot on to say PA/DA are a good choice for that (and we have some tests running to make them even better in the near future).
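For anyone curious what that correlation coefficient means in practice, here's a minimal sketch of how a Pearson correlation is computed. The PageRank/mozRank pairs are invented for illustration; the real dataset isn't public.

```python
# Minimal Pearson correlation, the statistic behind a figure like ~0.85.
# The toolbar-PageRank / mozRank pairs below are made up for illustration.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

toolbar_pagerank = [2, 3, 3, 4, 5, 5, 6, 7]
mozrank          = [2.1, 2.9, 3.4, 4.2, 4.8, 5.5, 5.9, 7.2]
r = pearson(toolbar_pagerank, mozrank)  # close to 1 for this near-linear sample
```

A value near 1.0 means the two metrics move together almost perfectly; ~0.85 is strong but leaves meaningful disagreement between them.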
BTW - I should mention here, since it's relevant, that Linkscape's estimated index update of July 11 has been moved back to July 20th due to some processing errors on the Amazon cluster (and some mistakes on our end, too). Sorry for the delay, but on the plus side, we do expect it to be quite a high-quality index update (deeper crawls and better domain diversity, too).
-
RE: Why is there not even a close correlation between MajesticSEO data and Open Site Explorer?
We did a bunch of work on this a while back, so my stats are probably not up to date (I think around October of 2010). Basically, we compared Yahoo! Site Explorer numbers, Google numbers (via the crappy but somewhat proportional link: command), Exalead, SEOmoz, Majestic and Alexa.
Majestic was definitely odd and so was Alexa. Neither of those two mapped/correlated well to the quantities reported by the others. Thus, for example, if xyz.com has:
- 50 links according to Google
- 1,000 links according to Yahoo!
- 500 links according to SEOmoz (Linkscape/OSE)
And site abc.com has:
- 100 links according to Google
- 2,000 links according to Yahoo!
- 1,000 links according to SEOmoz (Linkscape/OSE)
The proportions match up fairly well across these sources, but not for Majestic (e.g. they might show 5,000 links for xyz.com and 3,500 for abc.com).
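The proportionality check described above can be sketched in a few lines, using the post's illustrative numbers. If two indices crawl the web consistently, the ratio between two sites' link counts should be similar regardless of index size.

```python
# Sketch of the proportionality check, using the post's example figures.
# The site names and counts are the illustrative numbers from the post.
counts = {
    "google":   {"xyz.com": 50,   "abc.com": 100},
    "yahoo":    {"xyz.com": 1000, "abc.com": 2000},
    "seomoz":   {"xyz.com": 500,  "abc.com": 1000},
    "majestic": {"xyz.com": 5000, "abc.com": 3500},
}

# Ratio of abc.com's count to xyz.com's count, per source.
ratios = {src: c["abc.com"] / c["xyz.com"] for src, c in counts.items()}
# google, yahoo and seomoz all agree on 2.0; majestic's 0.7 is the outlier.
```

The absolute counts differ wildly between indices, but the agreeing ratios are what make the outlier stand out.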
This is a bit odd, but we don't know exactly why. They crawl a ton more links than even what Google/Yahoo!/Bing reportedly do, which could be part of it, but my best guess is the canonicalization and freshness issues. Since MJ crawls the web all the time, and doesn't build "indices" every X time period (like Google/Yahoo!/Bing/Linkscape), but rather maintains a single consistent link index to which new sites/links are added, the data structures may be different.
Majestic also appears, at least to us, to do far less canonicalization, removal of unnecessary URL parameters, etc. Thus, if one site links from 50,000 pages due to weird session IDs in the URL, that might bias the crawl and link count, but standard canonicalization will normalize these.
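Here's a rough sketch of the kind of canonicalization being described: stripping session/tracking parameters so thousands of session-ID variants of one URL collapse to a single canonical form. The parameter names are my illustrative guesses, not Linkscape's actual rules.

```python
# Illustrative URL canonicalization: drop session-style query parameters
# and normalize case, so URL variants collapse to one canonical form.
# The SESSION_PARAMS list is a guess for illustration, not a real ruleset.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def canonicalize(url):
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((scheme.lower(), netloc.lower(), path or "/",
                       urlencode(kept), ""))

# Two session-ID variants of the same page collapse to one URL:
a = canonicalize("http://Example.com/page?sid=abc123&ref=1")
b = canonicalize("http://example.com/page?ref=1&SID=xyz789")
```

Without a step like this, a link counter would treat every session-ID variant as a distinct linking page, inflating the totals exactly as described above.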
I really don't mean to bash on a competitor - MJ's fresh index is awesome, their tools are good, and a ton of SEO folks find them useful, including myself. But, on the index matching and link count numbers, we definitely see this same weirdness that many other SEOs do.
Hope that helps!
-
RE: Is this Directory Guide by SEOmoz still accurate?
Just FYI (as an update), I met last week with some folks and spec'd out a project to replace the directory list. We plan to have an updated version ready to launch in the next 60 days. It will be WAY better, have some very cool interactive functionality, and feature three sources - web, social + local directories (all of which will have subcategories, too).
I think this replacement will be awesome and can last for years to come.
-
RE: Need help interpreting Ranking Factors.
Domain Authority should be a slightly better measurement of a domain's ability to get a page ranked (e.g. all other things being equal, a page on a DA 70 site should outrank that same page on a DA 60 site).
Domain mozRank is a rougher algorithm, based solely on the links pointing to the domain and how important those are (it's just PageRank on the domain link graph). DA actually takes DmR into account in its machine learning system.
-
RE: Need help interpreting Ranking Factors.
Hi Joe - a few items on our metrics:
- Domain mozRank basically measures PageRank at the domain link level (looking only at links between domains and pretending there are no individual pages).
- mozRank is a page-level metric (like PageRank) and just measures raw link juice.
- Domain Authority uses a mashup of every metric we've got (hundreds) plus thousands of derivatives of those metrics to create a score that best predicts ranking ability of a page/site for a random keyphrase (so it doesn't include things like anchor text or on-page, because it doesn't know what a page might want to rank for).
These scores can be quite different across different sites, and you can be higher in one and lower in another. I wouldn't stress too much about the comparisons, though. If you're losing in the rankings to a site with better metrics, they may partially help explain why, but the numbers themselves won't do anything for you, so I wouldn't worry about optimizing toward them.
Also - for every one of these metrics, more good links from high quality, trustworthy sources will help (and they'll likely help your Google rankings, too). Just make sure to give appropriate time - Linkscape's metrics update once per month with data from the prior month, so it could be 40-60 days before you see your link building efforts reflected entirely/accurately in the numbers.
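To make "PageRank on the domain link graph" concrete, here's a toy sketch with an invented three-domain web; the domains, links, and iteration count are all made up for illustration, not Linkscape's actual computation.

```python
# Toy sketch of PageRank run on a domain-level link graph (the Domain
# mozRank idea): page links are collapsed to domain-to-domain edges,
# then the standard PageRank update is iterated. All data is invented.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
damping = 0.85
ranks = {d: 1.0 / len(links) for d in links}

for _ in range(50):
    new = {}
    for d in links:
        # Sum of rank flowing in from every domain that links to d.
        inbound = sum(ranks[src] / len(outs)
                      for src, outs in links.items() if d in outs)
        new[d] = (1 - damping) / len(links) + damping * inbound
    ranks = new
# c.com ends up highest: it receives links from both other domains.
```

The same update on the page-level graph gives (page) mozRank; running it on collapsed domain edges is what makes the metric "domain-level."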
Cheers!
-
RE: Site just will not be reincluded in Google's Index
I hear you - I suspect content quality could go up quite a bit with creativity, some work on the design/layout, etc. Having "more unique content" than competitors is quite a bit different than having an amazing resource that every parent wants to share with their friends because it's so phenomenal.
Re: the domain name - sadly, that might mean you need to slog through every link you've acquired and get rid of it, just to earn the clean slate Google seems to be demanding.
Good luck David!
-
RE: Site just will not be reincluded in Google's Index
Hi David - there are only a few things it could be, since you've filed for re-inclusion and not gotten back in:
- On-site spam/manipulation
- Cloaking/redirect stuff
- Backlink spam
I think, like others who answered above, the third one is the most likely. This leaves you with two options - try to get all the manipulative links removed entirely (apparently, Google doesn't think as of your last re-consideration you've gone far enough) or redirect the site to a new domain and start over with SEO.
If I were in your position, I'd probably do the latter, just because even if I could clean everything up, it might take months or even years for Google to review and agree to lift those penalties.
One last thing - it's also possible that Google's keeping the site out of the index because they don't think there's enough unique value in the content. You could try making a more unique, useful site and see if that helps/works, too (I'd probably recommend this anyway for a future version).
-
RE: Wrong types of questions...
I have to admit that while it can be frustrating to see some of those very basic questions here (and around the web), it's also inspiring to see how many great people jump in to help point folks in the right direction.
Obviously, a company, particularly a venture-backed one like ours, is supposed to earn revenue, scale, etc. but the best and most exciting part for me is feeling like we really can make a difference in our mission to educate and provide the tools for smart, dedicated people to become talented marketers. I'm humbled by how much time and effort is put in by so many people here in Q+A, and hopeful that we really might be moving the needle toward making people across the web better at this challenging process.
In terms of the "evolution" you speak of, Steve, it's my sense that we've got some opportunity there, too. Linkscape + OSE (and competitors like Majestic) helped provide some metrics that shifted a bit of focus off of PageRank and made SEOs think more critically about how we evaluate pages and links. I want to believe we could do that with KPIs if we built the right kinds of reporting dashboards into our software and opened access to more people as well.
If you have suggestions, we're all ears (you can email me personally, too).
-
RE: Google shows the wrong domain for client's homepage
Fascinating... Thanks for the follow-up; it's good to hear, albeit troubling that it took Google so long.
-
RE: Google shows the wrong domain for client's homepage
You should be able to manually outrank them simply by building up links, trust signals, social data, etc. to the original domain. That said, I'd probably also post about this in the Webmaster Help forum at Google and maybe submit something through your Webmaster Tools reconsideration request system, too. Google has said these 302 hijacks are very rare nowadays, but that they do want to know about them.
-
RE: Why aren't DMOZ links showing up in Open Site Explorer?
Several reasons on this one.
#1 - DMOZ is big, and while we crawl a lot of it, we don't crawl the whole thing, so some deep categories may be excluded.
#2 - We update once per month, so if it's a new addition, it may take some time.
#3 - If DMOZ blocks bots or restricts the crawl (or there's a duplicate content or other issue), that could cause problems, too.
We have a larger index launching in July and hopefully will be going much deeper on sites like this, too.
-
RE: Why should your title and H1 tag be different?
I don't know... There's a surprising number of people who've reported hearing Matt say things. Yet, somehow, whenever there's video of him, he magically says next to nothing. I'd be skeptical at best.
-
RE: Why should your title and H1 tag be different?
Just want to point out that personally, I disagree with that assessment and haven't seen anything data-wise to suggest it's an issue. It's hard to believe that Google/Bing would want to penalize so many millions of sites that do this by default (news sites, Wordpress, Joomla, Drupal, etc. all have it in default settings either in base or plugins).
That said, Todd usually has good reasons for his recommendations, so would be interesting to probe more deeply.
-
RE: Why should your title and H1 tag be different?
Wow - surprisingly good topic for such a relatively basic part of SEO!
So... I think Todd Malicoat and I still disagree. He likes to have a different title + H1 and claims they're good for rankings and keyword diversity. I largely disagree based on user experience and the relative unimportance of H1s (you can see from our correlation analyses and our ranking models work that H1s appear to have virtually no advantage over just having keywords at the top of a page in large text).
My view is that when someone clicks on a search result listing, they expect to find the thing they've just clicked on. The title is what shows in the SERPs, but if the H1 is substantively different, they're getting what feels like a somewhat different page. That incongruous experience can result in high bounce rates and searcher dissatisfaction.
In addition, I'm not convinced there's a measurable benefit from differentiated titles vs. H1s. No search engine rep has given guidance on this (in fact, they've stayed conspicuously quiet over the years about whether the H1 does anything at all).
So - there you have it - a small controversy on a small point of on-page optimization. I think the best practice is to do what feels right (neither Todd nor I think the other's opinion will have a negative impact) and, if you're uncertain, test it out on different sets of pages.
My general view, though, is that there are far better uses of most SEOs' time than worrying about H1s.
-
RE: Is there anyway for redirected links to still provide SEO value?
Hi Spencer - I think there's some awkward phrasing combined with the challenge of parsing the true meaning/intent of your question on this one. I'll do my best to answer what I think you're asking.
A shortened link, by default, does not lose its ability to pass link juice, PageRank, trust metrics, anchor text signals or anything else an engine might associate with a link. If it did, all these years, our TinyURL links (which existed long before any social stuff) and all those 301 redirects (which are essentially how shortened URLs function) would have failed. Clearly, they didn't, nor do bit.ly, j.mp, t.co, etc. type links today.
If you're asking if, by placing a shortened URL on a normal webpage and linking to it, the target of the 301 redirect loses out compared to a direct link, the answer is no. If you're asking whether nofollowed links in Twitter tweets or profiles that contain shortened URLs (or that exist elsewhere in the social web and may not be followed or even crawlable by engines) lose value, the answer is "it depends," but also "probably."
All that said, at one point in time, a Google representative did note that 301 redirects and rel=canonical tags lose a small amount of the PageRank they pass to another page compared to a non-redirect/canonical link. We're of the strong opinion this is between 1% and 10% of the PageRank value, though we also suspect that other link signals, many of which are often more important than PageRank nowadays, are unaffected. This is my opinion only, and we can't know for sure whether Google still applies this slight dampening to redirects/canonicals.
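A quick back-of-the-envelope on that dampening estimate (the 1-10% range is the opinion stated above, not a confirmed Google figure):

```python
# Sketch: how a small per-hop loss compounds across a redirect chain.
# The 5% loss rate is an assumed value inside the post's 1-10% estimate.
def passed_value(start, hops, loss_per_hop):
    """Value passed through `hops` redirects, each losing loss_per_hop."""
    return start * (1 - loss_per_hop) ** hops

one_hop = passed_value(1.0, 1, 0.05)     # a single 301 passes 95%
three_hops = passed_value(1.0, 3, 0.05)  # chained 301s compound the loss
```

Even at the high end of the estimate, a single redirect passes the large majority of value; it's long chains of redirects that would compound into a meaningful loss.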
Hope that helps!
-
RE: Convince me to stay! How should I best use SEOMoz tools.
The web app is where I find much of the value (though I do love OSE, the mozBar and many other one-off tools). For me, it's about knowing that my site's SEO is safe. I can watch keyword rankings, traffic, crawl data, link data (and soon, social metrics + citations) all in one place and, when a problem arises, work backward to figure out what happened or identify why something went well. I can also spot low-hanging fruit by identifying the keywords I haven't optimized for but rank on page 2 or 3.
I'm a weird case, because the webinars, content, etc. are more produced by me than for me. But Q+A is pretty amazing for keeping up to date on SEO, and the weekly crawl + rankings are essential as KPIs and protection for search traffic.