Rick - I love what you're doing. Quick question - would you be OK with me writing a blog post for the main blog about your site and the SEO recommendations I'd have? Think it would make for a great case study style post.
Best posts made by randfish
-
RE: Hoping someone could take some time to give me some feedback / advice. Thanks!
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
UPDATE: I filmed a Whiteboard Friday video specifically on this topic, with a few examples, that's likely worth checking out.
Hi James - I would still strongly urge folks to keep all content on a single subdomain. We recently were able to test this using a subdomain on Moz itself (when moving our beginner's guide to SEO from guides.moz.com to the current URL http://moz.com/beginners-guide-to-seo). The results were astounding - rankings rose dramatically across the board for every keyword we tracked to the pages.
I've had the opportunity to see many dozens of other sites do the same, almost always with similarly positive results (assuming they're moving from a subdomain without much other content/link signals to the subdomain that has those signals).
I think the important word you used in describing Matt's video is "implied." He's very careful not to speak in specifics, and often, I think the truth is buried in that non-specific language, rather than in the broader implied phrasing. That said, I do agree with you that after all these years, it seems odd that Google is still behaving in this fashion and that moving from one subdomain to another can have such a dramatically positive impact on rankings.
p.s. Yes, for devblog, we put it there due to technical limitations. We plan to eventually get it moved to the main site.
-
RE: Why should your title and H1 tag be different?
Wow - surprisingly good topic for such a relatively basic part of SEO!
So... I think Todd Malicoat and I still disagree. He likes to have a different title + H1 and claims they're good for rankings and keyword diversity. I largely disagree based on user experience and the relative unimportance of H1s (you can see from our correlation analyses and our ranking-model work that H1s appear to have virtually no advantage over just having keywords at the top of a page in large text).
My view is that when someone clicks on a search result listing, they expect to find the thing they've just clicked on. The title is what shows in the SERPs, but if the H1 is substantively different, they're getting what feels like a somewhat different page. That incongruous experience can result in high bounce rates and in searcher dissatisfaction.
In addition, I'm not convinced there's a measurable benefit from differentiated titles vs. H1s. No search engine rep has given guidance on this (in fact, they've stayed conspicuously quiet over the years about whether the H1 does anything at all).
So - there you have it - a small controversy on a small point of on-page optimization. I think the best practice is to do what feels right (neither Todd nor I think the other's opinion will have a negative impact) and, if you're uncertain, test it out on different sets of pages.
My general view, though, is that there are far better uses of most SEOs' time than worrying about H1s.
-
The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Howdy Moz friends,
Today, the Moz team is making a new tool -- Link Explorer -- available in private beta for Moz Pro subscribers (including those taking a free trial). The tool is still a little ways out from public launch, but we wanted to get your feedback to help make it the best product it can be.
What's Link Explorer?
In essence, it's a replacement for Open Site Explorer (Moz's tool for link discovery, competitive analysis, and link building) that addresses many of its most pernicious challenges, such as:
-
Daily updates - no longer will you need to wait a month or two to see a new DA score or the links you built last week. Link Explorer updates every 24 hours with all the new links we've discovered that day, and gives a new DA score each night.
-
A MUCH bigger index - OSE has always been known for having quality links, but quantity has been an issue. No longer. The new tool's link index is more than 20 times larger than Open Site Explorer's, covering trillions of links across hundreds of billions of pages, while maintaining a focus on high quality domains and pages.
-
Additional functionality - new graphs (like link growth over time), new charts (like gained+lost links), new filters and sorts, and some new kinds of data coming soon.
-
Improved metrics - Domain Authority and Page Authority have both been upgraded to have better correlations with Google's rankings (and they now update every 24 hours).
-
Insanely fast - page load times on the new tool are almost as fast as Google's results. Less time waiting means more time to dig into results.
-
Link Tracking Lists - check a box next to any links of interest and you can build lists in the tool to track them over time, see whether/when they link, prioritize your outreach efforts, and (in the future) get aggregated data and alerts about those links.
There's much more to come, but we'd love for you to check out Link Explorer, find bugs, report things you love (and don't), and help us make it the best possible product for you and your teams.
You can leave feedback here in this Q+A thread, email help@moz.com, or send feedback through the feedback form in Link Explorer.
-
-
What's the most effective web marketing tactic you've seen or used that very few people know about?
I wanted to start a thread to share some of the really cool marketing tactics I've seen on the web that I think few folks are using, AND ask the community here what you've seen, too!
Some of my favorite undiscovered or less-used tactics include:
-
Making smart use of bios for conferences, events, interviews, etc. where folks ask you or your team members for a "bio" and you get to control the links, link targets, and anchor text. This is super powerful in my experience, so long as you have a moderately strong profile or regular participation in this type of stuff.
-
Price anchoring on conversion pages, e.g. http://www.trackur.com/options - note how they start with the highest price to help "anchor" the audience to bigger numbers. A great principle of psychology in action.
-
Using re-marketing to draw people to content rather than just purchase/conversion pages. The effectiveness of these is, I've heard, dramatically higher than the usual re-marketing campaigns that take you to a squeeze or purchase page. I can't share the example I'm thinking of, unfortunately, but I'd urge you to try it!
-
Get more social shares and clicks by SHARING MORE THAN ONCE! A lot of folks feel like they are burdening their audience on Twitter/Facebook/G+ or frustrating them if they post multiple times, when in fact, very, very few of your followers are online at any given time. I've tested this myself and I get almost no negative feedback, but can triple (or better) the number of shares/+1s/likes/visits/etc. I get just by sharing 2-3X! The key is not to be too repetitive or annoying, and to acknowledge past shares (at least for me), e.g. I'll say "my blog post from last night on XYZ" and get a ton more clicks.
What are your favorites? Please share!
-
-
RE: Is this Directory Guide by SEOmoz still accurate?
To be honest, it's more than a year out of date, and not the best resource. We've talked internally about a replacement, but need to spec it out and find a contractor (and we have so many other improvements/upgrades/features to PRO we're in the process of making).
That said, I'll ask the team to look into it further. I do think a great directory list would be a valuable part of PRO content.
postscript update: We have a project in process to replace this with something very cool, updatable and more scalable, too. I believe launch ETA is Q3 of 2011. Justin Briggs from Distilled and Cyrus Shepard from Moz are working on this together.
-
KW Explorer is Working to Disambiguate Keywords Google Merges Together
Hey gang,
Russ Jones from Moz has been doing a ton of heavy lifting to try to get around the new problem posed by Google AdWords' recent change to merged keyword volume data. But we're fighting back against this obfuscation in Keyword Explorer. I'm sharing two emails (slightly edited) from Russ about what we're doing here:
Introduction to the Problem:
Google AdWords Keyword Planner is the primary source for keyword search volume (how often a keyword is searched monthly on Google) for much of the search marketing industry. While Google has grouped together highly-similar terms for a while (especially misspellings), in June of 2016 they dramatically increased this keyword-grouping. This means similar phrases like "keyword rank", "keyword ranking" and "keyword rankings" would all be reported as having the same, combined search volume, rather than their individual search volumes. If you were to take Google's numbers at face value, you might think there are 3,000 searches per month for these 3 terms, when in reality there are only 1,000, divided amongst the 3 terms.
How we are addressing it:
Moz's Keyword Explorer uses a blend of data sources, not just Keyword Planner, to build our volume metrics. This gives us a distinct advantage in that we can adjust the volume of words that deviate dramatically in one data set versus another. Take, for example, the phrases "keyword rank", "keyword ranking", and "keyword rankings". While Google Keyword Planner might report all of these as having 1,000 searches per month, Moz Keyword Explorer can detect that these numbers are significantly higher than what our models would predict given our other data sets. We can then adjust the volume accordingly. Moreover, given our huge keyword data set, we can also identify grouped phrases (like these 3) and divide the volume proportionally to what we see in our other data sets. Thus, we address the grouping problem from multiple directions.
Here's email #2 from Russ, detailing more of how we're attacking this:
I have been working pretty much non-stop on this keyword volume disambiguation problem (finding the real search volume of individual keywords when Google clumps several together). I think I have settled on a pretty good solution and am working on getting it all in. For example...
Google Keyword Volume for the phrases "briefcase for women" and "briefcases for women" are both at 3600 because they have been lumped together. My disambiguation script says the singular (briefcase for women) should be 2731 and the plural should be 869. Google Trends roughly agrees with this, showing that the singular is searched more than 2x the plural: https://www.google.com/trends/explore#q=briefcase%20for%20women%2C%20briefcases%20for%20women&cmpt=q&tz=Etc%2FGMT%2B4
Basically, Keyword Explorer should already be providing some more accurate/segmented numbers than AdWords, and in the future, we'll get even better thanks to our clickstream data and our evolving models.
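To make the proportional-split idea concrete, here's a minimal sketch in Python. To be clear, this is NOT the actual Keyword Explorer model - the secondary-source counts below are invented purely to show the arithmetic of dividing a merged volume back out across the individual phrases:

```python
# Illustrative sketch of the proportional-split idea described above -- NOT
# Moz's actual model. The secondary-source counts are invented for illustration.

def disambiguate(grouped_volume, secondary_volumes):
    """Split a volume that Google reports for a merged keyword group back into
    per-keyword estimates, proportional to an independent data source
    (e.g. clickstream counts)."""
    total = sum(secondary_volumes.values())
    if total == 0:
        # No independent signal: fall back to an even split.
        return {kw: grouped_volume / len(secondary_volumes) for kw in secondary_volumes}
    return {kw: round(grouped_volume * v / total) for kw, v in secondary_volumes.items()}

# Google reports 3600 for both phrases because it merged them.
grouped = 3600
# Hypothetical counts from a second data source (clickstream, trends, etc.).
observed = {"briefcase for women": 758, "briefcases for women": 241}

print(disambiguate(grouped, observed))
# -> roughly {'briefcase for women': 2732, 'briefcases for women': 868}
```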
Any questions, let us know!
-
RE: How do I speak to Rand ?
He's evil and bad! Don't try to contact him or you'll be turned to the dark side, too!
Oh no wait... I'm thinking of a different guy.
Just email me! My email's all over the web and I respond to 99% of real emails I get - rand@seomoz.org
-
Tactics to Influence Keywords in Google's "Search Suggest" / Autocomplete in Instant?
Have you had success with any particular methodologies that you'd recommend?
-
RE: Big drop in Domain Authority
Hey gang - thought I'd jump in with some official word from the Mozscape/Big Data team:
This latest index is smaller than prior ones, meaning we indexed fewer webpages total. However, the quality and importance of those pages in general is higher. In particular, we've cut out an exceedingly large number of pages and subdomains on many Chinese sites that appeared to be biasing our crawl priorities and giving us some serious processing trouble.
DA, PA, and link metrics have maintained very similar correlations with Google rankings in this index, so if you've seen a large drop in either, it may be related to the removal of links that Google may not have been counting very highly. However, it's also possible that you've lost DA/PA from links that Google did count and Moz should be counting, too. As we regrow our index size in the next 2-3 updates, you may see a return of those scores. We do expect the next few indices to process much more quickly than the last few (which lagged considerably), and we're watching indices very closely to make sure we're on the right track.
Also, with DA/PA drops, note that these occur every index, primarily because the sites and pages at the very top of the metrics scale (with PA/DA scores in the 99-100 range) are growing their link profiles massively, thus stretching what it means to have those incredibly high scores. If you had a DA of 90, and gained great links at the same rate you did last year, but many other DA 90+ sites were growing their link profiles even more rapidly (which tends to be how the web goes - the rich get richer, faster, every month), your DA would likely fall a few points even though you're technically still growing your link profile. DA/PA of 100 gets harder and harder to achieve every index because of the rate of growth of pages like Twitter.com, Google.com, and Facebook.com.
-
Keyword Explorer is Now Live; Ask Me Anything About It!
Howdy gang - as you probably saw, we launched our biggest new tool in Pro in many years today: https://moz.com/explorer
If you're a Moz Pro subscriber, you've already got access. We went ahead and gave folks who were at $99/month before today 300 queries/month. If you're signing up new, $99/month doesn't have KW Explorer access, but the other levels - at $149/month and above - do (5,000+ queries/month).
You can read the blog post here for lots of details, but if you have questions or product suggestions, please don't hesitate to ask!
-
DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Howdy folks,
Every time we do an index update here at Moz, we get a tremendous number of questions about Domain Authority (DA) and Page Authority (PA) scores fluctuating. Typically, with each index (which we release approximately monthly), many billions of sites will see their scores go up, while others will go down. If your score has gone up or down, there are many potential influencing factors:
- You've earned relatively more or fewer links over the course of the last 30-90 days.
Remember that, because Mozscape indices take 3-4 weeks to process, the data collected in an index is between ~21-90 days old. Even on the day of release, the newest link data you'll see was crawled ~21 days ago, and can go as far back as 90 days (the oldest crawlsets we include in processing). If you've had very recent link growth (or shrinkage), that won't be seen by our index until we've crawled and processed the next index.
- You've earned more links, but the highest-authority sites have grown their link profiles even more.
Since Domain and Page Authority are on a 100-point scale, the very top of that scale represents the most link-rich sites and pages, and with nearly every index it gets harder to earn those high scores; sites that aren't growing their link profiles substantively will, on average, see PA/DA drops. This is because of the scaling process - if Facebook.com (currently with a DA of 100) grows its link profile massively, that becomes the new DA 100, and it will be harder for other sites that aren't growing quality links as fast to get from 99 to 100 or even from 89 to 90. This is true across the scale of DA/PA, and makes it critical to measure a site's DA and a page's PA against the competition, not just trended against itself. You could earn loads of great links and still see a DA drop due to this scaling behavior. Always compare against similar sites and pages to get the best sense of relative performance, since DA/PA are relative, not absolute, scores.
- The links you've earned are from places that we haven't seen correlate well with higher Google rankings.
PA/DA are created using a machine-learning algorithm whose training set is search results in Google. Over time, as Google gets pickier about which types of links it counts, and as Mozscape picks up on those changes, PA/DA scores will change to reflect it. Thus, lots of low-quality links, or links from domains that don't seem to influence Google's rankings, are likely to not have a positive effect on PA/DA. On the flip side, you could do no link growth whatsoever and see rising PA/DA scores if the links from the sites/pages you already have appear to be growing in importance in influencing Google's rankings.
- We've done a better or worse job crawling sites/pages that have links to you (or don't).
Moz is constantly working to improve the shape of our index - choosing which pages to crawl and which to ignore. Our goal is to build the most "Google-shaped" index we can, representative of what Google keeps in their main index and counts as valuable/important links that influence rankings. We make tweaks aimed at this goal each index cycle, but not always perfectly (you can see that in 2015, we crawled a ton more domains, but found that many of those were, in fact, low quality and not valuable, thus we stopped). Moz's crawlers can crawl the web extremely fast and efficiently, but our processing time prevents us from building as large an index as we'd like and as large as our competitors' (you will see more links represented in both Ahrefs and Majestic, two competitors to Mozscape that I recommend). Moz calculates valuable metrics that these others do not (like PA/DA, MozRank, MozTrust, Spam Score, etc.), but these metrics require hundreds of hours of processing, and that time scales linearly with the size of the index, which means we have to stay smaller in order to calculate them. Long term, we are building a new indexing system that can process in real time and scale much larger, but this is a massive undertaking and is still a long time away. In the meantime, as our crawl shape changes to imitate Google, we may miss links that point to a site or page, and/or overindex a section of the web that points to sites/pages, causing fluctuations in link metrics. If you'd like to ensure that a URL will be crawled, you can visit that page with the Mozbar or search for it in OSE, and during the next index cycle (or possibly 2 index cycles, depending on where we are in the process), we'll crawl that page and include it. We've found this does not bias our index, since these requests represent tiny fractions of a percent of the overall index (<0.1% in total).
My strongest suggestion if you ever have the concern/question "Why did my PA/DA drop?!" is to always compare against a set of competing sites/pages. If most of your competitors fell as well, it's more likely related to relative scaling or crawl biasing issues, not to anything you've done. Remember that DA/PA are relative metrics, not absolute! That means you can be improving links and rankings and STILL see a falling DA score, but, due to how DA is scaled, the score in aggregate may be better predictive of Google's rankings.
You can also pay attention to our coverage of Google metrics, which we report with each index, and to our correlations with rankings metrics. If these fall, it means Mozscape has gotten less Google-shaped and less representative of what influences rankings. If they rise, it means Mozscape has gotten better. Obviously, our goal is to consistently improve, but we can't be sure that every variation we attempt will have universally positive impacts until we measure them.
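If it helps to picture what "correlations with rankings" means in practice, here's a tiny, hedged sketch - a single made-up SERP with hypothetical PA values, measuring how well PA lines up with Google's ordering. Our real measurements run across thousands of SERPs, but the idea is the same:

```python
# Minimal sketch of measuring how well a link metric lines up with Google's
# ordering for a single query -- an illustration only, not Moz's actual
# measurement pipeline. SERP positions and PA values below are hypothetical.
from scipy.stats import spearmanr

positions = [1, 2, 3, 4, 5]          # Google positions for one query
page_authority = [62, 58, 44, 51, 30]  # PA of the page ranking at each position

# Higher PA should line up with better (lower) positions, so we expect a
# negative Spearman correlation; its magnitude reflects "Google-shaped-ness".
rho, _ = spearmanr(positions, page_authority)
print(f"Spearman correlation for this SERP: {rho:.3f}")
```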
Thanks for reading through, and if you have any questions, please leave them for us below. I'll do my best to follow up quickly.
-
RE: SEO and Squarespace? Is this Really an Option?
Hi Virginia - happy to give my $0.02. Basically, on SquareSpace 6 (the active version out now), I think they've done a solid job with SEO features and functionality. I actually consulted a bit with the SquareSpace team (informally - not paid, just helping out because I want folks, especially popular CMSs, to provide good SEO) and reviewed some of their implementations. It's good stuff, and SquareSpace is a good company (good customer service, honorable folks, good about refunds, excellent with uptime, etc.).
That said, you can certainly get more flexibility by hosting your own system. Wordpress enables a lot of this, especially if you have a good developer making changes to it. Out of the box, SquareSpace is friendlier on many aspects of SEO than Wordpress, but with customizations, the latter can exceed the former.
One last word of advice - be cautious about trusting all the forum chatter, especially the stuff that dates from before SquareSpace 6 (earlier versions weren't very SEO friendly). I don't mean to be a pure advocate/defender of SquareSpace (and I have no financial or other interest in the company), but I do want to be fair to the strides they've made.
Hope that helps!
-
What is a Good Keyword Organic CTR Score?
Hi Folks! You might have seen my discussion on What Is a Good Keyword Difficulty Score, and this is a continuation of the same vein. Keyword Organic CTR is probably my favorite score we developed in Keyword Explorer and Moz Pro. It looks at the SERP features that appear in a set of results (e.g. an image block, AdWords ads, a featured snippet, or knowledge graph) and then calculates, using CTRs we built off our partnership with Jumpshot's clickstream data, what percent of searchers are likely to click on the organic, web results.
For example, in a search query like Nuoc Cham Ingredients, you've got a featured snippet and then a "People Also Ask" feature above the web results, and thus, Keyword Explorer is giving me an Organic CTR Score of 64. This translates directly to an estimated 64% click-through rate to the web results.
Compare that to a search query like Fabric Printed Off Grain, where there's a single SERP feature - just the "People Also Ask" box, and it's between the 6th and 7th result. In this case, Keyword Explorer shows an Organic CTR Score of 94, because we estimate that those PAAs are only taking 6% of the available clicks.
There are two smart ways you should be using Organic CTR Score:
- As a way to modify the estimated volume and estimated value of ranking in the web results for a given keyword term/phrase (KW Explorer does this for you if you use the "Lists" and sort based on Potential, which factors in all the other scores, including volume, difficulty, and organic CTR)
- As a way to identify SEO opportunities outside the normal, organic web results in other SERP features (e.g. in the Nuoc Cham Ingredients SERPs, there's serious opportunity to take over that featured snippet and get some great traffic)
OK, so all that said, what's actually a "good" Organic CTR score? Well... If you're doing classic, 10-blue-links style SEO only, 100 is what you want. But, if you're optimizing for SERP features, and you appear in a featured snippet or the image block or top stories or any of those others, you'd probably be very happy to find that CTR was going to those non-web-results sections, and scores in the 40s or 50s would be great (so long as you appear in the right features).
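If you're curious how a score like this could be derived, here's a rough sketch. The per-feature click shares below are invented for illustration only - the real model is built from clickstream data - but it shows the basic idea of subtracting the clicks SERP features capture from the organic pool:

```python
# Illustrative sketch of the Organic CTR idea: estimate what share of clicks is
# left for the classic web results once SERP features take their cut. The
# per-feature click shares are invented; Moz's real model comes from
# clickstream data.

FEATURE_CLICK_SHARE = {
    "featured_snippet": 0.25,   # assumption
    "people_also_ask": 0.06,    # assumption
    "adwords_top": 0.10,        # assumption
    "image_block": 0.05,        # assumption
}

def organic_ctr_score(serp_features):
    """Return an estimated percentage of clicks going to the organic web results."""
    taken = sum(FEATURE_CLICK_SHARE.get(f, 0) for f in serp_features)
    return round(max(0.0, 1.0 - taken) * 100)

print(organic_ctr_score(["featured_snippet", "people_also_ask"]))  # e.g. ~69
print(organic_ctr_score(["people_also_ask"]))                      # e.g. ~94
```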
-
September's Mozscape Update Broke; We're Building a New Index
Hey gang,
I hate to write to you all again with more bad news, but such is life. Our big data team produced an index this week but, upon analysis, found that our crawlers had encountered a massive number of non-200 URLs, which meant this index was not only smaller, but also weirdly biased. PA and DA scores were way off, coverage of the right URLs went haywire, and the metrics we use to gauge quality told us this index simply was not good enough to launch. Thus, we're in the process of rebuilding an index as fast as possible, but this takes, at minimum, 19-20 days, and may take as long as 30 days.
This sucks. There's no excuse. We need to do better, and we owe all of you and all of the folks who use Mozscape better, more reliable updates. I'm embarrassed and so is the team. We all want to deliver the best product, but we continue to find problems we didn't account for, and have to go back and build systems in our software to look for them.
In the spirit of transparency (not as an excuse), the problem appears to be a large number of new subdomains that found their way into our crawlers and exposed us to issues fetching robots.txt files that timed out and stalled our crawlers. In addition, some new portions of the link graph we crawled exposed us to websites/pages that we need to find ways to exclude, as these abuse our metrics for prioritizing crawls (aka PageRank, much like Google, but they're obviously much more sophisticated and experienced with this) and bias us to junky stuff which keeps us from getting to the good stuff we need.
We have dozens of ideas to fix this, and we've managed to fix problems like this in the past (prior issues like .cn domains overwhelming our index, link wheels and webspam holes, etc plagued us and have been addressed, but every couple indices it seems we face a new challenge like this). Our biggest issue is one of monitoring and processing times. We don't see what's in a web index until it's finished processing, which means we don't know if we're building a good index until it's done. It's a lot of work to re-build the processing system so there can be visibility at checkpoints, but that appears to be necessary right now. Unfortunately, it takes time away from building the new, realtime version of our index (which is what we really want to finish and launch!). Such is the frustration of trying to tweak an old system while simultaneously working on a new, better one. Tradeoffs have to be made.
For now, we're prioritizing fixing the old Mozscape system, getting a new index out as soon as possible, and then working to improve visibility and our crawl rules.
I'm happy to answer any and all questions, and you have my deep, regretful apologies for once again letting you down. We will continue to do everything in our power to improve and fix these ongoing problems.
-
What is a Good Keyword Volume Score?
Hi All!
Continuing my series of discussions about the various keyword scores we use here at Moz (previously: Keyword Difficulty & Keyword Opportunity)... Let's move on to Volume.
Volume in Moz's tools is expressed as a range, e.g. Bartending Certification has a volume of 201-500. These ranges correspond to data we have suggesting that in an average month, that keyword is searched for a minimum of X to a maximum of Y times (where X-Y is the volume range). We use clickstream data, data from Google AdWords, and some PPC AdWords campaigns we run or have access to when we build the models for our volume data. As such, we've got very high confidence in these numbers -- 95%+ of the time, a given keyword's monthly search volume on Google will fall inside that range.
If you want to see all the nitty gritty details, check out Russ Jones' post on Moz's Keyword Volume and how we calculate it.
As far as a "good" volume score -- higher is usually better, as it means more demand, but lots of keywords with low volume scores can also add up to strong traffic when combined, and they may be more relevant. Capturing exactly the audience you want that also wants you is what SEO is all about.
p.s. When Keyword Explorer or Moz Pro gives you a "no data" or "unknown" volume number, it may just mean we haven't collected information from our clickstream providers or AdWords crawls, not that the keyword has no volume (though it sometimes means that, too, we just don't know yet). One way to verify - see if Google Suggest autofills it in when you type in the search box. If it does, that's usually a sign there's at least some volume (even if it's only a few searches a month).
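If you want to script that Google Suggest check, here's a quick sketch. It uses an unofficial, undocumented suggest endpoint that can change or rate-limit you at any time, so treat it purely as an illustration:

```python
# Quick-and-dirty check of whether Google Suggest autocompletes a term.
# Uses an unofficial, undocumented endpoint -- it may change or be rate-limited
# at any time, so this is a sketch, not a supported API.
import requests

def appears_in_suggest(keyword):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": keyword},
        timeout=10,
    )
    resp.raise_for_status()
    suggestions = resp.json()[1]  # response looks like [query, [suggestions]]
    return any(keyword.lower() in s.lower() for s in suggestions)

print(appears_in_suggest("bartending certification"))
```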
-
RE: Is everybody seeing DA/PA-drops after last MOZ-api update?
Hi Niels - yep, I saw a bit of this too. I believe there are two causes:
-
We crawled a larger swath of the web in this index, so we captured more sites and more links, and that may mean the scaling of PA/DA (which are logarithmic) stretches to accommodate the larger number of links found, especially to sites at the top of the scale. For example, if Facebook has a DA of 100 with 5 billion links, and we then find 5 billion more links to it, Facebook still has a DA of 100, but the threshold for that score is now much higher. Thus, sites with fewer links (and lower-quality links) will fall in DA as the scale is stretched.
-
We crawled some weird stuff in this index, by mistake (or rather, because spammers built some nasty, deep crawl holes that Google probably didn't fall for but we did). A ton of odd domains on strange ccTLDs were seen, and crawled, because they gamed PageRank with lots of sketchy links. We've now excluded these for indices going forward, and hopefully will see the impact abate.
All that said, over time, as our index grows, you can expect that raw DA/PA numbers could get harder to achieve, meaning a lot of sites will drop in PA/DA (and some will grow too, as we discover more links to them in the broader web). My best advice is always to not use PA/DA as absolutes, but rather relative scores. That's how they're designed and how they work best.
It's like back when Google had PageRank, and Moz.com grew from PR4 to PR7, then as Google got bigger and bigger, and the web got bigger, Moz.com fell to PR5, even though we had way more links and ranked for way more stuff. The raw PR scale had just become stretched, so our PageRank fell, even though we'd been improving.
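Here's a toy illustration of that stretching effect. To be clear, this is not how DA is actually computed (DA is a machine-learned composite) - it just shows how a relative, log-scaled score can stagnate or fall even while your raw link counts grow:

```python
# Toy illustration of why a relative, log-scaled score can stall or drop even
# when your raw link counts grow: everything is re-scaled against the biggest
# profile in the index. NOT how DA is actually computed -- illustration only.
import math

def toy_score(links, max_links):
    return round(100 * math.log10(links + 1) / math.log10(max_links + 1))

# Index 1: the top site has 5 billion links, yours has 100,000.
print(toy_score(100_000, 5_000_000_000))    # ~52

# Index 2: you grew to 150,000 links, but the top site doubled to 10 billion.
print(toy_score(150_000, 10_000_000_000))   # still ~52 -- 50% more links, same
                                            # score, because the scale stretched
```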
-
-
RE: Multiple H1 tags are OK according to developer. I have my doubts. Please advise...
Hi AWC - this is tangential to the topic, but important for Q+A and Moz community participation in general.
Please, in the future, work to be as generous and empathetic in replies as possible. This community is meant to be a haven from many of the nastier corners of the web and while your comment was not excessively insulting, it wasn't kind either. Contributions both big and small are welcome here, as are opinions.
If we're going to maintain the amazing community here, we have to be mindful about the impacts of negativity. Thanks for understanding.
-
RE: The Great Subdomain vs. Subfolder Debate, what is the best answer?
Hi Rosemary - thankfully, I have data, not just opinions to back up my arguments:
- In 2014, Moz moved our Beginner's Guide to SEO from guides.moz.com to moz.com itself. Rankings rose immediately, with no other changes. We ranked higher not only for "seo guide" (outranking Google themselves) but also for "beginners guide," a very broad phrase.
- Check out https://iwantmyname.com/blog/2015/01/seo-penalties-of-moving-our-blog-to-a-subdomain.html - goes into very clear detail about how what Google says about subdomains doesn't match up with realities
- Check out some additional great comments in this thread, including a number from site owners who moved away from subdomains and saw ranking benefits, or who moved to them and saw ranking losses: https://inbound.org/discuss/it-s-2014-what-s-the-latest-thinking-on-sub-domains-vs-sub-directories
- There's another good thread (with some more examples) here: https://inbound.org/blog/the-sub-domain-vs-sub-directory-seo-debate-explained-in-one-flow-chart
Ultimately, it's up to you. I understand that Google's representatives have the authority of working at Google going for them, but I also believe they're wrong. It could be that there's no specific element that penalizes subdomains, and maybe they're viewed the same in Google's thinking, but there are real ways in which subdomains inherit authority that stays unique to those subdomains, and it IS NOT passed between multiple subdomains evenly or equally. I have no horse in this race other than wanting to keep you and other site owners from struggling against rankings losses - and we've just seen too many losses when moving to a subdomain, and too many gains moving to a subfolder, not to be wary.
-
RE: Is this Directory Guide by SEOmoz still accurate?
Just FYI (as an update), I met last week with some folks and spec'd out a project to replace the directory list. We plan to have an updated version ready to launch in the next 60 days. It will be WAY better, have some very cool interactive functionality, and feature three sources - web, social + local directories (all of which will have subcategories, too).
I think this replacement will be awesome and can last for years to come.
-
RE: Do you think Seomoz is worth the monthly fee if you're not a professional SEO?
Hi Alan - first off, welcome to the community at Moz! Great to have you.
We actually think a lot about this question in a broad way at the company. Right now, our focus is primarily on producing software that's ideal for professional marketers in consulting, agency, in-house or self-directed roles. Those who don't spend much/most of their time thinking about and working on content marketing, SEO, social media, analytics, etc. probably won't find SEOmoz to be an ideal fit.
In terms of justifying the fee - it really comes down to ROI. If you find yourself applying the data, tracking, tips and recommendations from both the software/tools and the community/content in a way that produces far more than $99/month in revenue, that's awesome. But if not, please don't feel bad at all about cancelling. We make it very easy to do so (right on your profile page) because we want folks to have a phenomenal, positive experience with SEOmoz, even if our product isn't right for you. And who knows - perhaps in months or years to come, you'll find that as your businesses grow, there's a greater need and you'll come back.
I wish you luck whatever you choose - and please do let us know (through our feature request forum) if there are particular items you'd like to see.
Cheers,
Rand
-
What's the Story on Mozscape Updates?
Hey gang,
As you may be aware, we were considerably late with our last index release. You have my sincere apologies for that and the apologies of the entire team. In the interest of transparency, I want to try to explain what's been going on.
Since stepping down as CEO, I've been asked to take on a few roles in the company. One of those is product architect (basically the product owner) of our Big Data team, who produces the Mozscape link index. For several years, that team has been almost exclusively focused on getting us closer to a near real-time indexing system that does not have scalability issues. Mozscape is currently smaller than our major competitors, and we're also often slower. Our metrics (PA, DA, MozRank, MozTrust, Spam Score, Social Data, etc) have been the unique value we provide, but it's not enough. We need to be competitive on size and freshness.
Building a raw link index (without processed metrics like PA/DA et al) is hard, but it's possible. Building a link index with those metrics is really tricky, and requires computer science knowledge and skills far beyond the scope of my understanding. That's what our team's been working on, and they've made some progress, but it's been slow, hampered by unknown unknowns, and materially hurt by a lack of experienced talent we can hire to help (we've had open job posts for years now).
In the meantime, our historic Mozscape index structure keeps encountering challenges - this latest round is still somewhat unexplained (we believe there are hardware issues compounded by how the system is architected to handle large domains, but there may be other issues). The team's struggled to split time between keeping the old Mozscape running and hunkering down to finish the new system. I'm trying to help them balance things as best I can, and we're going to be putting effort toward making sure we get index releases out on time. However, to do that, we'll need to scale down the index size, and then rebuild back up. We think we can do this while also improving the prioritization of which links we crawl (e.g. deeper on important domains that link out, less so on deep pages that don't link anywhere) so the index overall improves.
However, I don't want to minimize the risks - we may have some slow updates, some smaller indices, and some less-than-ideal data in the next one or two indices while we work to remedy this issue. I HOPE we don't, and that things actually get better immediately, but we can't promise that until the work gets finished.
TL;DR - Mozscape V2 is in development and will let us be as big and as fast as any link index. In the meantime, the current Mozscape is having issues & we're making smaller indices in an attempt to diagnose and repair.
As always, thanks for your understanding, continued support, and if you have any questions, feel free to leave them below. I realize that this level of service/product quality is NOT OK, and I'm doing everything in my power to fix it.
-
Have Questions about the Jan. 27th Mozscape Index Update? Get Answers Here!
Howdy y'all. I wanted to give a brief update (not quite worthy of a blog post, but more than would fit in a tweet) about the latest Mozscape index update.
On January 27th, we released our largest web index ever, with 285 Billion unique URLs, and 1.25 Trillion links. Our previous index was also a record at 217 Billion pages, but this one is another 30% bigger. That's all good news - it means more links that you're seeking are likely to be in this index, and link counts, on average, will go up.
There are two oddities about this index, however, that I should share:
The first is that we broke one particular view of data - sorting 301'ing links by Page Authority doesn't work in this index, so we've defaulted to sorting 301s by Domain Authority. That should be fixed in the next index, and from our analytics, it doesn't appear to be a hugely popular view, so it shouldn't affect many folks (you can always export to CSV and re-sort by PA in Excel if you need to - note that if you have more than 10K links, OSE will only export the first 10K, so if you need more data, check out the API).
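If Excel isn't handy, a couple of lines of Python will do the same re-sort on the exported CSV (the "Page Authority" column name here is an assumption - match it to whatever header your export actually uses):

```python
# Re-sort an OSE CSV export by Page Authority as a workaround for the broken
# in-app sort. The column name "Page Authority" is an assumption -- adjust it
# to the header in your actual export file.
import pandas as pd

links = pd.read_csv("ose_export.csv")
links.sort_values("Page Authority", ascending=False).to_csv(
    "ose_export_sorted_by_pa.csv", index=False
)
```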
The second is that we crawled a massively more diverse set of root domains than ever before. Whereas our previous index topped out at 192 million root domains, this latest one has 362 million (almost 1.9X as many, including a large number of unique, new domains we haven't crawled before). This means that DA and PA scores may fluctuate more than usual, as link diversity is a big part of those calculations and we've crawled a much larger swath of the deep, dark corners of the web (and non-US/non-.com domains, too). It also means that, for many of the big, more important sites on the web, we are crawling a little less deeply than we have in the past (the index grew by ~31% while the root domains grew by ~88%). Often, those deep pages on large sites do more internal than external linking, so this might not have a big impact, but it could depend on your field/niche and where your links come from.
As always, my best suggestion is to make sure to compare your link data against your competition - that's a great way to see how relative changes are occurring and whether, generally speaking, you're losing or gaining ground in your field.
If you have specific questions, feel free to leave them and I'll do my best to answer in a timely fashion. Thanks much!
p.s. You can always find information about our index updates here.
-
What is a Good Keyword Priority Score?
Howdy gang,
This is my last discussion post in the series on keyword metrics in KW Explorer & Moz Pro (previously on Keyword Difficulty, Opportunity, & Volume). In this one, let's chat about the "Priority Score," a feature you'll find in Keyword Explorer on any lists you build.
Priority was conceived to help aggregate all the other metrics - Difficulty, Opportunity, Volume, and (if you choose to use it) Importance. We wanted to create an easy way to sort keywords so the cream would rise to the top -- cream in this case being keywords with low difficulty, high opportunity, strong volume, and high importance (again, if you choose to use it). Thus, when it comes to Priority Score, there's no particular number you should necessarily seek out, but higher is better.
When you get into the ranges of 80+ (which is quite rare; Single Malt Scotch is one of the few examples I could find, and only because its volume is so high and there are only a couple of SERP features), you're generally talking about keywords with high demand (lots of monthly searches), difficulty that isn't too crazy (a website in the 55-80 DA range might have a shot), and CTR Opportunity that's decently strong (usually not too many SERP features taking clicks and attention away from the organic web results). Below that score range, you're usually finding keywords where one or more of those isn't true -- there's either lower volume, heavier competition, or lots of SERP features with the accompanying lower estimated CTR.
When you're building KW lists, my view is that there's no "good" or "bad" Priority scores, only relative scores. Priority should be used to help you determine which terms and phrases to target first -- it's like a cheat code to unlock the low hanging fruit. If you build large lists of 50-100 or more keywords, Priority is a powerful and easy way to sort. It becomes even more useful if you use the Importance score to help add an estimation of value to you/your business/your client in to the mix. In that case, Importance can cut Priority by up to 2/3rds (if you set it at 1) or raise it by a little more than 3X (if you set it at 10). This is hyper-useful to nudge keywords with middling scores up if they're super-important to your marketing efforts.
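This isn't the actual Priority formula - just a purely illustrative toy that behaves the way described above, where low Difficulty, high Opportunity, high Volume, and high Importance all push the score up, and Importance swings the result from roughly a third (at 1) to roughly triple (at 10):

```python
# Purely illustrative toy aggregation -- not Moz's actual Priority formula.
# Assume difficulty, opportunity, and volume are normalized 0-100 for this toy,
# and importance ranges 1-10 (3 = neutral, per the behavior described above).

def toy_priority(difficulty, opportunity, volume, importance=3):
    base = ((100 - difficulty) + opportunity + volume) / 3
    return round(min(100, base * importance / 3))

# Low difficulty, little SERP-feature interference, healthy volume:
print(toy_priority(difficulty=35, opportunity=90, volume=70))                 # 75
# Same keyword flagged as critical to the business:
print(toy_priority(difficulty=35, opportunity=90, volume=70, importance=6))   # 100 (capped)
```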
Look forward to your feedback, and thanks for checking these out!
-
August 3rd Mozscape Index Update (our largest index, but nearly a month late)
Update 5:27pm 8/4 - the data in Open Site Explorer is up-to-date, as is the API and Mozbar. Moz Analytics campaigns are currently loading in the new data, and all campaigns should be fully up-to-date by 4-10pm tomorrow (8/5). However, your campaign may have the new data much earlier as it depends on where that campaign falls in the update ordering.
Hey gang,
I wanted to provide some transparency into the latest index update, as well as give some information about our plans going forward with future indices.
The Good News: This index, now that it's delivered, is pretty impressive.
- Mozscape's August index is 407 Billion URLs in size, nearly 100 Billion (~25%) bigger than our last record index size. We indexed 2.18 trillion links for the first time ever (prior record was 1.54 trillion).
- Correlations for Page Authority have gone up from 0.319 to 0.333 in the latest index, suggesting that we're getting a slightly more accurate representation of Google's use of links in rankings from this data (DA correlations remain constant at 0.185).
- Our hit ratio for URLs in Google's SERPs has gone up considerably, from 69.97% in our previous index to 78.66% in the August update. This indicates we are crawling and indexing more of what Google shows in the search results (a good benchmark for us). Note that a large portion of what's missing will be things published in the last 30-60 days while we were processing the index (after crawling had stopped).
The Bad News: August's index was late by ~25 days.
We know that reliable, consistent, on-time Mozscape updates are critically important to everyone who uses Moz's products. We've been working hard for years to get these to a better place, but have struggled mightily. Our latest string of failures was completely new to the team - a bunch of problems and issues we've never seen before (some due to the index size, but many due to odd things like a massive group of what appear to be spam domains using the Palau TLD extension clogging up crawl/processing, large chunks of pages we crawled with 10s of thousands of links which slow down the MozRank calculations, etc). While there's no excuse for delays, and we don't want to pass these off as such, we do want to be transparent about why we were so late.
Our future plans include scaling back the index sizes a bit, dealing with the issues around spam domains, large link-list pages, and some of the odd patterns we see in .pl and .cn domains, and taking one extra person from the Big Data team off of work on the new index system (which will be much larger and real-time rather than updated every 30 days) to help with Mozscape indices. We believe these efforts, and the new monitoring systems we've got, will help us get better at producing high quality, consistent indices.
Question everyone always asks: Why did my PA/DA change?!
There are tons of reasons why these can change, and they don't necessarily mean anything bad about your site, your SEO efforts, or whether your links are helping you rank. PA and DA are predictive, correlated metrics that say nothing about how you're actually performing. They merely map better than most metrics to Google's global rankings across large SERP sets (but not necessarily your SERPs, which is what you should care about).
That said, here's some of the reasons PA/DA do shift:
- The domains/pages with the highest PA/DA scores gain even faster than most of the domains below them, making it harder each index to get higher scores (since PA/DA are on a logarithmic scale, this is smoothed out somewhat - it would be much worse on a conventional scale, e.g. Facebook.com 100, everyone else 0.0003).
- Google's ranking algorithm introduces new elements, changes, modifies what they care about, etc.
- Moz crawls a set of the web that does or doesn't include the pages that are more likely to point to a given domain than another. Although our crawl tends to be representative, if you've got lots of links from deep pages on less popular domains in a part of the web far from the mainstream, we may not consistently crawl those well (or, we could overcrawl your sector because it recently received powerful links from the center of the web).
My advice, as always, is to use PA/DA as relative scores. If your scores are falling, but your competitors' are falling more, that's not a bad thing. If your scores are rising, but your competitors' are rising faster, they're probably gaining ground on you. And, if you're talking about score changes in the 1-4 points range, that's not necessarily anything but noise. PA/DA scores often shift 1-4 points up or down in a new index so don't sweat it!
Let me know if you've got more questions and I'll do my best to answer. You can also refer to the API update page here: https://moz.com/products/api/updates
-
RE: What would be a really good reason to pay for SEOmoz Pro service?
Hey Robert - really appreciate both the kind words and the constructive feedback re: the member level structures for campaigns and sites. I'll talk to the team about the potential to allow for larger number of smaller sites. In the meantime, feel free to drop a line to Andrew@SEOmoz.org, who's working on projects around customizable pro memberships (it's a work in progress and requires a bunch of engineering stuff, too, so it's not available immediately, but is something we're working on).
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Hi Joy - yes, that's correct. The new DA you see in Link Explorer has a stronger correlation with Google's rankings (which is our bar for performance). Don't forget, too, that this new DA updates daily and is derived from a much, much larger group of links and sites (20X+ what OSE uses) than the previous index. So we're discovering a lot more links (to you and your competition), and we're using a better machine-learning system to calculate a score that's more representative of how the sites are likely to perform in Google's rankings (at least, based on the link metrics side of things).
-
RE: September's Mozscape Update Broke; We're Building a New Index
Hi Joe - fair question.
The basic story is - what the other link indices do (Ahrefs and Majestic) is unprocessed link crawling and serving. That's hard, but not really a problem for us. We do it fairly easily inside the "Just Discovered Links" tab. The problem is really with our metrics, which is what makes us unique and, IMO, uniquely useful.
But metrics like MozRank, MozTrust, Spam Score, Page Authority, Domain Authority, etc. require processing - meaning all the links need to be loaded into a series of high-powered machines and iterated on, a la the PageRank patent paper (although there are obviously other ways we do this for other kinds of metrics). Therein lies the rub. It's really, really hard to do this - it takes lots of smart computer science folks, requires tons of powerful machines, and takes a LONG time (17+ days of processing at minimum to get all our metrics into API-shippable format). And, in cases where things break, what's worse is that it's very hard to stop and restart without losing work, and very hard to check our work by looking at how processing is going while it's running.
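For anyone unfamiliar with what "iterated on, a la the PageRank patent paper" means, here's a minimal toy version of that kind of iterative computation over a tiny link graph. Our real processing obviously runs over hundreds of billions of pages across many machines, and our metrics differ from classic PageRank, but the core loop looks like this:

```python
# Minimal PageRank-style power iteration over a toy link graph, to illustrate
# what "load all the links and iterate on them" means. Illustration only --
# not Moz's actual processing pipeline.

def pagerank(graph, damping=0.85, iterations=50):
    """graph: {page: [pages it links to]}"""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

toy_graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(toy_graph))
```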
This has been the weakness and big challenge of Mozscape the last few years, and why we've been trying to build a new, realtime version of the index that can process these metrics through newer, more sophisticated, predictive systems. It's been a huge struggle for us, but we're doing our best to improve and get back to a consistent, good place while we finish that new version.
tl;dr Moz's index isn't like others due to our metrics, which take lots of weird/different types of work, hence buying/partnering w/ other indices wouldn't make much sense at the moment.
-
RE: SEO for One Page Websites
When it comes to single page sites, make sure your strategy matches the format. It's very hard to win the long tail, and there's often a lot of domain authority value (if the site/page earns lots of links) that goes unspent, meaning you could have more pages/content that could rank well if you added them to the site.
I like to think of these one-pagers as being great promotional vehicles that can later be wrapped into a larger marketing/brand effort by rel-canonicaling or 301'ing back to a reproduction of the content on your larger site. Maybe it's just the thrifty SEO in me, but I hate to see that domain authority wasted.
-
RE: Can penalties be passed via 301 redirect?
I've seen a bunch of these and weirdly, the 301s do seem to (often) remove the penalty in cases where it's a true penalty. However, what you're describing sounds like it could just be a negation of the value of many external links (which is much more common than the actual "penalty" that downgrades you).
If that's the case, 301'ing likely won't do much positive or negative - it will pass on the "juice" that Google's still counting and thinks is legit, but probably not the devalued juice (though, to be honest, I've seen a few times when it has and black hats sometimes do use this strategy - constantly re-pointing stuff as it gets hit). This certainly isn't recommended, as eventually, you will have that "burnt-to-the-ground" effect. If you're looking to go clean and white hat on a different domain, and want to take some of the content and link efforts you have in the penalized site, that's certainly a way to go.
-
RE: Convince me to stay! How should I best use SEOMoz tools.
The web app is where I find much of the value (though I do love OSE, the mozBar and many other one-off tools). For me, it's about knowing that my site's SEO is safe. I can watch keyword rankings, traffic, crawl data, link data (and soon, social metrics + citations) all in one place, and when a problem arises, work backward to figure out what happened - or ID why something went well. I can also see low-hanging fruit by ID'ing the keywords I haven't optimized for but rank on page 2 or 3.
I'm a weird case, because the webinars, content, etc. are more produced by me than for me. But Q+A is pretty amazing for keeping up to date on SEO, and the weekly crawl + rankings are essential as KPIs and protection for search traffic.
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Thanks for the feedback Joseph - I appreciate your transparency and can totally empathize with the frustration.
I think the key here, unfortunately, is in understanding and effectively explaining how the metrics of DA and PA operate and why they're not like standard counts that always go up as things get better. Clearly, we need to do a better job of that.
A good metaphor might be how rankings work for countries in various categories. For example, if Japan is ranked as having the world's best healthcare in 2015, and they improve the quality of their healthcare in 2016, are they guaranteed to still be #1?
Not necessarily.
Maybe the #2 ranked country improved even more and now Japan has fallen from #1 to #2 despite actually improving on their healthcare quality. Maybe countries 2-10 all improved dramatically and Japan's now fallen to #11 even though they technically got better, not worse.
PA and DA work in a similar fashion. Since they're scaled on a 100-point system, after each update, the recalculations mean that PA/DA for a given page/site could go down even if that page/site has improved their link quantity and quality. Such is the nature of a relative, scaled system. This is why I encourage folks strongly to watch not just PA/DA for their own pages/sites, but for a variety of competitors and sites in similar niches to see whether you're losing or gaining ground broadly in your field.
Hope that's helpful and wish you all the best.
-
RE: Prior experience needed?
Jose - I won't recommend more resources, because those provided by Kevin and Patrick make for excellent lists. What I will say is that before you engage in SEO for clients or on your company's primary site, I might try to gain some experience by building a site of your own (or two) and trying things out - what kind of content can you make that engages an audience? Where can you earn links? Try out some keyword targeting and learn what level of competitiveness you're comfortable with (and note the differences between trying to rank for various terms/phrases). Ours is a field with a lot of trial and error, and finding your strengths through building a site of your own is, IMO, invaluable.
-
RE: Does a link in facebook count as a backlink?
It depends on the definition and intent behind the word "backlink." In this case, I believe KevinBP wants to know whether those links will impact rankings in the same way that normal, followed links from other websites will/do. I believe we'd all agree that social media links with nofollow attributes do not contribute in those ways.
I'd also say, more broadly, that while many types of links will show up in Google's Webmaster Tools, that alone does not indicate rank-boosting ability from those links or that Google is even counting them.
-
RE: What Questions Should I Be Asking?
Hi Kade - welcome to the Moz Q+A! Thrilled to have you here. I'll do my best on your three queries.
#1 - It's definitely possible, and oftentimes, in the early stages of a business, you've got to find proficiency in lots of areas, figure out what's critical/core to your business, and then scale those individual functions to experts. As an example, at Moz, I started getting "good" at SEO when web design and usability consulting was my day job. As I got curious and better at it, that became more of my role until I couldn't scale anymore and we hired consultants to help. Even as recently as this year, we brought in Tom Critchlow from Distilled to work in-house at Moz for 3 months, helping our marketing team establish great strategy around lots of inbound initiatives (as I had other obligations as CEO).
#2 - There are usually some KPIs that are cross-industry and company and others that are very specific. For example, nearly everyone cares about visits and conversions (whatever a "conversion" might mean). But some companies care much more about pages per visit (particularly those that are ad-revenue based) or average membership lifetime (for those who have subscription models). Figuring out the KPIs for your organization is the first step to good analytics.
#3 - Weirdly, I've found that in the SEO field, 95%+ of the great, white hat information is shared publicly. It's often accompanied by other signals of trust - good-looking, professional websites, authored by well known and referenced industry authorities who speak at conferences and have impressive client lists. The ones you need to watch out for come from the two extremes of the spectrum - first, the mainstream media, which, to my knowledge, has never done an effective job covering how SEO works or the tactics one should follow. The second are the low-quality "craphat SEOs" who play on ignorance and make "too-good-to-be-true" offers.
If you stick with sources like those covered here - http://www.seomoz.org/blog/best-seo-blogs-top-10-sources-to-stay-uptodate - you should be great. If you want a more expansive list, I actually like http://seo.alltop.com as well.
Hope this helps!
Rand
-
RE: Do you trust SEOMoz with your Google Analytics data?
Totally fair question - I'll add a few thoughts from our end:
- Right now, the only GA data we pull for accounts is what you see in the product. We're not pulling or using anything else behind the scenes, nor storing anything other than what you see. Being totally TAGFEE, I will say that in the future, we probably should start using some anonymous aggregations of data to help improve the product, run some testing, and possibly, long term, offer the ability to share your data anonymously in exchange for some sort of benchmarking/comparison (we'd obviously talk about this a lot more and you'd need to opt in - we'd never do it without permission).
- Once an account is deleted, we remove its data within 6 months (sometimes sooner - only reason we keep it is in case of account re-activation, where folks don't want to lose stuff).
- We have network admins on call 24/7, so if anything unusual should happen, we can quickly address the problem.
- To date, we've had no intrusion attempts other than to the main WWW site (for injections of URLs - ugh to link spammers making the name "SEO" look bad).
- We have never sold ANY customer data ever to anyone for any reason, nor have we ever attempted or offered to do so. We do, obviously, make our link graph available via OSE, but that's public on the web (just hard to access in a scalable format).
I will ask one of our engineering folks to jump on this thread and provide some information about our security and encryption (probably not details, as that would be counter-productive, but at least a broad explanation).
My final note would be that traffic data via GA, while certainly important and private, hasn't typically been a target of hackers/malware/phishing schemes/etc. The value to outsiders is pretty minimal, even direct competitors (with a few rare exceptions).
-
RE: Big drop in Domain Authority
Hi Brian - that's not quite accurate. It isn't that DA scores are all going to go up - in fact, many of the ones that were higher in our last couple of indices probably should have been lower, and this is more a correction/normalization. As I noted in another response here, following a single DA score can be very unrepresentative of reality vs. following many scores in a niche across competitors. DA fluctuates as Google updates their ranking algorithm, because DA uses machine learning against Google's SERPs, and roughly once a year we retrain our model.
Domain Authority scores aren't great ways to know how you're performing in SEO in absence of context, but they can be very good to see how you're performing against other sites in your industry or with whom you're competing in search results.
Every new index we produce has lots of scores going up and down because the link profiles that correlate best with ranking higher in Google change, the links we discover change, the sites that get penalized or that grow rankings substantially change, etc. Domain Authority is less like a fixed metric that grows as you grow your link profile and more a relative metric that changes as the web and Google change.
Hope that helps!
-
RE: Is it possible to have good SEO without links and with only quality content?
Hi Alex - I actually filmed a Whiteboard Friday about this today! In the next few weeks, you should see it go up on the main blog (and I cited you in there - hope that's OK)
-
RE: SEOMoz rocks, but is it breaking Google's webmaster guidelines when it comes to check rankings?
Hi Vinod - we use APIs for some things (e.g. AdWords data, social shares, some rankings stuff, etc), and direct crawling for others (e.g. Linkscape itself obviously, fetching weekly site crawls, etc). I believe Google's guideline violations primarily apply to software one installs on one's own machine to run automated queries. We don't do that (as there's nothing to download and install with Moz). I believe this is what got WebPosition called out on Google's site.
We're in relatively good touch with folks at lots of the major services we use - Google, Twitter, Facebook, Bing, etc. Googlers have asked us to make some changes and we've obliged (e.g. removing PageRank scores from our Mozbar/PRO app).
Hope that helps!
-
RE: Good Content Writing | How do I convince them?
Hi Brian - I can't really get behind the idea of cheap, non-targeted, non-value-add content that's purely to drive clicks for SEO. Here's my issue with that logic:
- Pay $3, get a crappy-mediocre article
- Build links to your site that come from non-authentic, non-editorially endorsed sources
- Earn a few rankings and start getting traffic from SEO
- The SEO traffic comes in with high bounce rates, low satisfaction, and low conversion
- The engines eventually discount your links because they're non-editorial
- The poor user/usage metrics from the high bounce rate (due to low content quality) affect rankings badly as well
- Endgame: You've spent a small amount but gained very little
Compare this to high quality content tactics:
- Pay $1,000 to get a single fantastic article, maybe including an infographic or at least some great visuals and compelling research, unique viewpoints, an author with a brand name, etc.
- The content naturally attracts links, social shares, email traffic, etc.
- The clicks on it from all sources stay a while, read it, share it, spread it further and add more value to the site
- Readers bookmark, subscribe to the RSS feed (they don't want to miss more articles like this one) and come back regularly, building your traffic over time
- Search engines rank the content well based on all factors - links, social stuff, user/usage data, content analysis, etc.
- Your rankings stay steady when others drop, and you win as the engines get better at identifying the "good stuff"
- Users who find/visit the page think more highly of your brand and are more likely to convert, take action, etc.
- Endgame: 1 great article is worth hundreds of mediocre ones, and the traffic is high quality and valuable too
I'd always aim for the absolute highest quality possible. A single fantastic piece of content can drive so much value to a business on the web that it's never worthwhile, IMO, to underinvest here.
Just my $0.02!
Also - some good posts on this topic:
http://www.seomoz.org/blog/great-content-for-seo-simpler-than-you-ever-imagined
http://www.seomoz.org/blog/debating-the-value-of-great-content
-
RE: SEOMOZ can we trust you?
Hi Eric - certainly a fair question.
We don't currently sell or give any information about campaigns to anyone else, and we know how much of a violation of trust and privacy that would be, so we're extremely careful about any way we might use data in the future. As an example of this:
When we do correlation testing of our metrics for Mozscape (PA/DA), we actually run it against a dataset completely separate from what our members track. Given how big that dataset's getting, the separation probably matters less for anonymization than it once did, but when we were smaller, it could have. We are focused on keeping your data in your hands.
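For anyone curious what that kind of correlation test looks like mechanically, here's a tiny sketch on invented numbers - not our actual data or methodology - that measures how well a metric orders the results of a single SERP; in practice you'd average this across a large, held-out set of keywords:

```python
# Invented numbers, just showing the shape of a rank-correlation check:
# does a metric order one SERP's results well?
from scipy.stats import spearmanr

positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]                # 1 = top-ranked result
page_authority = [62, 58, 60, 44, 51, 38, 40, 29, 33, 25]  # hypothetical scores

# Negate positions so "higher metric at better (lower) positions" comes out
# as a positive correlation.
rho, p_value = spearmanr([-p for p in positions], page_authority)
print(f"Spearman correlation for this SERP: {rho:.2f}")
```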
Our revenues, as you can see from my recent post on our investment, come exclusively from PRO memberships (~90%), Mozscape API (~8%) and Mozcon (~2%). We don't do any consulting or any special side deals. We have no interest in getting into other businesses or compromising the trust we've built with our members - YOU are our primary business model and will be for the long term. YOU are how we want to grow to $1 billion+ and thus protecting your data and your opinion/trust of us is mission critical to the business.
All that said, there are plans in the future to build some products that may allow our members to opt in to data sharing (e.g. benchmarking types of things or submitting your data for broader scale analysis), but these are merely ideas in our head at this point, and we'd have a careful process so you'd know exactly how your data might be used and could choose whether to participate or not (sort of like how Google Analytics used to offer an opt-in industry comparison service).
Hope this helps, and if you've got specific questions, please don't hesitate to ask!
p.s. oh - and with regards to Google - they've never asked us for data or bought anything from us, but of course, may be getting Mozscape stuff or other data through use of our tools (but only what's available to members, nothing private).
-
RE: Private Blogging Network
Hey Jen - I'd be very, very wary of doing this, given how much Google's been cracking down on blog networks, e.g. https://www.google.com/search?q=private+blog+network&tbs=qdr:w
I've also seen Google's responses to reconsideration requests include statements to the effect that penalties may be lifted faster if the abuser includes details on what SEO services, blog networks, and links they used/acquired. Thus, Google's creating a strong incentive for anyone penalized to share their acquisition techniques privately. My guess is this will mean lots more penalties and devaluations coming soon.
If possible, I'd urge you to consider spending the time, resources and energy you'd give to this project on something else.
I'd also probably say that SEOmoz's public Q+A might not be the ideal place to conspire to build black hat link networks (for obvious reasons).
-
RE: Spam score question on shortened URLs
Hi Overthet0p - no, I wouldn't worry about the URL shortener at all. Remember that Spam Score doesn't indicate whether something is necessarily spam in Google's eyes - it just surfaces features that are correlated with things we've seen Google penalize/ban. So, a Spam Score of 9/17 means that we saw ~72% of sites with that many flags get penalties/bans in Google. But that also means 28% of sites we saw with 9/17 flags had no penalties. If you've manually checked the site and feel it's not a problem, chances are it's fine!
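If you're checking a whole batch of linking domains, one way to put that "manually check it" advice into practice is to treat the flag count purely as a triage signal rather than a verdict. Here's a tiny hypothetical sketch - the threshold, domain names, and helper are all made up, and this isn't our API:

```python
# Hypothetical triage helper - not the Moz API. The idea: flag counts prompt a
# manual review, they never act as an automatic verdict of "spam."

REVIEW_THRESHOLD = 7  # made-up cutoff; pick one that matches your risk tolerance

def needs_manual_review(domains_with_flags):
    """Return the domains whose Spam Score flag count warrants a human look."""
    return [d for d, flags in domains_with_flags.items() if flags >= REVIEW_THRESHOLD]

linking_domains = {"example-shortener.io": 9, "example-blog.com": 2, "example-directory.biz": 13}
print(needs_manual_review(linking_domains))  # ['example-shortener.io', 'example-directory.biz']
```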
-
RE: Crowdsearch.me - Is this a legit approach?
Totally agree with Ray that this isn't a legitimate tactic, nor would I expect it to work. Google's got a lot of defenses and checks to prevent manipulation of this kind, so while it could have an impact briefly and in some SERPs, I'd expect it to be mostly a waste of time and money.
The only part I'll disagree with is around whether Google has disclosed that they do (or rather, "might") use pogo-sticking. I believe this was mentioned at a conference last year or in 2013, though I can't find the reference now. There's also lots of test evidence, including the experiment I ran live at Mozcon, this one from my blog: http://moz.com/rand/queries-clicks-influence-googles-results/ (which I did repeat with success), and some mixed results from Darren Shaw here: http://www.slideshare.net/darrenshaw1/darren-shaw-user-behavior-and-local-search-dallas-state-of-search-2014.
Queries and clicks are most certainly impacting rankings, though how directly and with what caveats/other influences we don't yet know (and may never).
-
RE: Thoughts on scraping SERPs and APIs
Hi Boogily,
A) In terms of ToS, I'm not 100% clear. I know that Google has been strict about automated rank-checking programs installed on a user's local machine, and they've enforced that a bit through technical means and their guidelines. However, they've been unclear about companies like Moz, Searchmetrics, GetStat, AuthorityLabs, Conductor, Brightedge, Yield, RankAbove, and hundreds of others which provide this data in their software. I suspect they don't love it, but they tolerate it and haven't taken action for over a decade now.
B) On the API front, check with GetStat and AuthorityLabs. I believe they both have rank data offerings from an API standpoint. Moz doesn't offer that and probably won't in the future, either.
Cheers!
-
RE: How to Get Links as A Web Host
I LOVE this comment Corey. I think this is a great, succinct way of describing the disconnect between what experienced marketers have learned about having success in inbound channels (SEO, social, community, content, etc) and what businesses often think about marketing on the web (I just want people to buy).
The truth is that, over time, it will be increasingly hard to have any success in these worlds unless you're willing to commit to inbound as a strategy, and that means content in some form or another. Without content, there's no opportunity for SEO (unless it's black hat SEO), no opportunity for social media growth (unless it's merely buying ads), and no opportunity to use the power content has to earn trust, likability, and familiarity, and to convert those exposed to your brand into potential customers and evangelists.
I gave a couple of presentations recently that might help explain this more deeply: http://hackersandfounders.tv/RDmt/rand-fishkin-inbound-marketing-for-startups/ and http://www.edsocialmedia.com/2012/05/rand-fishkin-connecting-inbound-marketing-for-exceptional-return/
Hope those help.
Unfortunately, it's my belief and experience that one cannot simply "sell your product" without marketing, and if you want to compete in inbound (non-paid) channels, content is a requirement and great content is the competitive advantage.
-
RE: Advanced: SEO best practice for a large forum to minimise risk...?
Hi Seomvi - yes, definitely a challenging problem, especially since you're thinking preventative rather than reactive (which is very wise!).
My advice would be to consider creating some form of quality threshold for forum content before you expose it to Google. For example, you could have a litmus test that says: if a forum thread has <500 words or fewer than 2 unique replies, apply a META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW" tag in the page's header (there's a rough sketch of that below). In that fashion, you keep algorithms like Panda from perceiving your forum as having lots of thin-content/low-value pages.
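To make that litmus test concrete, here's a minimal sketch assuming a server-side hook where you can inspect a thread's text and repliers before rendering the page - the function name and thresholds are just illustrative, not tied to any particular forum platform:

```python
# Hypothetical litmus test for thin forum threads. Names and thresholds are
# illustrative only; wire this into whatever renders your page <head>.

MIN_WORDS = 500
MIN_UNIQUE_REPLIES = 2

def robots_meta_for_thread(thread_text, reply_authors):
    """Return the robots meta tag to print into the page's header."""
    word_count = len(thread_text.split())
    unique_repliers = len(set(reply_authors))
    if word_count < MIN_WORDS or unique_repliers < MIN_UNIQUE_REPLIES:
        # Thin thread: keep it out of the index, but let link equity flow.
        return '<meta name="robots" content="noindex, follow">'
    # Substantial thread: allow normal indexation.
    return '<meta name="robots" content="index, follow">'

# Example: a short thread with a single replier gets noindexed.
print(robots_meta_for_thread("short question, no real answers yet", ["user_a"]))
```

Whichever platform you're on, the important design choice is keeping the FOLLOW directive, so link equity still flows through the noindexed pages.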
For PR flow and crawl budget, I'd generally worry less. Google's gotten very adept at identifying forums, crawling them effectively, and understanding how to handle that type of content/link structure. That said, you might try using rel=prev/next to help with Google's crawling.
Wish you all the best!
-
RE: Domain authority get down significantly. Internal MOZ Issue? Google Algoritm Update?
Hi Dmitri and Juan - perhaps this way of explaining it will help:
A good analogy might be how rankings work for countries in various categories. For example, if Japan is ranked as having the world's best healthcare in 2015, and they improve the quality of their healthcare in 2016, are they guaranteed to still be #1?
Not necessarily.
Maybe the #2 ranked country improved even more and now Japan has fallen from #1 to #2 despite actually improving on their healthcare quality. Maybe countries 2-10 all improved dramatically and Japan's now fallen to #11 even though they technically got better, not worse.
PA and DA work in a similar fashion. Since they're scaled on a 100-point system, after each update, the recalculations mean that PA/DA for a given page/site could go down even if that page/site has improved their link quantity and quality. Such is the nature of a relative, scaled system. This is why I encourage folks strongly to watch not just PA/DA for their own pages/sites, but for a variety of competitors and sites in similar niches to see whether you're losing or gaining ground broadly in your field.
The score system has to be relative (we can't use absolutes, or we wouldn't be able to have good correlations against Google - we'd just have another system that counts links or counts linking domains or the like). If PA/DA aren't working well for you as metrics, I'd encourage you to use something else - link counts or counts of linking domains/IPs or the like. The purpose of the DA/PA metrics is to track against Google, and over time, you might see lots of fluctuation up or down that doesn't necessarily mean you're doing better/worse. That's why the comparison process - and understanding what the metrics do and why they're different from raw counts - is important.
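If it helps to see the relative-scaling point in miniature, here's a toy sketch with made-up numbers (not how PA/DA are actually computed) showing how a rescaled score can drop even while the raw inputs improve:

```python
# Toy illustration of a relative, 100-point scale: "raw strength" numbers are
# invented for illustration, NOT the real DA/PA model.

def scale_to_100(raw_scores):
    """Rescale so the strongest site in the comparison set gets 100."""
    top = max(raw_scores.values())
    return {site: round(100 * raw / top, 1) for site, raw in raw_scores.items()}

before = {"yoursite.com": 40, "competitor.com": 50}
after  = {"yoursite.com": 45, "competitor.com": 90}  # both improved; they improved more

print(scale_to_100(before))  # {'yoursite.com': 80.0, 'competitor.com': 100.0}
print(scale_to_100(after))   # {'yoursite.com': 50.0, 'competitor.com': 100.0}
```

That's the Japan healthcare scenario above in numeric form: your raw strength went up, but your scaled score fell because the rest of the comparison set improved faster.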
Hope that's helpful!
-
RE: Best SEO Link Building Company (textlinkbrokers.com or thinkbigsites.com) ?
Hey gang - sorry for my delay. As Keri noted; been a tough few days for me on the personal/family front.
Re: link building. My understanding is that Jarrod/TLB offer both the classic paid link services (at least I think they still do, not sure), which I'd generally be against, and more white-hat-style link outreach/content-marketing link building, which I'd generally support. The former can be effective short term, but can have dangerous results in the long run (Penguin's only the most recent example). The latter is usually of strong, lasting value, not just as a way to boost SEO, but as a way to help many inbound channels (referring link traffic, social media, branding, PR, etc).
Cheers,
-
RE: Does duplicate content not concern Rand?
Hi Stephen - when it comes to blogs, especially Wordpress blogs with paginated categories, Google's gotten plenty good over the years at knowing that the full post is the correct version. The category pages on moz.com/rand don't show the full content of the post, don't earn the same links, and do link to the individual posts, so there's really no need to noindex them (and, in fact, noindexing might prevent crawling/indexation that I want Google to be able to do).
e.g. I want Google to be able to index https://moz.com/rand/category/archives/startups/ and https://moz.com/rand/mixergy-interview-startup-marketing-reaching-early-adopters-burnout-more/ even though the category page has a small snippet from the Mixergy post.
In cases like these, the right pages are ranking for the right queries, and Google's doing a good job of recognizing and differentiating categories vs. posts.
Hope that helps!