Hi Stuart - in the search bar, on the right-hand side, there's a drop-down to change the search engine. Unfortunately, volume data is for the US only right now, but we're looking into expanding that over time.
Posts made by randfish
-
RE: Keyword Explorer is Now Live; Ask Me Anything About It!
-
RE: Keyword Explorer is Now Live; Ask Me Anything About It!
If you're signing up for a new Moz Pro account at $99/month, it's only 5 queries/day. But, if you had a $99/month account before today, we've granted legacy access at 300 queries/month. At $149/month (or the $600/year standalone), you get 5,000 queries/month, and access goes up from there.
-
RE: Keyword Explorer is Now Live; Ask Me Anything About It!
Thanks Dmytro! If there are any features or functionality you'd love to see added, let us know.
-
Keyword Explorer is Now Live; Ask Me Anything About It!
Howdy gang - as you probably saw, we launched our biggest new tool in Pro in many years today: https://moz.com/explorer
If you're a Moz Pro subscriber, you've already got access. We went ahead and gave folks who were at $99/month before today 300 queries/month. If you're signing up new, $99/month doesn't have KW Explorer access, but the other levels - at $149/month and above - do (5,000+ queries/month).
You can read the blog post here for lots of details, but if you have questions or product suggestions, please don't hesitate to ask!
-
RE: OSE seems to be less and less accurate
Radi - I hear you, dude. OSE's backend index, Mozscape, hasn't kept up as well as I'd like in processing as many of the web's links and pages as we need to get a comprehensive picture. I think that's why a lot of folks also use Ahrefs and Majestic to bolster their link data. Mozscape is good at a few things those others aren't - Page Authority and Domain Authority are quite good (very well correlated with Google's rankings), Spam Score is a very useful metric, and OSE has a lot of metrics, like MozRank and MozTrust, that you can't get elsewhere.
But the downside is that calculating those metrics is a hugely computationally expensive process. Every few weeks, Mozscape takes all the links and URLs it's seen over the last 90 days and starts a processing run that takes 20-30 days. When it finishes, we get a new index. We can't overfill that system with too much data, or it breaks. What we need is a more resilient, real-time system that can scale with hardware, and that's what the team has been working on.
In the months ahead, you should see Mozscape improve (particularly for US and English-language links and websites), and around the end of the year or beginning of 2017, we expect to have a system that can process large numbers of links in real time.
Apologies that our tech hasn't kept up with the needs on this. Despite a huge investment over many years, we've not yet been able to launch what's needed, but we're relentless and we'll keep at it until we get there. We know it's too important to give up on.
-
RE: How to handle Friendly URLs together with internal filters search?
It's not too big a problem to have a slightly longer title. Just be aware that how titles display in SERPs can affect CTR, which can affect rankings. You can use https://moz.com/blog/new-title-tag-guidelines-preview-tool to get a good preview of that.
-
RE: How to handle Friendly URLs together with internal filters search?
Hi JoaoCJ - In cases like these, I don't usually sweat the URL length too much. It is OK to go over a bit -- our recommendations come from correlation analysis and testing. Observing Google's rankings, it tends to be the case that pages with fewer parameters (like 0) tend to outperform pages with more, and that shorter URLs tend to outperform longer ones. That said, it's not a hard and fast rule, more a sloping line.
As far as the filters go, I might consider using rel=canonical unless you're sure you want those pages separately indexed. If that's the case (you DO want them indexed), perhaps consider using static URLs -- even something like a number in the URL could work, e.g. /123/. For the pagination, Google's also got the rel=prev/next tags, which I'd suggest employing.
Wish you all the best!
-
RE: Click Through's for ranking
Hi BStone - apart from the obvious (it's manipulation and might violate Google's TOS/webspam guidelines), there's also a bunch of other items to consider:
- Google has access to Android and Chrome user/usage data to see which clicks are real/not real
- The referrer string would be passed along from wherever you placed it (and it wouldn't come from a validated, Google.com address)
- Google can see search query data and knows when a query has actually hit their server (and all sorts of details about where/from whom). Artificial inflation of those, even in our tests, only had a very short lifespan in terms of influencing SERPs (see https://moz.com/blog/impact-of-queries-and-clicks-on-googles-rankings-whiteboard-friday and https://moz.com/rand/queries-clicks-influence-googles-results/ for more details)
Thus, IMO, this isn't a pragmatic or effective way of influencing Google's perceptions of clicks/CTR.
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
No, that's not correct at all. As you can see from the other folks replying in the thread, and from reading my post and responses, we're simply saying that DA is a relative measure, not an absolute one. It's like ranking websites based on their visits rather than showing their raw number of visits. You could grow your site traffic by many thousands of new visitors, and still have a lower "rank" because others grew their traffic even more. That's just how relative metrics work.
-
RE: Mozscape Index update frequency problems?
I wish I had an answer to that. If I had to guess, I'd say sometime in the next 2 years, but not anytime in the next 9 months. We've stopped asking the team for a final delivery date because it's just too hard to estimate all the work required, and past estimates have been so far off target. Instead, we just try to estimate the next quarter's worth of sprints and then measure how we perform against those.
It turns out, replicating a processing system like Google runs, without billions in revenue, is really hard.
-
RE: Mozscape Index update frequency problems?
Hi Kevin - the index update should be live as of right now (probably only a few hours after you posted this message). We aim to have one index update per month, so 12 per year. We had a catastrophic failure on our early January index, so it had to be abandoned (noted here: https://moz.com/products/api/updates), but the team has been working hard to fix issues and prevent others from arising. Unfortunately, it's often the case that we encounter new/unexpected/never-before-seen issues that need to be addressed. Frustrating, but unavoidable as best we can tell. Obviously, we will continue to do our best to get these indices out on time.
As far as the future goes, it's hard to say. We have a bigger team now than we did last year -- 4 folks work full time on the Mozscape index and 4 are working on the next-generation version of the index (which will update in near-real-time) -- and we certainly have much better monitoring and operational structures in place. But, as I noted above, it seems that the errors/issues we face are always new and unique - things we've never seen before in the 100+ index runs we've had over the last 8 years. I can tell you we're building processes to identify problems before they happen, that we're better staffed, and that we have engineers on call 24/7 to fix issues if they crop up, but processing full-graph metrics on a full-scale web index whose shape and composition can vary wildly means there's still uncertainty, and probably always will be. Our job is to keep reducing that uncertainty and finding optimizations, while we rebuild the full system in the background to eventually replace the old, batch-processing system that's at the core of so many of our challenges.
Hope that helps.
-
RE: Old URLs Appearing in SERPs
In my experience, the best way to absolutely get rid of them is to use the 410 permanently gone status code, then resubmit them for indexation (possibly via an XML sitemap submission, and you can also use Google's crawl testing tool in Search Console to double-check). That said, even with 410, Google can take their time.
The other option is to recreate the pages (returning a 200 status) and use the meta robots noindex tag on each page to specifically exclude them. The temporary block in Google Search Console can work, too, but it's temporary, and I can't say whether it will actually change how long the redirected pages appear in the index via the site: command.
All that said, if the pages only show via a site: command, there's almost no chance anyone will see them.
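If you go the 410 route, a quick way to spot-check that the retired URLs really return "410 Gone" is a small status-checking script. This is a minimal sketch: the local test server and the /old- path convention are hypothetical stand-ins for your real site and URLs.

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class GoneHandler(BaseHTTPRequestHandler):
    """Tiny stand-in server: retired pages answer 410, live pages answer 200."""
    def do_GET(self):
        status = 410 if self.path.startswith("/old-") else 200
        self.send_response(status)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def status_of(url):
    """Return the HTTP status code for a URL (urlopen raises on 4xx/5xx)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Start the stand-in server on an ephemeral port
server = HTTPServer(("127.0.0.1", 0), GoneHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

print(status_of(f"{base}/old-page"))  # 410: Google should eventually drop it
print(status_of(f"{base}/current"))   # 200: stays indexed
```

In practice you'd point `status_of` at your real URL list instead of the local server; anything not answering 410 (or at least 404) is worth investigating.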
-
RE: Old URLs Appearing in SERPs
14 months! Wow. That is a long time indeed. Although, now that I look, Moz redirected OpenSiteExplorer just about a year ago, and we still have URLs showing for the site: command in Google too (https://www.google.com/search?q=site%3Aopensiteexplorer.org) so I suppose it's not that uncommon.
Glad to hear traffic and rankings are solid. Let us know if we can help out in the future!
-
RE: Old URLs Appearing in SERPs
Oh gosh - it's my pleasure! Thanks for being part of the Moz community. I'm honored to help out.
As for the URLs - looks like everything's fine. Google often maintains old URLs in a searchable index form long after they've been 301'd, but for every query I tried, they're clearly pulling up the correct/new version of the page, so those redirects seem to be working just great. You're simply seeing the vestigial remnants of them still in Google (which isn't unusual - we had URLs from seomoz.org findable via site: queries for many months after moving to Moz, but the right, new pages were all ranking for normal queries and traffic wasn't being hurt).
Some examples:
- https://www.google.com/search?q=Enter+the+World+of+Eichler+Design
- https://www.google.com/search?q=Eichler+History+flashbacks
- https://www.google.com/search?q=eichler+resources+on+the+web+books
Unless you're also seeing a loss in search traffic/rankings, I wouldn't sweat it much. They'll disappear eventually from the site: query, too. It just takes a while.
-
RE: Old URLs Appearing in SERPs
Hi Rosemary - can you share some examples of the URLs and the queries that bring them up in search results? If so, we can likely do a diagnosis of what might be going on with Google and why the pages aren't correctly showing the redirected-to URLs.
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
Yeah - my blog, moz.com/rand, has its WordPress install on Moz's site, on the same servers as the rest of our domain, and we've implemented security protocols that make it very hardened. There's a lot of WP security guidance out there that can help, and a talented security-engineering team should be able to set it up with a minimum of problems. Many of the world's biggest companies run WordPress, so there are lots of pre-existing protocols.
-
RE: The Great Subdomain vs. Subfolder Debate, what is the best answer?
Hi Rosemary - thankfully, I have data, not just opinions to back up my arguments:
- In 2014, Moz moved our Beginner's Guide to SEO from guides.moz.com to moz.com itself. Rankings rose immediately, with no other changes. We ranked higher not only for "seo guide" (outranking Google themselves) but also for "beginners guide," a very broad phrase.
- Check out https://iwantmyname.com/blog/2015/01/seo-penalties-of-moving-our-blog-to-a-subdomain.html - goes into very clear detail about how what Google says about subdomains doesn't match up with realities
- Check out some additional great comments in this thread, including a number from site owners who moved away from subdomains and saw ranking benefits, or who moved to them and saw ranking losses: https://inbound.org/discuss/it-s-2014-what-s-the-latest-thinking-on-sub-domains-vs-sub-directories
- There's another good thread (with some more examples) here: https://inbound.org/blog/the-sub-domain-vs-sub-directory-seo-debate-explained-in-one-flow-chart
Ultimately, it's up to you. I understand that Google's representatives have the authority of working at Google going for them, but I also believe they're wrong. It could be that there's no specific element that penalizes subdomains, and maybe they're viewed the same in Google's thinking, but there are real ways in which subdomains accrue authority that stays unique to those subdomains and IS NOT passed between multiple subdomains evenly or equally. I have no horse in this race other than wanting to keep you and other site owners from struggling against rankings losses - and we've simply seen too many losses when moving to a subdomain, and too many gains moving to a subfolder, not to be wary.
-
RE: Domain authority get down significantly. Internal MOZ Issue? Google Algoritm Update?
Hi Juan - If I had to guess, I'd say we probably had some issues crawling the sites where your links were (most likely errors on our end, though it's possible the sites themselves had issues, too) over the summer, and we've now rectified it. We're continuing to improve with future indices, and you should see us getting more consistent about this with each release.
-
RE: Domain authority get down significantly. Internal MOZ Issue? Google Algoritm Update?
Hi Dmitri and Juan - perhaps this way of explaining it will help:
A good analogy might be how rankings work for countries in various categories. For example, if Japan is ranked as having the world's best healthcare in 2015, and they improve the quality of their healthcare in 2016, are they guaranteed to still be #1?
Not necessarily.
Maybe the #2 ranked country improved even more and now Japan has fallen from #1 to #2 despite actually improving on their healthcare quality. Maybe countries 2-10 all improved dramatically and Japan's now fallen to #11 even though they technically got better, not worse.
PA and DA work in a similar fashion. Since they're scaled on a 100-point system, after each update, the recalculations mean that PA/DA for a given page/site could go down even if that page/site has improved their link quantity and quality. Such is the nature of a relative, scaled system. This is why I encourage folks strongly to watch not just PA/DA for their own pages/sites, but for a variety of competitors and sites in similar niches to see whether you're losing or gaining ground broadly in your field.
The score system has to be relative (we can't use absolutes or we wouldn't be able to have good correlations against Google - we'd just have another system that counts links or counts linking domains or the like). If PA/DA aren't working well for you as metrics, I'd encourage you to use something else - link counts or counts of linking domains/IPs or the like. The purpose of the DA/PA metrics is to track against Google, and over time, you might see lots of fluctuation up or down that doesn't necessarily mean you're doing better/worse. That's why the comparison process and understanding what the metrics do and why they're different than raw counts is important.
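To make the relative-vs-absolute point concrete, here's a toy sketch. All the numbers and site names are invented, and DA's real model is machine-learned, not a simple ratio - this just illustrates how a score scaled against the field can fall while raw links grow.

```python
# Toy illustration (invented numbers): a relative, scaled score can drop
# even when a site's raw link count improves, because competitors grew faster.

def scaled_scores(raw_links):
    """Scale raw link counts to a 0-100 score relative to the current leader."""
    top = max(raw_links.values())
    return {site: round(100 * links / top) for site, links in raw_links.items()}

before = {"yoursite.com": 500, "competitor.com": 1_000, "leader.com": 2_000}
after  = {"yoursite.com": 600, "competitor.com": 1_500, "leader.com": 4_000}

print(scaled_scores(before))  # yoursite.com scores 25
print(scaled_scores(after))   # raw links grew 500 -> 600, yet the score falls to 15
```

Same mechanism as the healthcare-rankings analogy: "yoursite.com" genuinely improved, but the whole scale stretched because the leader grew faster.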
Hope that's helpful!
-
Just Discovered Links Are Down for ~2-3 Days
UPDATE 12/01/15: This issue is now resolved and Just Discovered Link data is back.
Sadly, I've got bad news. Due to an error we made (literally a typo on a curl command), Just Discovered Links in Open Site Explorer will be unavailable for 2-3 days. The data was actually lost, but we're re-indexing all those links now and they should be back in working order by Monday.
NOTE: this only affects link counts and links listed in the "Just Discovered" sections of OSE. No links in the main index nor any metrics (PA/DA/etc) are affected.
Our sincere apologies - we're going to build mechanisms to prevent this in the future.
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
If you don't mind the loss of ranking signals between the subdomains/domains, and are simply seeking to dominate the search results through owning multiple positions, separate domains are the best way to go. Subdomains can work for this, too, but are less consistently treated as separate sites, and Google may change that in the future.
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Thanks for the feedback Joseph - I appreciate your transparency and can totally empathize with the frustration.
I think the key here, unfortunately, is in understanding and effectively explaining how the metrics of DA and PA operate and why they're not like standard counts that always go up as things get better. Clearly, we need to do a better job of that.
A good metaphor might be how rankings work for countries in various categories. For example, if Japan is ranked as having the world's best healthcare in 2015, and they improve the quality of their healthcare in 2016, are they guaranteed to still be #1?
Not necessarily.
Maybe the #2 ranked country improved even more and now Japan has fallen from #1 to #2 despite actually improving on their healthcare quality. Maybe countries 2-10 all improved dramatically and Japan's now fallen to #11 even though they technically got better, not worse.
PA and DA work in a similar fashion. Since they're scaled on a 100-point system, after each update, the recalculations mean that PA/DA for a given page/site could go down even if that page/site has improved their link quantity and quality. Such is the nature of a relative, scaled system. This is why I encourage folks strongly to watch not just PA/DA for their own pages/sites, but for a variety of competitors and sites in similar niches to see whether you're losing or gaining ground broadly in your field.
Hope that's helpful and wish you all the best.
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Hi Joe - yes, it's most likely the fluctuations are because you're in that lower range. Remember that Moz can't see links you've disavowed in Google Webmaster Tools/Search Console, so we wouldn't be lowering DA/PA based on those (although, it's possible that, over time, as Google stopped counting links like that, our algorithm for PA/DA would "learn" and evolve, but that would take a while).
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Hi Donna - yes, that fluctuation should be much larger on average in the tail of the web (sites with DA 0-40) than in the middle or head. This makes sense because, with a relative metric, all of the factors I describe above are magnified in the tail, particularly because Google's rankings change so much there and because just a few links can have such a huge impact. Metrics-savvy clients should be best poised to understand this: since DA/PA are exponential, a few links here or there, and a few valuation shifts on those links, can produce big swings in the 10-40 range of DA/PA, whereas in the 50/60+ ranges, small shifts in link discovery or in link valuation (from us or Google) won't produce as much change.
As far as a ceiling - no, we don't have a recommendation there. The idea is that as DA/PA fluctuate, especially as they get more accurate in predicting rankings (correlations & coverage), the fluctuations are generally happening because Google's changing and we're getting closer to tracking how and where (with exceptions I noted above around issues with our crawl/indexing). My biggest recommendation is to keep track of similar-sized competitors (and larger/smaller ones) so you've got a set of benchmarks for comparison.
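For illustration, here's a toy log-scaled score (invented numbers and formula, not Moz's actual model) showing why the same handful of new links swings tail scores far more than head scores on a roughly exponential scale:

```python
import math

# Toy log-scaled score (invented): on a logarithmic 0-100 scale, the same
# +50 links moves a small site's score far more than a large site's.

def log_score(links, max_links=10_000_000):
    """Map a raw link count onto a 0-100 logarithmic scale."""
    return 100 * math.log10(links + 1) / math.log10(max_links + 1)

small_before, small_after = log_score(100), log_score(150)
large_before, large_after = log_score(1_000_000), log_score(1_000_050)

print(f"small site: {small_before:.1f} -> {small_after:.1f}")  # jumps a couple of points
print(f"large site: {large_before:.1f} -> {large_after:.1f}")  # barely moves
```

The exact curve is invented, but the shape of the effect matches what you see in practice: sub-40 scores wobble, 60+ scores are comparatively stable.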
Thanks for the note and apologies for the frustrations this causes.
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Always happy to help, and especially to provide transparency. Thanks for the kind response
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Hi Deacyde - that shouldn't be the message you're taking away from this post at all! As I noted above, you could improve your SEO, improve your link profile, and still see a reduction in Domain Authority as a score due to how DA is calculated (on a relative scale, not an absolute one). We could find more and better links to your site, and you'd still see a lower DA.
If a number of smaller sites in your field have all seen lower DA scores in this index, that indicates it's nothing you've done, but rather that DA has been shifting across the board. If you're seeing rankings stay high and organic search traffic stay high, then there's nothing to worry about, and DA should still work just as well as a relative metric across sites (actually, slightly better, given the improved correlations).
-
DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Howdy folks,
Every time we do an index update here at Moz, we get a tremendous number of questions about Domain Authority (DA) and Page Authority (PA) scores fluctuating. Typically, with each index (released approximately monthly), many billions of sites will see their scores go up, while others will go down. If your score has gone up or down, there are many potential influencing factors:
- You've earned relatively more or fewer links over the course of the last 30-90 days.
Remember that, because Mozscape indices take 3-4 weeks to process, the data collected in an index is between ~21-90 days old. Even on the day of release, the newest link data you'll see was crawled ~21 days ago, and it can go as far back as 90 days (the oldest crawlsets we include in processing). Very recent link growth (or shrinkage) won't be seen by our index until we've crawled and processed the next one.
- You've earned more links, but the highest-authority sites have grown their link profiles even more.
Since Domain and Page Authority are on a 100-point scale, the very top of that scale represents the most link-rich sites and pages. With nearly every index, it gets harder to earn those high scores, and sites that aren't growing their link profiles substantively will, on average, see PA/DA drops. This is because of the scaling process - if Facebook.com (currently with a DA of 100) grows its link profile massively, that becomes the new DA 100, and it will be harder for other sites that aren't growing quality links as fast to get from 99 to 100, or even from 89 to 90. This is true across the whole DA/PA scale, and it makes it critical to measure a site's DA and a page's PA against the competition, not just trended against itself. You could earn loads of great links and still see a DA drop due to this scaling. Always compare against similar sites and pages to get the best sense of relative performance, since DA/PA are relative, not absolute, scores.
- The links you've earned are from places that we haven't seen correlate well with higher Google rankings.
PA/DA are created using a machine-learning algorithm whose training set is search results in Google. Over time, as Google gets pickier about which types of links it counts, and as Mozscape picks up on those changes, PA/DA scores will change to reflect it. Thus, lots of low-quality links, or links from domains that don't seem to influence Google's rankings, are unlikely to have a positive effect on PA/DA. On the flip side, you could do no link growth whatsoever and see rising PA/DA scores if the links from the sites/pages you already have appear to be growing in importance in influencing Google's rankings.
- We've done a better or worse job crawling sites/pages that have links to you (or don't).
Moz is constantly working to improve the shape of our index - choosing which pages to crawl and which to ignore. Our goal is to build the most "Google-shaped" index we can, representative of what Google keeps in their main index and counts as valuable/important links that influence rankings. We make tweaks aimed at this goal each index cycle, but not always perfectly (you can see that in 2015, we crawled a ton more domains, but found that many of those were, in fact, low quality and not valuable, so we stopped). Moz's crawlers can crawl the web extremely fast and efficiently, but our processing time prevents us from building as large an index as we'd like, and as large as our competitors' (you will see more links represented in both Ahrefs and Majestic, two competitors to Mozscape that I recommend). Moz calculates valuable metrics that these others do not (like PA/DA, MozRank, MozTrust, Spam Score, etc.), but those metrics require hundreds of hours of processing, and that time scales linearly with the size of the index, which means we have to stay smaller in order to calculate them. Long term, we are building a new indexing system that can process in real time and scale much larger, but that is a massive undertaking and still a long way off. In the meantime, as our crawl shape changes to imitate Google, we may miss links that point to a site or page, and/or over-index a section of the web that points to sites/pages, causing fluctuations in link metrics. If you'd like to ensure that a URL will be crawled, you can visit that page with the MozBar or search for it in OSE, and during the next index cycle (or possibly 2 index cycles, depending on where we are in the process), we'll crawl that page and include it. We've found this does not bias our index, since these requests represent a tiny fraction of a percent of the overall index (<0.1% in total).
My strongest suggestion if you ever have the concern/question "Why did my PA/DA drop?!" is to always compare against a set of competing sites/pages. If most of your competitors fell as well, it's more likely related to relative scaling or crawl-biasing issues, not to anything you've done. Remember that DA/PA are relative metrics, not absolute! That means you can be improving links and rankings and STILL see a falling DA score, but, due to how DA is scaled, the score in aggregate may be a better predictor of Google's rankings.
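A rough way to operationalize that competitor comparison is below. The site names, score deltas, and 60% threshold are all invented for illustration - the point is simply to check whether your competitive set moved with you.

```python
# Sketch of the competitor-comparison check suggested above (invented data):
# if most comparable sites dropped too, a DA fall is more likely index-wide
# rescaling than anything you did.

def likely_index_wide(changes, threshold=0.6):
    """True if at least `threshold` of tracked competitor sites also dropped."""
    drops = sum(1 for delta in changes.values() if delta < 0)
    return drops / len(changes) >= threshold

# DA change since the last index for a hypothetical competitive set
da_changes = {"rival-a.com": -3, "rival-b.com": -2, "rival-c.com": -4, "rival-d.com": +1}

if likely_index_wide(da_changes):
    print("Most competitors fell too: probably rescaling; watch relative position.")
else:
    print("Competitors held steady: worth reviewing your own link profile.")
```

With three of the four tracked rivals dropping, this example flags the change as likely index-wide rather than site-specific.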
You can also pay attention to our coverage of Google metrics, which we report with each index, and to our correlations with rankings metrics. If these fall, it means Mozscape has gotten less Google-shaped and less representative of what influences rankings. If they rise, it means Mozscape has gotten better. Obviously, our goal is to consistently improve, but we can't be sure that every variation we attempt will have universally positive impacts until we measure them.
Thanks for reading through, and if you have any questions, please leave them for us below. I'll do my best to follow up quickly.
-
RE: 10/14 Mozscape Index Update Details
Hi Joseph - yes, I can answer that. We took ~14 days to process this latest index, which is very good news. However, we are having some trouble with the uploading process again - our technical operations team is working with the big data team to try and uncover the source of these problems. If we can get it fixed and working (in the past, the upload step took ~12 hours, now it's taking us 3-4 days), we should have much more regular index releases.
Right now, we are feeling confident about Nov. 17th, and once we complete the upload we'll have a good picture about data quality and whether we might be able to release early (which we think is quite possible IF quality looks good and these upload issues get sorted).
-
RE: September's Mozscape Update Broke; We're Building a New Index
I hope we might actually have that 11/17 index out a little bit early. We've made a lot of fixes and optimizations, and, fingers crossed, it looks (so far) like it's making a difference in terms of speed to index processing completion.
-
RE: 10/14 Mozscape Index Update Details
That doesn't surprise me - Majestic has a larger index than Moz (theirs is actually the largest among active 3rd party indices, then Ahrefs, then us).
https://moz.com/blog/big-data-big-problems-link-indexes-compared this is a pretty good resource comparing the strengths and weaknesses of the various indices, and https://builtvisible.com/comparing-link-data-tools/ is also a good, third-party review of the three. There are strengths and weaknesses to each, but if raw link coverage is your goal, I recommend Majestic.
-
RE: 10/14 Mozscape Index Update Details
There are a variety of reasons that include:
- This index is somewhat smaller in total links crawled and URLs included
- We may have biased the crawlers towards sites/pages that are less likely to feature links from your site (this is particularly possible if the linking sites were on Chinese, Palau, or several other TLD extensions that we had previously over-indexed)
- The links we previously crawled may have been on relatively low-MozRank pages that this index didn't crawl because we found fewer links to them (and thus lower MozRank - we tend to crawl in roughly descending MozRank order across the web).
As noted, the next index should see better coverage, fresher data, and better metrics, too. Please let us know if the problem persists and maybe we can compare your WM Tools links vs. our index to see what could be happening. Thanks and apologies.
-
RE: 10/14 Mozscape Index Update Details
We are both in the same boat there. I, too, desperately need this next index to get us on track and provide excellent value. If not, I think we're going to lose a lot of customers, and I'm not sure people will trust us for a long time on link data.
You have my deep and sincere apologies for the frustration and professional challenge Moz has caused. We have an obligation to do better, and I damn sure hope the team is up to delivering on that obligation.
-
RE: 10/14 Mozscape Index Update Details
Hi Joseph - you'll get no arguments from me on any of these fronts. I think if you've been using Moz exclusively or primarily for the link data component, you should request a refund by emailing help@moz.com (they'll be happy to provide one). Totally concur that our service the past 60 days on the link data front has not been acceptable.
-
RE: 10/14 Mozscape Index Update Details
Hi Donna - you are most certainly not alone in your frustration. I would call my own feelings bordering on desperation. I'm frustrated, angry, nervous, guilty, and overwhelmed with a sense of powerlessness. It seems that every time we think we've identified a problem at the root of our Mozscape issues, things just get worse and new problems we never imagined arise.
On the padding issue, I have good news and depressing news. The good news is that we pad every estimate by nearly 2X. In a normal, problem-free index cycle, we can get it done in 12-14 days.... And yet, we never estimate less than 30-31 days for an index release. In the early part of this year, you might recall that we had a number of indices released back to back in that 2-3 week window. Then things took a turn for the worse and we've been struggling ever since.
I want to be honest - my belief is that we are going to get better, but the evidence of the last 6 months is against me. I want to believe my team and I know they are trying hard and doing everything they can to get this fixed. However, I think it's wise to have skepticism given the trajectory of the recent past.
Hope that's helpful and thank you for the comment.
-
10/14 Mozscape Index Update Details
Howdy gang,
As you might have seen, we've finally been able to update the Mozscape index after many challenging technical problems in the last 40 days. However, this index has some unique qualities (most of them not ideal) that I should describe.
First, this index still contains data crawled up to 100 days ago. We try to make sure that what we've crawled recently is stuff that we believe has been updated/changed, but there may be sites and pages that have changed significantly in that period that we didn't update (due to issues I've described here previously with our crawlers & schedulers).
Second, many PA/DA and other metric scores will look very similar to the last index because we lost and had problems with some metrics in processing (and believe that much of what we calculated may have been erroneous). We're using metrics from the prior index (which had good correlations with Google, etc) until we can feel confident that the new ones we're calculating are correct. That should be finished by the next index, which, also, should be out much faster than this one (more on that below). Long story short on this one - if your link counts went up and you're seeing much better/new links pointing to you, but DA/PA remain unchanged, don't panic - that's due to problems on our end with calculations and will be remedied in the next index.
Third - the good news is that we've found and fixed a vast array of issues (many of them hiding behind false problems we thought we had), and we now believe we'll be able to ship the next index with greater quality, greater speed, and better coverage. One thing we're now doing is taking every URL we've ever seen in Google's SERPs (via all our rank tracking, SERPscape, the corpus for the upcoming KW Explorer product, etc) and prioritizing them in Mozscape's crawl, so we expect to be matching what Google sees a bit more closely in future indices.
My apologies for the delay in getting this post up - I was on a plane to London for Searchlove; I should have gotten it up before I left.
-
RE: Should UK websites expect any benefit from using Google+?
Hi Mick - I'd say that if you're not finding value from Google+, I wouldn't worry about it. Google is investing less in it and unless you're in a community/niche where G+ is heavily used (travel, photography, online marketing are a few), probably best to put your effort elsewhere. The only real SEO benefit to G+ left is the personalization of SERPs, but that will only work if you have a large group following you who can be biased in their SERPs by their connection with you. Don't think it's geographically specific, but it's certainly niche-specific.
Best of luck!
-
RE: September's Mozscape Update Broke; We're Building a New Index
Sometimes yes. Sometimes, we don't know until we reach the last stages of processing whether it's going to finish or take longer. We're trying to get better at benchmarking along the way, too, and I'll talk to the team about what we can do to improve our metrics as an index run is compiling.
-
RE: September's Mozscape Update Broke; We're Building a New Index
It didn't break, but it is taking longer to process than we hoped. Very frustrating, but we have a plan that, starting in a few more weeks, should get us to much more consistent index releases (and better quality ones, too).
-
RE: Own Domains shown as Spam Links in Open Site Explorer
Hi Marc - if you're worried about them being potentially problematic in Google, just disavow them via Google Search Console (aka Webmaster Tools), then you can point the domains to any page or part of your site you want without concern. It's likely not a big issue regardless, but if you want to be sure, that would be how I'd do it.
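For reference, a disavow file is just a plain-text list you upload via Search Console's disavow tool - one entry per line, with `#` for comments (the domains/URLs below are placeholders, not real examples):

```text
# Disavow an entire domain
domain:spammy-example.com

# Or disavow individual URLs
http://another-example.com/bad-page.html
```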
-
RE: September's Mozscape Update Broke; We're Building a New Index
Hi Joe - fair question.
The basic story is - what the other link indices do (Ahrefs and Majestic) is unprocessed link crawling and serving. That's hard, but not really a problem for us. We do it fairly easily inside the "Just Discovered Links" tab. The problem is really with our metrics, which is what makes us unique and, IMO, uniquely useful.
But, metrics like MozRank, MozTrust, Spam Score, Page Authority, Domain Authority, etc. require processing - meaning all the links needed to be loaded into a series of high-powered machines and iterated on, ala the PageRank patent paper (although there are obviously other kinds of ways we do this for other kinds of metrics). Therein lies the rub. It's really, really hard to do this - takes lots of smart computer science folks, requires tons of powerful machines, takes a LONG time (17 days+ of processing at minimum to get all our metrics into API-shippable format). And, in the case where things break, what's worse is that it's very hard to stop and restart without losing work and very hard to check our work by looking at how processing is going while it's running.
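For the curious, the iteration described above is the same general idea as PageRank's power method. Here's a toy sketch of that concept only - nothing like our actual code or production scale, which distributes this work across many machines and billions of URLs:

```python
# Toy PageRank-style power iteration: repeatedly redistribute each page's
# score across its outlinks until the scores stabilize. Illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the "teleport" share...
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # ...plus an equal share of each linking page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

Even this tiny example needs dozens of passes over the whole graph to converge, which is why doing it across the full web index takes weeks of processing.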
This has been the weakness and big challenge of Mozscape the last few years, and why we've been trying to build a new, realtime version of the index that can process these metrics through newer, more sophisticated, predictive systems. It's been a huge struggle for us, but we're doing our best to improve and get back to a consistent, good place while we finish that new version.
tl;dr Moz's index isn't like others due to our metrics, which take lots of weird/different types of work, hence buying/partnering w/ other indices wouldn't make much sense at the moment.
-
RE: September's Mozscape Update Broke; We're Building a New Index
Two potential solutions for you - 1) watch "Just Discovered Links" in Open Site Explorer - that tab will still be showing all the links we find, just without the metrics. And 2) Check out Fresh Web Explorer - it will only show you links from blogs, news sites, and other things that have feeds, but it's one of the sources I pay attention to most, and you can set up good alerts, too.
-
RE: September's Mozscape Update Broke; We're Building a New Index
Yeah - the new links you see via "just discovered" will take longer to be in the main index and impact metrics like MozRank, Page Authority, Domain Authority, etc. It's not that they're not picked up or not searched, but that they don't yet impact the metrics.
And yes - will check out the other question now!
-
RE: September's Mozscape Update Broke; We're Building a New Index
Hi Will - that's not entirely how I'd frame it. Mozscape's metrics will slowly, over time, degrade in their ability to predict rankings, but it's not as though exactly 31 days after the last update, all the metrics or data is useless. We've had delays before of 60-90+ days (embarrassing I know) and the metrics and link data still applied in those instances, though correlations did slowly get worse.
The best way I can put it is - our index's data won't be as good as it normally is for the next 20-30 days, though it's better now than it will be in 10 days and was better 10 days ago than it is today. It's a gradual decline as the web's link structure changes shape and as new sites and pages come into Google's index that we don't account for.
-
September's Mozscape Update Broke; We're Building a New Index
Hey gang,
I hate to write to you all again with more bad news, but such is life. Our big data team produced an index this week but, upon analysis, found that our crawlers had encountered a massive number of non-200 URLs, which meant this index was not only smaller, but also weirdly biased. PA and DA scores were way off, coverage of the right URLs went haywire, and our metrics that we use to gauge quality told us this index simply was not good enough to launch. Thus, we're in the process of rebuilding an index as fast as possible, but this takes, at minimum, 19-20 days, and may take as long as 30 days.
This sucks. There's no excuse. We need to do better and we owe all of you and all of the folks who use Mozscape better, more reliable updates. I'm embarrassed and so is the team. We all want to deliver the best product, but continue to find problems we didn't account for, and have to go back and build systems in our software to look for them.
In the spirit of transparency (not as an excuse), the problem appears to be a large number of new subdomains that found their way into our crawlers and exposed us to issues fetching robots.txt files that timed out and stalled our crawlers. In addition, some new portions of the link graph we crawled exposed us to websites/pages that we need to find ways to exclude, as these abuse our metrics for prioritizing crawls (aka PageRank, much like Google, but they're obviously much more sophisticated and experienced with this) and bias us to junky stuff which keeps us from getting to the good stuff we need.
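To illustrate the robots.txt piece, the general fix is making sure a slow or unresponsive host can never stall the crawl queue. A toy sketch of that idea only (this is not our actual crawler code, and the host names are hypothetical):

```python
# Toy sketch: fetch and parse a site's robots.txt with an explicit timeout,
# so an unresponsive host fails fast instead of stalling the crawler.
import urllib.error
import urllib.request
import urllib.robotparser


def fetch_robots(host, timeout=5.0):
    """Return a parsed robots.txt for host, or None if the fetch fails."""
    parser = urllib.robotparser.RobotFileParser()
    url = "http://%s/robots.txt" % host
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
        parser.parse(body.splitlines())
    except (urllib.error.URLError, TimeoutError, OSError):
        # Treat the host as unreachable and move on rather than blocking
        return None
    return parser


# An unresolvable host returns None quickly instead of hanging the queue
result = fetch_robots("nonexistent.invalid", timeout=1.0)
```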
We have dozens of ideas to fix this, and we've managed to fix problems like this in the past (prior issues like .cn domains overwhelming our index, link wheels and webspam holes, etc plagued us and have been addressed, but every couple indices it seems we face a new challenge like this). Our biggest issue is one of monitoring and processing times. We don't see what's in a web index until it's finished processing, which means we don't know if we're building a good index until it's done. It's a lot of work to re-build the processing system so there can be visibility at checkpoints, but that appears to be necessary right now. Unfortunately, it takes time away from building the new, realtime version of our index (which is what we really want to finish and launch!). Such is the frustration of trying to tweak an old system while simultaneously working on a new, better one. Tradeoffs have to be made.
For now, we're prioritizing fixing the old Mozscape system, getting a new index out as soon as possible, and then working to improve visibility and our crawl rules.
I'm happy to answer any and all questions, and you have my deep, regretful apologies for once again letting you down. We will continue to do everything in our power to improve and fix these ongoing problems.
-
RE: In lue of the canceled Moz Index update
I can't help but agree with you. Over the last few years, we've consistently had terrible delays releasing indexes, and despite a team of people way smarter and more talented than me working their tails off to get it right, we haven't had success making it work regularly yet.
This latest index cancellation is embarrassing and it sucks. We produced an index, but when we looked at it, a huge problem had arisen that we couldn't see until processing was complete (another problem with our indices is our inability to get a good sense of what they'll look like until they're done, which takes 20+ days of processing after a crawl). I'll detail that in a Q+A thread soon (once I get the full rundown and plan from our Big Data team) and then share around.
In any case, you have my sincere apologies and deep regrets. We'll keep trying to get this right, but just FYI - we've simultaneously been building a new index system that's more real-time (like Google's, Ahrefs, Majestic, etc) that can still calculate metrics like MozRank and Page Authority. We've made a lot of progress on it, but it's still probably 6+ months away from launching, so we'll have to deal with the old Mozscape system until then.
-
RE: Does duplicate content not concern Rand?
Hi Stephen - when it comes to blogs, especially Wordpress blogs with paginated categories, Google's gotten plenty good over the years at knowing that the full post is the correct version. The category pages on moz.com/rand don't show the full content of the post, don't earn the same links, and do link to the individual posts, so it's really not a concern, and there's no need to noindex them (in fact, noindexing might prevent crawling/indexation that I want Google to be able to do).
e.g. I want Google to be able to index https://moz.com/rand/category/archives/startups/ and https://moz.com/rand/mixergy-interview-startup-marketing-reaching-early-adopters-burnout-more/ even though the category page has a small snippet from the Mixergy post.
In cases like these, the right pages are ranking for the right queries, and Google's doing a good job of recognizing and differentiating categories vs. posts.
Hope that helps!
-
RE: Is there a good website builder that can gain links?
Hi Scott - it depends. You can use Google Search Console's preferred domain setting (https://support.google.com/webmasters/answer/44231?hl=en) to help Google choose between www vs. non-www, but if there are other parameters or versions of the page, you really want a canonical tag or 301.
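For reference, the canonical tag is just a single line in the `<head>` of each duplicate/parameterized version, pointing at the preferred URL (example.com is a placeholder here):

```html
<!-- On http://example.com/page/?sessionid=123 and any other variants: -->
<link rel="canonical" href="http://www.example.com/page/" />
```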
Given the limitations it sounds like GoDaddy is giving you around this stuff, I'd probably suggest moving to a different CMS/host. Better safe than sorry later.
-
RE: Is there a good website builder that can gain links?
Hi Scott - some website builders that might be worth investigating (and will almost definitely be more SEO-friendly than the experience you've described) include:
- http://www.wix.com/
- http://squarespace.com/
- https://www.drupal.org/
- https://wordpress.org/
- https://pagely.com/
I'm partial to Wordpress (and there are lots of good hosting options) because of its flexibility, but there are plenty of benefits to other platforms as well.
-
RE: Spam Score shows No Contact Info even though I have a Contact Page
Agreed - that looks off to me, too. Again, can't fix in this index, but hopefully the next one should rectify that issue.
-
RE: Spam Score shows No Contact Info even though I have a Contact Page
Hi mztobias - I think we just got that flat out wrong. Not sure why our crawler missed your contact page, but clearly it did. Hopefully in the next index, that will be rectified. I don't have the ability to manually edit the score/notation, but once we recrawl the site and update our index, it should be fixed.
Sorry about that!