Index Update Release
-
Hello. Is the index update still on course to be released on Friday?
-
Hi Trevor!
As of right now, the next index is still scheduled for release on Friday. If anything pushes that date back, we will announce the delay on the Mozscape API Updates page as soon as it happens, so any change to the schedule will be reflected there.
I hope that helps!
-
There hasn't been any announcement to suggest otherwise... yet. But who knows? If they were to reschedule it, they may not even announce it until Friday, or even Saturday.
-
Related Questions
-
Location Data Batch Updates via the MOZ API
According to the Moz API documentation, I can update multiple locations in a batch in order to create or update their location data (currently 130 locations). I have successfully created a batch and the API returned the $id, as expected. However, the docs don't make clear how the locations I want to update are supposed to be sent back to the API. I was expecting an upload of a CSV file or JSON data, not a query parameter as noted in the docs. When I include a properly formatted JSON file as a binary upload, the response still complains about a missing locations parameter. See error here: { "status":"MISSING_PARAMETER", "message":"locations missing. ", "response": { "locations":"MISSING_PARAMETER" } } https://moz.com/developers/docs/local/api#_api_batch__id-PATCH
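Not an official fix, but a minimal sketch of the workaround the error message hints at: sending the JSON-encoded array as a locations request parameter rather than as a raw binary body. The host URL, auth header, and encoding below are assumptions for illustration only - check the Local API docs linked above for the real values.

```python
import json
import requests

# Illustrative placeholders only - substitute the real host, batch id, and
# token from the Moz Local API docs; none of these values are official.
BATCH_ID = "YOUR_BATCH_ID"
BATCH_URL = f"https://example-moz-local-host/api/batch/{BATCH_ID}"
TOKEN = "YOUR_API_TOKEN"

# Load the 130 locations prepared earlier (a list of location objects in the
# shape the documentation describes).
with open("locations.json", "r", encoding="utf-8") as fh:
    locations = json.load(fh)

# The "locations missing" error suggests the endpoint wants a `locations`
# parameter, so pass the JSON-encoded array as that parameter instead of
# uploading the file as a binary body.
resp = requests.patch(
    BATCH_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},  # auth scheme is an assumption
    data={"locations": json.dumps(locations)},     # form-encoded `locations` param
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```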
API | | yuca.pro1 -
Sitemaps and Indexed Pages
Hi guys, I created an XML sitemap and submitted it for my client last month. Now the developer of the site has also been messing around with a few things. I've noticed on my Moz site crawl that indexed pages have dropped significantly. Before I put my foot in it, I need to figure out whether submitting the sitemap caused this... can a sitemap reduce the number of pages indexed? Thanks, David.
API | | Slumberjac0 -
Mozscape API Updates (Non-updates!) - becoming a joke!
This is the 3rd month in succession that the Mozscape index has been delayed. My clients and I are losing patience with this, as I am sure many others must be. Just what do you suppose we tell clients waiting for that data? We have incomplete and sometimes skewed metrics to report on, delays which then get delayed further, with nothing but the usual 'we are working on it' and 'bear with us'. It's becoming obvious you fudged the index update back in January (see discussion here, with some kind of explanation finally from Rand: https://moz.com/community/q/is-everybody-seeing-da-pa-drops-after-last-moz-api-update), and it seems you have been fumbling around ever since trying to fix it, with data all over the place, shifting DA scores and missing links from campaign data. Your developers should be working around the clock to fix this, because this is a big part of what you're selling in your service, and as SEOs and marketers we are relying on that data for client retention and satisfaction. Will you refund us all if we lose clients over this?! I don't think so! With reports already sent out at the beginning of the month with incomplete data, I told clients the index would refresh on April 10th, as stated on the API updates page, only to see it fudged again on the day of release, with the index being rolled back to the previous one. So again, I have to tell clients there will be more delays, with the uncertainty of whether it will even get refreshed when you say it will. It's becoming a joke... really!
API | | GregDixson2 -
/index.php causing a few issues
Hey Mozzers, Our site uses Magento. Pages within the site (not categories or products) are set to display as www.domain.co.uk/page-url/ and the .htaccess is set to redirect versions such as www.domain.co.uk/page-url to the URL ending in a /. However, in Google Analytics and in the Moz landing page tracker these URLs are being reported as www.domain.co.uk/page-url/index.php. Visiting www.domain.co.uk/page-url/index.php displays a 404. I know that by default, when a request points at a directory, the server automatically finds and serves the index file, so I understand why this is happening to some degree. However, the /index.php version does not actually exist when visited manually. This poses a problem when trying to view landing page information in Moz Pro: I have 20 keywords being tracked against www.domain.co.uk/page-url/, but because Moz records the page as www.domain.co.uk/page-url/index.php, the keywords are treated as unrelated and no information shows for the page. Any ideas?
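Not a fix for the tracking itself, but if the immediate pain is matching keyword data to the canonical /page-url/ pages, a small normalization pass over an exported report can roll the /index.php rows up into their trailing-slash versions. The file and column names below are hypothetical - adjust them to whatever your Moz Pro or Analytics export actually uses.

```python
import csv
from collections import defaultdict

def canonicalize(url: str) -> str:
    """Map '.../page-url/index.php' and '.../page-url' to the canonical '.../page-url/'."""
    if url.endswith("/index.php"):
        url = url[: -len("index.php")]
    if not url.endswith("/"):
        url += "/"
    return url

# Hypothetical export with 'landing_page' and 'sessions' columns.
totals = defaultdict(int)
with open("landing_pages.csv", newline="", encoding="utf-8") as fh:
    for row in csv.DictReader(fh):
        totals[canonicalize(row["landing_page"])] += int(row["sessions"])

# Print the combined totals, largest first, with /index.php variants merged
# into their trailing-slash counterparts.
for page, sessions in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(page, sessions)
```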
API | | ATP0 -
Why did the April Index Raise DA?
The DA of all of our websites rose dramatically, including the competitors we track. Any idea why this may have happened across the board?
API | | Blue_Compass0 -
March 2nd Mozscape Index Update is Live!
We are excited to announce that our March 2nd Index Update is complete and it is looking great! We grew the number of subdomains and root domains indexed, and our correlations are looking solid across the board. Run, don't walk, to your nearest computer and check out the sweet new data! Here is a look at the finer details:
141,626,596,068 (141 billion) URLs
1,685,594,701 (1 billion) subdomains
193,444,117 (193 million) root domains
1,124,641,982,250 (1.1 trillion) links
Followed vs. nofollowed links: 3.09% of all links found were nofollowed; 62.41% of nofollowed links are internal and 37.59% are external
Rel canonical: 27.46% of all pages employ the rel=canonical tag
The average page has 92 links on it: 74 internal links and 18 external links on average
Thanks again! PS - For any questions about DA/PA fluctuations (or non-fluctuations), check out this Q&A thread from Rand: https://moz.com/community/q/da-pa-fluctuations-how-to-interpret-apply-understand-these-ml-based-scores
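If you prefer the nofollow split as absolute counts rather than percentages, here is a rough back-of-the-envelope calculation from the figures above (approximate, since the reported percentages are rounded):

```python
total_links = 1_124_641_982_250  # 1.1 trillion links in the March 2nd index

nofollowed = total_links * 0.0309          # 3.09% of all links are nofollowed
internal_nofollow = nofollowed * 0.6241    # 62.41% of nofollowed links are internal
external_nofollow = nofollowed * 0.3759    # 37.59% are external

print(f"nofollowed links:      ~{nofollowed:,.0f}")
print(f"  internal nofollowed: ~{internal_nofollow:,.0f}")
print(f"  external nofollowed: ~{external_nofollow:,.0f}")

# The per-page averages above are internally consistent: 74 internal + 18 external = 92.
assert 74 + 18 == 92
```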
API | | IanWatson7 -
September's Mozscape Update Broke; We're Building a New Index
Hey gang, I hate to write to you all again with more bad news, but such is life. Our big data team produced an index this week but, upon analysis, found that our crawlers had encountered a massive number of non-200 URLs, which meant this index was not only smaller, but also weirdly biased. PA and DA scores were way off, coverage of the right URLs went haywire, and the metrics we use to gauge quality told us this index simply was not good enough to launch. Thus, we're in the process of rebuilding an index as fast as possible, but this takes, at minimum, 19-20 days, and may take as long as 30 days.

This sucks. There's no excuse. We need to do better, and we owe all of you and all of the folks who use Mozscape better, more reliable updates. I'm embarrassed and so is the team. We all want to deliver the best product, but we continue to find problems we didn't account for and have to go back and build systems into our software to look for them.

In the spirit of transparency (not as an excuse), the problem appears to be a large number of new subdomains that found their way into our crawlers and exposed us to issues fetching robots.txt files that timed out and stalled our crawlers. In addition, some new portions of the link graph we crawled exposed us to websites/pages that we need to find ways to exclude, as these abuse our metrics for prioritizing crawls (aka PageRank, much like Google, but they're obviously much more sophisticated and experienced with this) and bias us toward junky stuff, which keeps us from getting to the good stuff we need. We have dozens of ideas to fix this, and we've managed to fix problems like this in the past (prior issues like .cn domains overwhelming our index, link wheels, and webspam holes plagued us and have been addressed, but every couple of indices it seems we face a new challenge like this).

Our biggest issue is one of monitoring and processing times. We don't see what's in a web index until it's finished processing, which means we don't know if we're building a good index until it's done. It's a lot of work to rebuild the processing system so there can be visibility at checkpoints, but that appears to be necessary right now. Unfortunately, it takes time away from building the new, realtime version of our index (which is what we really want to finish and launch!). Such is the frustration of trying to tweak an old system while simultaneously working on a new, better one. Tradeoffs have to be made.

For now, we're prioritizing fixing the old Mozscape system, getting a new index out as soon as possible, and then working to improve visibility and our crawl rules. I'm happy to answer any and all questions, and you have my deep, regretful apologies for once again letting you down. We will continue to do everything in our power to improve and fix these ongoing problems.
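The robots.txt timeouts Rand describes are a classic way for a crawl queue to stall. Purely as an illustration (this is not Moz's code), a fetcher can guard against it by requesting robots.txt with a hard timeout and treating a failure as a signal to skip or deprioritize the domain rather than block the queue:

```python
import urllib.robotparser
from urllib.request import urlopen

def fetch_robots(domain: str, timeout: float = 5.0):
    """Fetch and parse a domain's robots.txt with a hard timeout.

    Returns a RobotFileParser on success, or None if the fetch timed out or
    failed, so the caller can deprioritize the domain instead of stalling.
    """
    url = f"http://{domain}/robots.txt"
    parser = urllib.robotparser.RobotFileParser(url)
    try:
        with urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
        parser.parse(body.splitlines())
        return parser
    except OSError:  # covers timeouts, DNS failures, connection and HTTP errors
        return None

rules = fetch_robots("example.com")
if rules is None:
    print("robots.txt unavailable - deprioritize this domain, don't block the queue")
elif rules.can_fetch("MyCrawler", "http://example.com/some/page"):
    print("allowed to crawl")
```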
API | | randfish11 -
Have Questions about the Jan. 27th Mozscape Index Update? Get Answers Here!
Howdy y'all. I wanted to give a brief update (not quite worthy of a blog post, but more than would fit in a tweet) about the latest Mozscape index update. On January 27th, we released our largest web index ever, with 285 billion unique URLs and 1.25 trillion links. Our previous index was also a record at 217 billion pages, but this one is another 30% bigger. That's all good news - it means more of the links you're seeking are likely to be in this index, and link counts, on average, will go up. There are two oddities about this index, however, that I should share:

The first is that we broke one particular view of data - 301'ing links sorted by Page Authority doesn't work in this index, so we've defaulted to sorting 301s by Domain Authority. That should be fixed in the next index, and from our analytics, it doesn't appear to be a hugely popular view, so it shouldn't affect many folks (you can always export to CSV and re-sort by PA in Excel if you need to - note that if you have more than 10K links, OSE will only export the first 10K, so if you need more data, check out the API).

The second is that we crawled a massively more diverse set of root domains than ever before. Whereas our previous index topped out at 192 million root domains, this latest one has 362 million (almost 1.9x as many unique, new domains we haven't crawled before). This means that DA and PA scores may fluctuate more than usual, as link diversity is a big part of those calculations and we've crawled a much larger swath of the deep, dark corners of the web (and non-US/non-.com domains, too). It also means that, for many of the big, more important sites on the web, we are crawling a little less deeply than we have in the past (the index grew by ~31% while the root domains grew by ~88%). Often, those deep pages on large sites do more internal than external linking, so this might not have a big impact, but it could depend on your field/niche and where your links come from.

As always, my best suggestion is to compare your link data against your competition - that's a great way to see how relative changes are occurring and whether, generally speaking, you're losing or gaining ground in your field. If you have specific questions, feel free to leave them and I'll do my best to answer in a timely fashion. Thanks much!

p.s. You can always find information about our index updates here.
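For anyone hitting the broken PA sort, here is a minimal sketch of re-sorting an OSE CSV export by Page Authority outside of Excel. The "Page Authority" column header and file names are assumptions and may be named differently in your export.

```python
import csv

# Re-sort an Open Site Explorer CSV export by Page Authority, descending.
with open("ose_export.csv", newline="", encoding="utf-8") as fh:
    rows = list(csv.DictReader(fh))

# "Page Authority" is the assumed column header; change it if your export differs.
rows.sort(key=lambda r: float(r.get("Page Authority") or 0), reverse=True)

if rows:
    with open("ose_export_sorted_by_pa.csv", "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```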
API | | randfish8