Why Even Publish Dates for the Next OSE Data Set?
-
I just noticed on the Linkscape calendar that the update that was supposed to happen yesterday has been pushed back to the 6th.
It would be better not to know at all. Is that a possibility? Or just use the calendar internally; I don't care. Credibility in any announcement or claim erodes once you repeatedly don't do what you say you're going to do. Just a thought.
-
No worries, Brent. I don't think it's an attack at all. Feel free to not view the calendar.
-
lol, probably freshdesk after what they said about them on Twitter
-
Actually, a DDoS attack took Zendesk down today; it wasn't us.
-
Did you guys actually take it down now? Or is it just broken?
Oops! Google Chrome could not connect to seomoz.zendesk.com
Suggestions:
- Access a cached copy of seomoz.zendesk.com/entries/345964-linkscape-update-schedule
- Go to zendesk.com
- Try reloading: seomoz.zendesk.com/entries/345964-linkscape-update-schedule
-
Thanks Jen.
I'm just not sold. The only reason I still have a subscription is to hit refresh out of amusement. It's sad, really.
So now I will put much less weight on the dates in the calendar. Would a 5-10 day window surrounding the "next update" date be wise?
Sorry if this seems like an attack; I realize the service has many hands on the project and you're trying to speak on behalf of many. I'm trying my best not to shoot the messenger here.
-
Hi Brent!
-
Hey Brent,
I totally understand where you're coming from and believe me we're as frustrated by the date slipping as our customers are. We set a date to help people plan and we like to be open and transparent about the index and when it's coming out. It may seem like a good idea to not tell people when we've missed a date, but honestly that wouldn't be very TAGFEE of us.
Also, having a date on the calendar helps people remember when indexes have gone out in the past. It's most definitely not perfect, but we're working very hard to get those indexes out much faster than they currently are. We'd rather have people know what's going on than wonder.
Hope that helps explain a bit!
Thanks,
Jen
-
So we have a rough idea when to expect the next Linkscape Update. This isn't an exact date, but more of an estimate or goal.
Related Questions
-
Do we need rel="prev" and rel="next" if we have a rel="canonical" for the first page of a series
Despite having a canonical on page 1 of a series of paginated pages for different topics, Google is indexing several, sometimes many, pages in each topic. This is showing up as duplicate page title issues in both Moz and Screaming Frog. Ideally, Google would only index the first page in the series. Do we need to use rel="prev" etc. rather than a canonical on page 1? How can we make sure Google crawls but doesn't index the rest of the series?
Moz Pro | | hjsand1 -
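For readers weighing the same options, below is a minimal sketch of the head markup the question contrasts, generated per page of a series. The example.com URLs and the ?page= scheme are placeholders, and whether Google honors any given combination of these hints for a particular site isn't something this thread can settle.

```python
def page_url(base_url: str, page: int) -> str:
    """Placeholder URL scheme; adjust to however your pagination URLs are actually built."""
    return base_url if page == 1 else f"{base_url}?page={page}"

def pagination_head_tags(base_url: str, page: int, last_page: int) -> str:
    """Head tags for page `page` of a paginated series: a self-referencing
    canonical on page 1, rel prev/next on later pages, and an optional
    noindex,follow robots meta tag for 'crawl it, but keep it out of the index'."""
    tags = []
    if page == 1:
        tags.append(f'<link rel="canonical" href="{page_url(base_url, 1)}" />')
    else:
        tags.append(f'<link rel="prev" href="{page_url(base_url, page - 1)}" />')
        # Keeps the page crawlable while asking for it to stay out of the index.
        tags.append('<meta name="robots" content="noindex, follow" />')
    if page < last_page:
        tags.append(f'<link rel="next" href="{page_url(base_url, page + 1)}" />')
    return "\n".join(tags)

print(pagination_head_tags("https://example.com/topic/", 2, 5))
```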
Moz campaign works around my robots.txt settings
My robots.txt file looks like this: User-agent: * Disallow: /*? Disallow: /search So it should block (deindex) all dynamic URLs. If I check this URL in Google: site:http://www.webdesign.org/search/page-1.html?author=47 Google tells me: A description for this result is not available because of this site's robots.txt – learn more. So far so good. Now, I ran a Moz SEO campaign and I got a bunch of duplicate page content errors. One of the links is this one: http://www.webdesign.org/search/page-1.html?author=47 (the same one I tested in Google, which told me the page is blocked by robots.txt, which is what I want). So it makes me think that Moz campaigns check files regardless of what robots.txt says? It's my understanding that User-agent: * should forbid Rogerbot from crawling as well. Am I missing something?
Moz Pro | | VinceWicks0 -
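As a quick aid for checking which URLs the two quoted rules are meant to cover, here is a rough sketch that mimics the common reading of Disallow: /*? (any URL carrying a query string) and Disallow: /search (any path starting with /search). It says nothing about how Rogerbot actually evaluates the file, which only Moz can confirm.

```python
from urllib.parse import urlparse

def blocked_by_quoted_rules(url: str) -> bool:
    """Rough check against the two rules quoted in the question.
    Disallow: /*?     -> treated here as: the URL carries a query string
    Disallow: /search -> treated here as: the path starts with /search
    Real robots.txt matching (and any given crawler's behavior) may differ."""
    parsed = urlparse(url)
    return bool(parsed.query) or parsed.path.startswith("/search")

# The URL from the question matches both rules under this reading.
print(blocked_by_quoted_rules("http://www.webdesign.org/search/page-1.html?author=47"))  # True
```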
Ideal Campaign Set Up For Multiple Landing Pages Post Panda
I have a local floor cleaning business in the UK. I have various discrete services, e.g. carpet cleaning and stone restoration and polishing (a page for each type of stone), that are delivered in distinct geographical service areas. Each page is set up to target a "town/service" keyword. In each geographical location I have different competitors. Should I set up individual campaigns for each landing page, or is one site-wide campaign the way to go? Many thanks, David Allen
Moz Pro | | davidallen3650 -
Is Opensiteexplorer.org missing a lot of backlink data?
I was checking a few of my clients' backlinks after they recently got hit by the "Penguin" update, to try and remove some of the potentially spammy links. I ran reports in both opensiteexplorer.org and Majestic SEO, and Majestic SEO brings back a ton more links, and these are sites that don't even have the max 10,000 backlinks that OSE should be bringing back. Does OSE bring back reliable backlink data? I'm starting to wonder.
Moz Pro | | RonMedlin0 -
OSE Advanced Reports best practices
I'm curious how people use the OpenSiteExplorer Advanced Reports tab. It seems very powerful. What do you use it for? In particular, I see that it has choices for 'same C block' and 'different C block'. Those seem useful to find C blocks that my competitors have links from that I do not, but I'm not totally clear on how to construct the query. Any help or best practices would be appreciated. Thanks!
Moz Pro | | scanlin0 -
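For anyone unsure what the "same C block / different C block" filters refer to: "C block" is common SEO shorthand for the first three octets of an IPv4 address (a /24 range), so links from hosts in the same block are often assumed to be related. A minimal sketch of grouping linking IPs that way (the sample addresses are made up):

```python
from collections import defaultdict

def c_block(ip: str) -> str:
    """Return the first three octets of an IPv4 address, e.g. '203.0.113'."""
    return ".".join(ip.split(".")[:3])

# Made-up sample data standing in for the IPs of pages linking to a competitor.
linking_ips = ["203.0.113.10", "203.0.113.87", "198.51.100.5"]

groups = defaultdict(list)
for ip in linking_ips:
    groups[c_block(ip)].append(ip)

for block, ips in groups.items():
    print(block, ips)
# 203.0.113 ['203.0.113.10', '203.0.113.87']
# 198.51.100 ['198.51.100.5']
```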
What is the quickest way to get OSE data for many URLs all at once?
I have over 400 URLs in a spreadsheet and I would like to get Open Site Explorer data (domain/page authority, trust, etc.) for each URL. Would I use the Linkscape API to do this quickly (i.e., not manually entering every single site into OSE)? Or is there something in OSE or a tool I am overlooking? And whatever the best process is, can you give a brief overview? Thanks!! -Dan
Moz Pro | | evolvingSEO0 -
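If the API route is the right one, the sketch below loops a CSV of URLs through the Linkscape (Mozscape) URL-metrics endpoint. The endpoint path and the AccessID/Expires/Signature signing scheme reflect the API documentation of the time and should be verified against the current docs; the credentials, the Cols value, and the urls.csv filename are placeholders.

```python
import base64, csv, hashlib, hmac, time, urllib.parse, urllib.request

ACCESS_ID = "member-xxxxxxxx"   # placeholder; use the credentials from your Moz API account
SECRET_KEY = "your-secret-key"  # placeholder
COLS = "0"                      # placeholder; sum the bit flags for the fields you want (see the API docs)

def signed_params() -> dict:
    """Build the signed query parameters the API expects (per the docs of the time)."""
    expires = int(time.time()) + 300
    to_sign = f"{ACCESS_ID}\n{expires}".encode()
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), to_sign, hashlib.sha1).digest()).decode()
    return {"AccessID": ACCESS_ID, "Expires": expires, "Signature": signature, "Cols": COLS}

def url_metrics(target_url: str) -> str:
    """Fetch the raw metrics response for one URL."""
    endpoint = ("http://lsapi.seomoz.com/linkscape/url-metrics/"
                + urllib.parse.quote_plus(target_url))
    query = urllib.parse.urlencode(signed_params())
    with urllib.request.urlopen(f"{endpoint}?{query}") as resp:
        return resp.read().decode()

with open("urls.csv", newline="") as f:  # placeholder filename: one URL per row
    for row in csv.reader(f):
        print(row[0], url_metrics(row[0]))
        time.sleep(10)  # the free tier was rate limited to roughly one request every 10 seconds
```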
Anyone having issues with OSE?
Every time I try to access the tool I get the following error: "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects."
Moz Pro | | Anest0 -
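There's not much users can do about the error itself, but if you want to see where a redirect loop occurs, a quick sketch like the one below prints each hop (requests is a third-party library; the target URL is just an example, so point it at whichever page is failing for you):

```python
from urllib.parse import urljoin
import requests  # third-party: pip install requests

def redirect_chain(url: str, max_hops: int = 15) -> None:
    """Follow redirects one hop at a time, printing each status and URL, and flag loops."""
    seen = set()
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
        if resp.status_code not in (301, 302, 303, 307, 308):
            return
        next_url = urljoin(url, resp.headers.get("Location", ""))
        if next_url in seen or next_url == url:
            print("Redirect loop back to", next_url)
            return
        seen.add(url)
        url = next_url
    print("Stopped after", max_hops, "hops")

redirect_chain("https://www.opensiteexplorer.org/")  # example target; use the URL that fails for you
```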
CSV sheets on new OSE not received by email
Is there still a problem with the new OSE, anyone? I am not receiving the CSV reports I am ordering via OSE.
Moz Pro | | blocker04080