What happened on September 17 on Google?
-
According to MozCast, and to my own stats, Google had a pretty strong algorithm update on September 17. Personally, I have experienced a drop of about 10% in traffic coming from Google across most of my main e-commerce site, virtualsheetmusic.com.
Anyone know more about that update? Any ideas about what changed?
Thank you in advance for any thoughts!
Best,
Fab.
-
Thank you jStrong, I have just posted something on that thread too. I am glad to know I am not alone! I hope we can figure out what happened and possibly tackle the problem.
Thanks!
-
Had something similar happen to a client of ours. On Sept 17 they lost about 85-90% of their organic traffic across all search engines. I mentioned this in a post I added yesterday.
http://moz.com/community/q/loss-of-85-90-of-organic-traffic-within-the-last-2-weeks
Still trying to figure out exactly what happened, but am also curious to see if anyone else ran into similar issues.
-
Thank you guys for your replies and information.
Peter: I understand what you mean, and I do understand why it is not possible for MozCast to know which verticals are affected by a particular Google update. What you wrote makes perfect sense.
Highland: you may be right about Google Hummingbird, but I see that update was released about 1 month ago, whereas I began having a drop in traffic on September 17th (13 days ago). But we can't exclude it either, since it looks like my long-tail keywords have been mostly affected (see below)...
It is worth noting the following events around the date the drop began:
1. On September 15 our hosting provider had a major power outage which put our site offline for about 5 hours. I don't think Google cares too much about this, since 5 hours is not a huge downtime and nothing similar had happened for at least the past 3 years, but it occurred just 2 days before the drop began.
2. On September 17 (the day the traffic drop began), we updated our website's page rendering engine to improve our page speed by about 20% (this should be a good thing, right?)
Also, I have analyzed the traffic coming to our website from Google, and it looks like the most affected section has been our product pages, which makes me think that long-tail keywords have been hit the hardest.
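The kind of section-by-section comparison described above can be done with a simple script over an analytics export. This is only a minimal sketch with made-up numbers; the function name, the tuple format, and the sample rows are all hypothetical, not anything from virtualsheetmusic.com's actual data:

```python
from collections import defaultdict

def section_drop(rows, split_date):
    """Compare organic visits per site section before and after a given date.

    rows: iterable of (date, landing_url, visits) tuples, e.g. exported
    from an analytics tool. ISO date strings compare correctly as text.
    The section is taken to be the first path segment of the landing URL.
    """
    before = defaultdict(int)
    after = defaultdict(int)
    for date, url, visits in rows:
        # First path segment, e.g. "/product/violin-sonata" -> "product"
        section = url.strip("/").split("/")[0] or "home"
        (before if date < split_date else after)[section] += visits
    report = {}
    for section, b in before.items():
        a = after.get(section, 0)
        report[section] = round((a - b) / b * 100, 1) if b else None
    return report

# Hypothetical sample rows around the 9/17 drop
rows = [
    ("2013-09-10", "/product/violin-sonata", 120),
    ("2013-09-12", "/blog/news", 40),
    ("2013-09-20", "/product/violin-sonata", 60),
    ("2013-09-22", "/blog/news", 38),
]
print(section_drop(rows, "2013-09-17"))
# product pages drop 50%, blog only 5% -> long-tail product queries were hit
```

Running it over real pre/post traffic would show quickly whether the product pages (and therefore long-tail queries) really took the brunt of the drop.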
Any more clues?
Thank you again, I really appreciate your insights and thoughts on all this. And if anyone else has experienced a similar drop in traffic since September 17, please post it here!
-
It's a bit tricky. MozCast (and other trackers like it, to the best of my knowledge) basically look at how rankings change over time. For MozCast, we track two fixed sets (1K and 10K) of keywords every 24 hours, and then measure how the URLs in the top 10 shift. This is tricky for many reasons:
(1) There are a ton of ways to measure this "flux", all of them valid in different ways.
(2) Baseline flux is very high. I estimate that as many as 80% of queries change daily, to some degree. Google is much more dynamic and real-time than most SEOs think.
(3) "Baseline" flux varies wildly across keywords, based on factors like QDF. I wrote a post about just how extreme this can be (http://moz.com/blog/a-week-in-the-life-of-3-keywords).
Ultimately, we try to gauge an average and then look for extreme variations, but the noise-to-signal ratio is extremely high. The reality is that SERPs change all of the time, not just based on the algorithm, but based on changes to the sites themselves. Google also makes more than 500 changes per year, so even "algorithm update" is a tough term to define. We're looking for the big ones.
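The day-over-day comparison described above can be illustrated with a toy flux metric. To be clear, this is only a simplified sketch of the general idea (rank-weighted movement across a fixed keyword's top 10), not MozCast's actual, unpublished formula:

```python
def serp_flux(day1_top10, day2_top10):
    """Rank-weighted flux between two days' top-10 URL lists for one keyword.

    Returns 0.0 when nothing moved and approaches 1.0 when the result
    sets are completely different. Higher positions are weighted more,
    since movement at #1 matters more than movement at #10.
    """
    total = 0.0
    max_total = 0.0
    n = len(day1_top10)
    for rank, url in enumerate(day1_top10):
        if url in day2_top10:
            shift = abs(rank - day2_top10.index(url))
        else:
            shift = n  # dropped out entirely: maximum penalty
        weight = 1.0 / (rank + 1)
        total += weight * shift
        max_total += weight * n
    return total / max_total

yesterday = ["a.com", "b.com", "c.com"]
today     = ["b.com", "a.com", "d.com"]
print(round(serp_flux(yesterday, today), 3))  # -> 0.455
```

Averaging a score like this over a fixed corpus of 1K or 10K keywords each day gives a single daily "temperature"; the hard part, as noted above, is that the baseline is already high and varies wildly by keyword.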
It's important to note, too, that all of the current flux tools are focused solely on organic results and movement of those results (as are most SEOs). We're not looking at how verticals come and go, Knowledge Graph entities, etc. We're actually working on some tools to track these entities more closely. "Hummingbird" is, IMO, going to power these entities and expand them, possibly for months to come.
-
It's possible all these shifts are due to Google Hummingbird, which one person at Google called "the largest rewrite since 2001." This is the month-long rollout they've been talking about.
Still, Hummingbird is more about usability than SEO signals. The biggest shift is in "conversational search" (i.e. "How often has Rand Fishkin shaved his beard off?"). Google is now focused on returning more relevant results for those kinds of searches. That would explain why temps never spiked. It doesn't seem to have affected generic search terms as much, if at all.
-
Thank you guys for your replies and insights. It is my understanding that MozCast draws its graph based on the number of sites affected by a Google update... is that correct? If so, I deduce that the people (or the algorithm) behind MozCast know which sites, and/or how many sites, have been affected by a particular update. If that's the case, and I don't see any different scenario, I assume we could potentially work out whether those affected sites have something in common (are they mostly e-commerce websites? News sites? etc.). That would help us understand the nature of an update, above all the major ones, since we would have more data to crunch.
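The cross-referencing idea above could be sketched very simply: take the list of sites that lost rankings in an update and tally them against a site-category lookup to see whether one vertical dominates. Both data sets here are entirely hypothetical, just to show the shape of the analysis:

```python
from collections import Counter

# Hypothetical: sites that lost rankings in a given update
affected_sites = ["shopA.com", "shopB.com", "newsX.com", "shopC.com", "blogY.com"]

# Hypothetical: a manually maintained site -> vertical lookup
site_category = {
    "shopA.com": "e-commerce",
    "shopB.com": "e-commerce",
    "shopC.com": "e-commerce",
    "newsX.com": "news",
    "blogY.com": "blog",
}

counts = Counter(site_category.get(site, "unknown") for site in affected_sites)
for category, n in counts.most_common():
    print(f"{category}: {n}/{len(affected_sites)} affected sites")
# e-commerce dominating (3/5) would hint the update targets that vertical
```

In practice the hard part is the input data, as Pete explains: rank trackers watch keywords, not sites, so a reliable "affected sites" list is exactly what nobody outside Google has.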
Am I wrong with my brainstorming here? I am eager to know your thoughts on this.
Thank you again.
-
Yeah, Robert's right - with 500+ updates, the task of figuring out which spikes really mean something is very difficult. The pattern of 9/17 on MozCast looks more like a traditional update, with a relatively quiet period around a one-day spike, but I don't have a lot more detail on that particular day.
The update Adam mentioned ("Hummingbird") apparently happened "about a month ago", but seems to be tied to semantic search, Knowledge Graph, etc. Google's statements are pretty vague. It's more likely that it's related to the 8/20-21 spike spotted by multiple tools and webmasters than to the 9/17 spike.
Sorry I can't give you more information. I've seen very little chatter or reports about 9/17, other than what we saw in the tracking data.
-
Earlier today, Google announced an algorithm change that should affect about 90% of search queries. They said this has rolled out over the past month. When more details come out and some people do some more testing, this may have something to do with it.
-
SER had a post a couple of days ago asking just that. I can't say my traffic suffered a lot (in fact, one site seems to have had a small bump in organic traffic), but given that it only got to 86 (100+ seems to indicate major shifts), I'd say it was likely a localized set that got hit. Probably a Panda shift (just a guess, tho).
-
Fabrizio,
I am intrigued whenever I see this question, because it seems we only notice when we feel an effect. We handle multiple sites as an agency, and I don't see any real "9/17" change across the board, nor a noticeable change for even a single site (I looked at five that I know are more likely to move).
As for MozCast, that is another area I find intriguing: I respect those at Moz and their understanding of statistics and the scientific method, yet I also scratch my head from time to time as to whether a given movement has any overall effect on "most" sites.
When you read the "About MozCast" page and they point out the number of algorithmic changes in a year, it is apparent that most won't have an appreciable effect on a given site. Unfortunately, for most of us, a change to any site we own or manage can have dire consequences, so we have to remain vigilant at all times.
I wish I could give you a better answer, good luck,
Robert