What happened on September 17 on Google?
-
According to MozCast and to my own stats, Google had a pretty strong algorithm update on September 17. Personally, I have experienced a drop of about 10% in traffic coming from Google on my main e-commerce site, virtualsheetmusic.com.
Anyone know more about that update? Any ideas about what changed?
Thank you in advance for any thoughts!
Best,
Fab.
-
Thank you, jStrong. I have just posted something on that thread too. I am glad to know I am not alone! I hope we can figure out what happened and possibly tackle the problem.
Thanks!
-
Had something similar happen to a client of ours. On Sept 17 they lost about 85-90% of their organic traffic across all search engines. I mentioned this in a post I added yesterday.
http://moz.com/community/q/loss-of-85-90-of-organic-traffic-within-the-last-2-weeks
Still trying to figure out exactly what happened, but am also curious to see if anyone else ran into similar issues.
-
Thank you guys for your replies and information.
Peter: I understand what you mean, and I see why it is not possible for MozCast to know which verticals are affected by a particular Google update. What you wrote makes perfect sense.
Highland: you may be right about Google Hummingbird, but I see that update was released about a month ago, whereas I began seeing a drop in traffic on September 17th (13 days ago). We can't exclude it either, though, since it looks like my long-tail keywords have been the most affected (see below)...
It is worth noting the following events around the date the drop began:
1. On September 15 our hosting provider had a major power outage which put our site offline for about 5 hours. I don't think Google cares too much about this, since 5 hours is not a huge downtime and it hadn't happened in at least the past 3 years, but it occurred just 2 days before the drop began.
2. On September 17 (the day the traffic drop began), we updated our website's page rendering engine to improve our page speed by about 20% (that should be a good thing, right?).
Also, I have analyzed the traffic coming to our website from Google, and it looks like the most affected section has been our product pages, which makes me think long-tail keywords have been hit hardest.
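For anyone who wants to run a similar check, here is a rough sketch of that kind of before/after analysis (the CSV file name and column names are hypothetical; adjust them to whatever your analytics export actually produces):

```python
# Rough sketch: compare Google organic visits per site section
# before vs. after a suspected update date. Assumes a hypothetical
# CSV export with columns: date, landing_page, visits.
# Use equal-length windows on each side for a fair comparison.
import csv
from collections import defaultdict
from datetime import datetime, date

UPDATE_DATE = date(2013, 9, 17)

def section_of(landing_page):
    # Treat the first path segment as the "section",
    # e.g. /product/foo -> "product". Adjust to your URL scheme.
    parts = landing_page.strip("/").split("/")
    return parts[0] if parts and parts[0] else "(home)"

before, after = defaultdict(int), defaultdict(int)
with open("google_organic_visits.csv", newline="") as f:
    for row in csv.DictReader(f):
        day = datetime.strptime(row["date"], "%Y-%m-%d").date()
        bucket = before if day < UPDATE_DATE else after
        bucket[section_of(row["landing_page"])] += int(row["visits"])

for section in sorted(set(before) | set(after)):
    b, a = before[section], after[section]
    change = (a - b) / b * 100 if b else float("inf")
    print(f"{section}: {b} -> {a} ({change:+.1f}%)")
```

A section whose drop is much steeper than the site-wide average (product pages, in my case) is a good hint about which class of keywords was hit.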
Any more clues?
Thank you again, I really appreciate your insights and thoughts on all this. And, please, if anyone has experienced a similar drop in traffic since September 17, please post it here!
-
It's a bit tricky. MozCast (and, to the best of my knowledge, other trackers like it) basically looks at how rankings change over time. For MozCast, we track two fixed sets (1K and 10K) of keywords every 24 hours, and then measure how the URLs in the top 10 shift. This is tricky for many reasons:
(1) There are a ton of ways to measure this "flux", all of them valid in different ways.
(2) Baseline flux is very high. I estimate that as many as 80% of queries change daily, to some degree. Google is much more dynamic and real-time than most SEOs think.
(3) "Baseline" flux varies wildly across keywords, based on factors like QDF. I wrote a post about just how extreme this can be (http://moz.com/blog/a-week-in-the-life-of-3-keywords).
Ultimately, we try to gauge an average and then look for extreme variations, but the noise-to-signal ratio is extremely high. The reality is that SERPs change all of the time, not just because of the algorithm, but because of changes to the sites themselves. Google also makes more than 500 changes per year, so even "algorithm update" is a tough term to define. We're looking for the big ones.
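To make that concrete, here is a minimal sketch of one way a daily flux score could be computed (an illustration only, not MozCast's actual formula, which isn't public): compare each keyword's top 10 day over day and score how far URLs moved, or whether they dropped out entirely.

```python
# Minimal illustrative flux metric -- NOT MozCast's actual formula.
def serp_flux(yesterday, today, dropout_penalty=10):
    """Score how much a top-10 SERP changed between two days.

    yesterday, today: lists of URLs in rank order (position 1 first).
    A URL that moved k positions contributes k; a URL that dropped
    out of the top 10 contributes dropout_penalty. Normalized to 0-1.
    """
    total = 0
    for old_pos, url in enumerate(yesterday):
        if url in today:
            total += abs(today.index(url) - old_pos)
        else:
            total += dropout_penalty
    worst_case = dropout_penalty * len(yesterday)
    return total / worst_case if worst_case else 0.0

def daily_flux(serps_yesterday, serps_today):
    """serps_*: dicts mapping keyword -> ordered list of top-10 URLs.
    Averaging over a fixed keyword set gives one number per day;
    a spike in that average is what suggests an algorithm update."""
    scores = [serp_flux(serps_yesterday[kw], serps_today[kw])
              for kw in serps_yesterday if kw in serps_today]
    return sum(scores) / len(scores) if scores else 0.0
```

Even with a metric like this, points (2) and (3) above still apply: you have to know each keyword's normal baseline before a one-day spike means anything.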
It's important to note, too, that all of the current flux tools are focused solely on organic results and movement of those results (as are most SEOs). We're not looking at how verticals come and go, Knowledge Graph entities, etc. We're actually working on some tools to track these entities more closely. "Hummingbird" is, IMO, going to power these entities and expand them, possibly for months to come.
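As a toy illustration of what that kind of vertical/entity tracking might look like (the data structure here is hypothetical, not an actual Moz tool):

```python
# Hypothetical sketch: track how often SERP features appear across
# a fixed keyword set. Comparing day over day would reveal, e.g.,
# a jump in Knowledge Graph presence that organic-only flux misses.
from collections import Counter

def feature_share(serp_records):
    """serp_records: list of dicts like
    {"keyword": "...", "features": {"knowledge_graph", "news"}}.
    Returns the fraction of SERPs showing each feature."""
    if not serp_records:
        return {}
    counts = Counter()
    for record in serp_records:
        counts.update(record["features"])
    return {f: c / len(serp_records) for f, c in counts.items()}
```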
-
It's possible all these shifts are due to Google Hummingbird, which one person at Google called "the largest rewrite since 2001." This is the month-long rollout they've been talking about.
Still, Hummingbird is more about usability than SEO signals. The biggest shift is in "conversational search" (i.e. "How often has Rand Fishkin shaved his beard off?"). Google is now focused on returning more relevant results for those kinds of searches. That would explain why MozCast temperatures never spiked. It doesn't seem to have affected generic search terms much, if at all.
-
Thank you guys for your replies and insights. It is my understanding that MozCast draws its graph based on the number of sites affected by a Google update... is that correct? If so, I deduce that the people (or the algorithm) behind MozCast know which sites, and/or how many sites, have been affected by a particular update. If that's the case, and I don't see a different scenario, I assume we could potentially work out whether those affected sites have something in common (are they mostly e-commerce websites? News sites? etc.). That would help us understand the nature of an update, especially the major ones, since we would have more data to crunch.
Am I wrong with my brainstorming here? I am eager to know your thoughts on this.
Thank you again.
-
Yeah, Robert's right - with 500+ updates, the task of figuring out which spikes really mean something is very difficult. The pattern of 9/17 on MozCast looks more like a traditional update, with a relatively quiet period around a one-day spike, but I don't have a lot more detail on that particular day.
The update Adam mentioned ("Hummingbird") apparently happened "about a month ago," but seems to be tied to semantic search, Knowledge Graph, etc. Google's statements are pretty vague. It's more likely that it's related to the 8/20-21 spike spotted by multiple tools and webmasters than to the 9/17 spike.
Sorry I can't give you more information. I've seen very little chatter or reports about 9/17, other than what we saw in the tracking data.
-
Earlier today, Google announced an algorithm change that should affect about 90% of search queries. They said it has rolled out over the past month. Once more details come out and people do more testing, this may turn out to have something to do with it.
-
SER had a post a couple of days ago asking just that. I can't say my traffic suffered a lot (in fact, one site seems to have had a small bump in organic traffic), but given that the MozCast temperature only got to 86 (100+ seems to indicate major shifts), I'd say it was likely a localized set of sites that got hit. Probably a Panda shift (just a guess, though).
-
Fabrizio,
I am intrigued whenever I see this question, because it seems we only notice a change when we feel its effect. As an agency we handle multiple sites, and I don't see any real "9/17" change across the board, nor any noticeable change for even a single site (I looked at five that I know are more likely to move).
MozCast is another area I find intriguing: I respect the people at Moz and their understanding of statistics and the scientific method, but I also scratch my head from time to time as to whether a given movement has any overall effect on "most" sites.
When you read the "About MozCast" page and see the number of algorithmic changes Google makes in a year, it is apparent that most won't have an appreciable effect on a given site. Unfortunately, for most of us, a change to any site we own or manage can have dire consequences, so we always have to be vigilant.
I wish I could give you a better answer, good luck,
Robert