Which Algorithm Change Hurt the Site? A causation/correlation issue
-
The attached graph is from Google Analytics: about 14 months of organic Google visits correlated with algorithm changes (the update data is from Moz, naturally).
Is there any way to tell from this which update affected the site? For example, #1 or #2 seems to be responsible for the first dip, but #4 seems to fix it, and it broke again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from its release all the way until #7 rolled out?
Sorry if the graph is a little cloak-and-dagger. That's partly because we don't have permission to reveal much about the site's identity, and partly because we were trying to do a kind of double-blind test, separating the data from our own biases.
We can say, though, that the difference between the traffic level at the start and the end of the graph is at least 10,000 visits per day.
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there's just a lot of gray area, and the graph leaves out a mountain of data.
The big issue I see here is potential seasonality, plus not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call those dips, but that sort of ignores the spike. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that simply stopped, was a one-time event, or was seasonal.
Why was there volatility between #7 and #14, and then relative stability after #14? You could call #14 a "drop," but without knowing the timeline it's hard to see how the curve might smooth out over different windows. What it looks like is a period of highly volatile events followed by an evening out.
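That "smoothing in different windows" point is easy to demonstrate. Here's a minimal sketch (the `visits` numbers are invented, not from the graph in question) showing how the same series can look like dips-and-spikes or like a flat line depending on the averaging window:

```python
# Hypothetical daily-visits series: a volatile stretch followed by a stable one.
# These numbers are made up purely for illustration.
visits = [100, 160, 90, 170, 80, 150, 95, 120, 118, 122, 119, 121, 120, 118]

def rolling_mean(series, window):
    """Trailing moving average; returns one value per full window."""
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

# A wider window flattens the early volatility, so whether a point reads as
# a "dip" or a "spike" depends on how much you smooth before looking.
daily = rolling_mean(visits, 1)    # raw: looks wildly volatile
weekly = rolling_mean(visits, 7)   # smoothed: looks fairly level
```

The practical takeaway is the one made above: before labeling any point a drop, decide what window you're judging it against.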
Without knowing the industry, the business, the history, and without segmenting this data, trying to make claims just based on dips and spikes in the graph is pretty dangerous, IMO. This could have virtually nothing to do with the algorithm, in theory.
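If you do want something more rigorous than eyeballing dips, one common starting point is to compare average traffic in a fixed window before and after each update date. Here's a rough sketch, assuming you have daily visits keyed by date (the traffic values and update dates below are invented). A large shift is only suggestive, never proof, for all the seasonality reasons above:

```python
from datetime import date, timedelta

# Hypothetical data: 90 days of organic visits with a temporary 500-visit lift,
# plus two made-up "algorithm update" dates to test against.
traffic = {date(2013, 1, 1) + timedelta(days=i): 10000 + (500 if 30 <= i < 60 else 0)
           for i in range(90)}
updates = [date(2013, 1, 31), date(2013, 3, 2)]

def shift_around(traffic, event, window=14):
    """Mean daily visits in the `window` days before vs. after an event date."""
    before = [v for d, v in traffic.items()
              if event - timedelta(days=window) <= d < event]
    after = [v for d, v in traffic.items()
             if event <= d < event + timedelta(days=window)]
    return sum(before) / len(before), sum(after) / len(after)

for u in updates:
    b, a = shift_around(traffic, u)
    print(u, round(a - b))  # the before/after shift for each update date
```

Varying the `window` parameter matters here for exactly the reasons discussed: a two-week window and an eight-week window can attribute the same change to different updates.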
-
I don't understand how dates would help. Wasn't it clear that the red lines mark the dates of the algorithm updates?
By abstracting the data, the hope was to gain insight into how to read these graphs in relation to updates, not just get help with specific updates, which wouldn't help much the next time we face a traffic-drop problem. It's more a question of how to think than what to think.
Trying to read between the lines: are you saying different algorithm changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 was the first Penguin update, would your response be different than if it was the first Panda update?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but it at least gives you an initial analysis; you'll need to dig deeper to double-check whether that initial analysis is actually relevant.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare the traffic dips to Google algorithm updates/refreshes.
I understand you can't reveal the domain, but I will be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but it would take some sort of mind reader to tackle this...