Which Algorithm Change Hurt the Site? A causation/correlation issue
-
The attached graph is from Google Analytics: a correlation of about 14 months of organic Google visits with algorithm changes (the update data is from Moz, naturally).
Is there any way to tell from this which update affected the site? For example, #1 or #2 seems to be responsible for the first dip, but #4 seems to fix it, and it broke again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from when it was released all the way to when #7 was released?
Sorry if the graph is a little cloak-and-dagger. That is partly because we don't have permission to reveal much about the site's identity, and partly because we were trying to do a kind of double-blind, separating the data from our biases.
We can say, though, that the difference between the level at the start and the end of the graph is at least 10,000 visits per day.
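The exercise described above, lining up dips against update dates, can be sketched as a simple window match. To be clear, the update numbers and dates below are entirely made up for illustration; they are not the real updates or dates from the graph.

```python
# Hypothetical sketch: flag which algorithm updates rolled out shortly
# before each observed traffic dip. All dates here are invented examples.
from datetime import date

# Update number -> rollout date (illustrative only)
algo_updates = {1: date(2013, 2, 1), 2: date(2013, 2, 20), 4: date(2013, 5, 10)}

# Dates where the traffic curve starts dropping (illustrative only)
dip_starts = [date(2013, 2, 5), date(2013, 8, 1)]

def nearby_updates(dip, updates, window_days=14):
    """Return update numbers whose rollout fell within `window_days` before the dip."""
    return [n for n, d in updates.items() if 0 <= (dip - d).days <= window_days]

for dip in dip_starts:
    print(dip, nearby_updates(dip, algo_updates))
```

Note this only shows temporal proximity, which is exactly the correlation-versus-causation problem: an update landing near a dip proves nothing by itself.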
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasion, it's pretty clear (Penguin is a good example, I've found), but in most cases there are just a lot of gray areas, and the graph leaves out a mountain of data.
The big issues I see here are potential seasonality and not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call these dips, but that sort of ignores the spike. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that simply stopped, was a one-time event, or was seasonal.
Why was there volatility between #7 and #14, and then relative stability after #14? You could call #14 a "drop," but without knowing the timeline, it's hard to see how the curve might smooth out over different windows. What it looks like is a period of highly volatile events followed by an evening out.
Without knowing the industry, the business, the history, and without segmenting this data, trying to make claims just based on dips and spikes in the graph is pretty dangerous, IMO. This could have virtually nothing to do with the algorithm, in theory.
-
I don't understand how dates would help. Wasn't it clear that the red lines are the dates of algo updates?
By abstracting the data, the hope was to gain insight into how to read the graphs in relation to updates, and not just get help related to specific updates, which wouldn't help much the next time we have to deal with a traffic-drop problem. More a question of how to think rather than what to think.
Trying to read between the lines: are you saying different algo changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 was the first Penguin change, would your response be different than if it was the first Panda change?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but you can at least get an initial analysis; you'll need to go deeper to double-check whether that initial analysis is actually relevant.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare traffic dips to Google algo updates/refreshes.
I understand you can't reveal the domain, but I will be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but this would take some sort of mind reader to tackle...