Which Algorithm Change Hurt the Site? A causation/correlation issue
-
The attached graph is from Google Analytics: roughly 14 months of organic Google visits correlated with algorithm changes (the update dates come from Moz, naturally).
Is there any way to tell from this which update affected the site? For example, #1 or #2 seems to be responsible for the first dip, but #4 seems to fix it and things break again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from when it rolled out all the way through to the release of #7?
Sorry if the graph is a little cloak-and-dagger; that is partly because we don't have permission to reveal much about the site's identity, and partly because we were trying to do a kind of double-blind, separating the data from our own biases.
We can say, though, that the difference between the level at the start and the end of the graph is at least 10,000 visits per day.
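For reference, here is a minimal sketch of how a graph like this could be reproduced from a daily Google Analytics export plus a list of update dates. The file name, column names, and dates below are placeholders, not the actual data:

```python
# Minimal sketch: overlay algorithm-update dates on a daily organic-traffic export.
# The file name, column names, and update dates below are placeholders.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical GA export: one row per day with an organic-sessions count
traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])
traffic = traffic.set_index("date").sort_index()

# Hypothetical algorithm-update dates (e.g. pulled from Moz's change history)
updates = pd.to_datetime(["2013-05-22", "2013-07-18", "2013-10-04"])

ax = traffic["sessions"].plot(figsize=(12, 4), title="Daily organic sessions")
for d in updates:
    ax.axvline(d, color="red", linestyle="--", alpha=0.7)  # one red line per update
plt.show()
```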
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there are just a lot of gray areas, and the graph leaves out a mountain of data.
The big issue I see here is potential seasonality, plus not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call these dips, but that sort of ignores the spike. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that simply stopped, was a one-time event, or was seasonal.
Why was there volatility between #7 and #14 and then relative stability after #14? You could call #14 a "drop", but without knowing the timeline, it's hard to see how the curve might smooth out over different windows. What it looks like is a period of highly volatile events followed by an evening out.
Without knowing the industry, the business, the history, and without segmenting this data, trying to make claims just based on dips and spikes in the graph is pretty dangerous, IMO. This could have virtually nothing to do with the algorithm, in theory.
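A rough sketch of one way to pressure-test a suspected update, assuming the hypothetical `traffic` series and `updates` list from the sketch above: compare a window before each update against the window after it, and against the same calendar window a year earlier as a crude seasonality check.

```python
# Rough sketch (assumptions throughout): pre/post window comparison per update date,
# plus the same window a year earlier as a crude seasonality check.
# Reuses the hypothetical `traffic` DataFrame and `updates` dates from the sketch above.
import pandas as pd

def window_means(series: pd.Series, date: pd.Timestamp, days: int = 28) -> dict:
    before = series.loc[date - pd.Timedelta(days=days): date - pd.Timedelta(days=1)]
    after = series.loc[date: date + pd.Timedelta(days=days - 1)]
    # Same calendar window one year earlier (ignores leap years and weekday alignment)
    last_year = series.loc[date - pd.Timedelta(days=365): date - pd.Timedelta(days=365 - days + 1)]
    return {
        "before_mean": round(before.mean(), 1),
        "after_mean": round(after.mean(), 1),
        "same_window_last_year_mean": round(last_year.mean(), 1),
    }

for d in updates:
    print(d.date(), window_means(traffic["sessions"], d))
```

If the before/after gap is large but the year-earlier window shows the same pattern, seasonality is a more likely explanation than the update.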
-
I don't understand how dates would help. Was it not clear that the red lines mark the dates of the algorithm updates?
By abstracting the data, the hope was to gain insight into how to read these graphs in relation to updates generally, rather than just getting help with specific updates, which wouldn't help much the next time we have to deal with a traffic-drop problem. It's more a question of how to think rather than what to think.
Trying to read between the lines: are you saying different algorithm changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 was the first Penguin change, would your response be different than if it was the first Panda change?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but you can at least get an initial analysis; you'll then need to dig deeper to double-check whether that initial analysis is actually relevant or not.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare traffic dips to Google algorithm updates/refreshes.
I understand you can't reveal the domain, but I'll be shocked if anybody here can tell you anything without further information. This place is full of brilliant minds, but this one would take some sort of mind reader to tackle...