Which Algorithm Change Hurt the Site? A causation/correlation issue
-
The attached graph is from Google Analytics: about 14 months of organic Google visits correlated with algorithm changes (the update data is from Moz, naturally).
Is there any way to tell from this which updates affected the site? For example, #1 or #2 seems to be responsible for the first dip, #4 seems to fix it, and things break again around #6. Or is the rise between #4 and #7 the anomaly, with #1 or #2 actually causing a slide from its release all the way until #7 was released?
Sorry if the graph is a little cloak-and-dagger; that is partly because we don't have permission to reveal much about the site's identity, and partly because we were trying to do a kind of double-blind test, separating the data from our biases.
We can say, though, that the difference between the traffic level at the start and the end of the graph is at least 10,000 visits per day.
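For reference, an overlay like ours can be sketched in a few lines of Python. This is only an illustration of the kind of chart we mean: the file name, column names, and the example dates below are placeholders, not our real data.

```python
# Minimal sketch: overlay algorithm update dates on a daily organic-traffic
# series. "organic_sessions.csv" and the dates below are placeholders.
import pandas as pd
import matplotlib.pyplot as plt

traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])
traffic = traffic.set_index("date").sort_index()

# Hypothetical update dates, e.g. taken from a published algorithm history
updates = pd.to_datetime(["2013-05-22", "2013-08-20", "2013-10-04"])

fig, ax = plt.subplots(figsize=(12, 4))
ax.plot(traffic.index, traffic["organic_sessions"])
for i, d in enumerate(updates, start=1):
    ax.axvline(d, color="red", linestyle="--")  # one red line per update
    ax.annotate(f"#{i}", (d, traffic["organic_sessions"].max()))
ax.set_ylabel("Organic visits per day")
plt.show()
```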
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there are just a lot of gray areas, and the graph leaves out a mountain of data.
The big issues I see here are seasonality and not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call these dips, but that sort of ignores the spike. Is the dip the anomaly, or is the spike the anomaly? What drove up traffic between #4 and #6? Maybe that simply stopped, was a one-time event, or was seasonal.
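One hedged way to sanity-check the seasonality question is a seasonal decomposition. This sketch reuses the hypothetical `traffic` DataFrame from the question above; the weekly period is just an assumption, so swap in a yearly period if you have enough history.

```python
# Sketch: split the series into trend / seasonal / residual components so a
# "spike" can be judged against the underlying trend. period=7 assumes a
# weekly cycle; use period=365 for yearly seasonality if the data allows.
import matplotlib.pyplot as plt
from statsmodels.tsa.seasonal import seasonal_decompose

decomposition = seasonal_decompose(
    traffic["organic_sessions"], model="additive", period=7
)
decomposition.plot()
plt.show()
```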
Why was there volatility between #7 and #14 and then relative stability after #14? You could call #14 a "drop," but without knowing the timeline it's hard to see how the curve might smooth out over different windows. What it looks like is a period of highly volatile events followed by an evening out.
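To make the "different windows" point concrete, here is a quick sketch (again borrowing the hypothetical `traffic` DataFrame from the earlier example): plot rolling means at a few window sizes and see which dips and spikes survive the smoothing.

```python
# Sketch: the same series smoothed over three window sizes. Dips that
# survive the 30- and 90-day windows are more likely to be real shifts
# than day-to-day noise.
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(12, 4))
for window in (7, 30, 90):
    smoothed = traffic["organic_sessions"].rolling(window).mean()
    ax.plot(smoothed.index, smoothed, label=f"{window}-day mean")
ax.legend()
plt.show()
```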
Without knowing the industry, the business, the history, and without segmenting this data, trying to make claims just based on dips and spikes in the graph is pretty dangerous, IMO. This could have virtually nothing to do with the algorithm, in theory.
-
I don't understand how dates would help. Was it not clear that the red lines mark the dates of the algorithm updates?
By abstracting the data, the hope was to gain insight into how to read graphs like this in relation to updates, and not just get help with these specific updates, which wouldn't help much the next time we have to deal with a traffic-drop problem. More a question of how to think rather than what to think.
Trying to read between the lines: are you saying different algorithm changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 was the first Penguin change, would your response be different than if it was the first Panda change?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I would not trust the tool's results 100%, but you can at least get an initial analysis; you'll need to go deeper to double-check whether that initial analysis is actually relevant.
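If you want a rough do-it-yourself version of that initial analysis, here is one sketch of the general idea: compare mean daily traffic before and after each update date and rank the updates by the size of the shift. This is not a recreation of Fruition's actual method, and it reuses the hypothetical `traffic` and `updates` objects from the sketch in the question; the 14-day window is arbitrary.

```python
# Sketch: rank updates by the relative change in mean daily traffic in a
# window before vs. after each update date.
import pandas as pd

def impact(series, date, window_days=14):
    before = series.loc[date - pd.Timedelta(days=window_days):date].mean()
    after = series.loc[date:date + pd.Timedelta(days=window_days)].mean()
    return (after - before) / before  # relative change around the update

for i, d in enumerate(updates, start=1):
    print(f"Update #{i} ({d.date()}): {impact(traffic['organic_sessions'], d):+.1%}")
```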
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare the traffic dips to specific Google algorithm updates and refreshes.
I understand you can't reveal the domain, but I will be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but that would take some sort of mind-reader to tackle...