Free Media Site / High Traffic / Low Engagement / Strategies and Questions
-
Hi,
Imagine a site "mediapalooza dot com" where the only thing you do there is view free media.
Yet Google Analytics shows the average time on a media page is about a minute, while the average length of the media is 20 to 90 minutes.
And imagine that most of this media is "classic" and that it is generally not available elsewhere.
Note also that the site ranks terribly in Google despite decent Domain Authority (high 30s), Page Authority in the mid 40s, a great site, and an otherwise quite active international user base with page views in the tens of thousands per month.
Is it possible that GA is not tracking engagement (time on site) correctly?
Even accounting for GA's imperfect method of measuring time on page (it needs a subsequent hit, the "next key pressed," to close out a page), our stats are truly abysmal: the time measured is in the tenths of a percent of the time we believe the pages are actually being used.
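For what it's worth, the usual fix for this is to fire GA events from the media player as playback progresses, so GA receives the extra timestamped hits it needs to compute time on page. Here is a minimal sketch, assuming Universal Analytics (analytics.js) is loaded and the player is an HTML5 `<video>` element; the category, action, and 10% milestone scheme are my own invention, not anything mediapalooza actually runs:

```javascript
// Pure helper: which 10% milestones were crossed between two playback ticks?
function milestonesCrossed(prevSeconds, nowSeconds, durationSeconds) {
  const crossed = [];
  for (let pct = 10; pct <= 100; pct += 10) {
    const mark = (pct / 100) * durationSeconds;
    if (prevSeconds < mark && nowSeconds >= mark) crossed.push(pct);
  }
  return crossed;
}

// Browser wiring (runs only on a page with analytics.js and the player).
// Each event is an interaction hit, so GA can use its timestamp to close
// out the pageview instead of recording near-zero time on page.
function trackMediaEngagement(video) {
  let prev = 0;
  video.addEventListener('timeupdate', function () {
    milestonesCrossed(prev, video.currentTime, video.duration).forEach(function (pct) {
      ga('send', 'event', 'Media', 'progress', window.location.pathname, pct);
    });
    prev = video.currentTime;
  });
}
```

With milestone events in place, a 60-minute view generates hits at 6, 12, 18 (and so on) minutes, so GA's last recorded hit lands near the end of playback rather than at the initial pageview.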
If so, will getting engagement tracking to more accurately measure time on specific pages and on the site as a whole signal to Google that this site is more important than its current ranking indicates?
There's lots of discussion about "dwell time" as it relates to ranking, and I'm postulating that if we can show Google extremely good engagement instead of the super-low stats we're reporting now, we might get a boost in ranking.
Am I crazy? Has anyone got any data that proves or disproves this theory?
As I write this out, I detect many issues. Let's have a discussion about what else might be happening here.
We already know that low engagement = low ranking.
Will fixing GA to show true engagement have any noticeable impact on ranking?
Can't wait to see what the MOZZERS think of this!
-
A question, as it wasn't entirely clear in the original post (or I missed it), though I think it's addressed later: if people are coming in, viewing the video without clicking anything (think YouTube), and then leaving, time on site and time on page are not going to register. Is that what's happening here?
Now to the question of whether better engagement rates in GA can impact ranking. I have seen no studies on that, and I highly doubt Google ties anything in your GA account to ranking; too many people mess up their implementations for that to be reliable. But I have not seen proof either way.
Now, dwell time, or whatever you want to call it (the case where a user clicks on a result and, within a relatively short period that I think depends on the query, goes back to the same SERP): I think that is taken into account, or is at least being investigated. That's Google's own data and entirely possible for them to use. Do they? I am not sure, and I have seen no proof.
-
Thanks for your thoughts.
Been through it all; I've been doing a thorough site audit for the last couple of months. (That's what I do!)
Ghost and referral spam is something I am very familiar with, but it is well under 1% of all hits here. I see it on other sites, where it can be nasty, but fortunately it is not an issue on this one.
I've been fixing canonicals, dead ends, and low-engagement pages, and improving many pages. This site does have thousands of issues to deal with, for sure, but it is not the largest site I've worked on, not by a long shot.
This one has been fun. Been doing it for over 10 years on dozens of sites of all sizes.
Time on site is up strongly (generally), as are conversion and general engagement figures.
But those long-form media items are still showing extremely poor engagement despite low bounce rates, and I know the system is not tracking them, because I am one of my own "customers." I've been viewing this content myself for several months, and where I know I'm watching 30- and 60-minute media, GA is still only recording 2 or 3 minutes each time; I can see this clearly in the GA data.
Let me give you another clue: many of these items show a zero bounce rate, zero time on page, and 100% exits (keep in mind the media is many minutes long). What do these telling numbers suggest to you?
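Those numbers line up with GA simply never receiving a second timestamped hit from these pages. A simplified model of how GA derives time on page (my sketch, not Google's actual implementation):

```javascript
// Each hit is { page, t } with t in seconds since the session started.
// GA's time on page for a hit is the gap to the NEXT hit in the session;
// the final page has no next hit, so it gets 0 seconds and a 100% exit.
function timeOnPageReport(hits) {
  const report = {};
  hits.forEach(function (hit, i) {
    const next = hits[i + 1];
    report[hit.page] = next ? next.t - hit.t : 0;
  });
  return report;
}

// A visitor browses from the home page to a media page, watches for
// 45 minutes, and closes the tab. GA saw only one hit on the media
// page, so it reports 0 seconds for it:
const session = [
  { page: '/home', t: 0 },
  { page: '/media/classic-episode', t: 30 },
];
// timeOnPageReport(session) → { '/home': 30, '/media/classic-episode': 0 }
```

The zero bounce rate would also be consistent with these pages rarely being entrance pages (GA computes bounce rate only over sessions that enter on the page), though that part is my guess.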
Yet despite all this, ranking is simply staying near its norms. It is starting to fluctuate more widely than before, but it is still where it was, and I'm tracking ranking for thousands of terms using three different systems.
Normally, I'd be seeing a fairly solid increase after all I've done.
I'd love to see if we can actually answer the original question, if at all possible.
Can a poorly configured GA report low engagement in such a way that, if it were fixed, the higher engagement figures might drive increased ranking?
Didn't dwell time get discussed here quite thoroughly?
For background, this article cites Dr. Pete:
http://www.searchenginejournal.com/understanding-impact-dwell-time-seo/108905/
-
Hello, my friend.
Have you heard about referral spam and ghost hits? They might be the explanation for your unreal numbers. Here is a post about it: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
Also, as mentioned above, good DA/PA doesn't mean or guarantee rankings. What about the 10,000 other things SEO is about?
Also, is time on page the only problem child? Is everything else fine? It sounds like you need a good analysis of your Google Analytics data.
-
I understand the inverse relationship, and there is no question that in reality few people would engage for tens of minutes, just due to the nature of user behavior; the averages bear that out.
But when looking very carefully at just this segment, I would expect more than a fraction of a percent to spend more than a mere minute.
Your example shows a 10% view rate (like what we see) and 1800 minutes total use.
In our case, in this exact scenario, GA is only showing about 6 minutes total use.
I think that GA is undercounting dwell time by a reasonably large margin.
That said, stating the question more clearly:
Could it be possible that insufficient or incorrect information regarding actual dwell time on the site might be a factor in the abysmal ranking of this site?
-
There are a few different points here that I think are prudent to make:
-
Having a good or great Domain Authority has no bearing on the actual quality of the content as users perceive it. I would be hesitant to make decisions based on two non-correlated data points. Quality in this context means the value the average user perceives the content to have.
-
As such, here's an example: Say I have a page hosting a video that's 90 minutes. If 1,000 people visit the page, let's say that 100 came there with an actual interest specifically in that video. Of those 100, maybe 20 will watch the entire thing. So, 20 out of 1,000 people getting to 90 minutes isn't going to give you a high average. This is obviously an abstract example, but it makes the point that video length means nothing as a metric without any insight into these other key numbers.
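To put rough numbers on that abstract example (my arithmetic, using the figures above):

```javascript
// 1,000 visits, of which only 20 watch the full 90-minute video and the
// rest leave almost immediately (illustrative numbers from the example).
const visits = 1000;
const fullViews = 20;
const videoMinutes = 90;

const totalEngagedMinutes = fullViews * videoMinutes;        // 1800
const averageMinutesPerVisit = totalEngagedMinutes / visits; // 1.8
```

Even with perfect tracking, the page-level average lands under two minutes, in the same ballpark as the "about a minute" figure from the original post.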
-
That said, yes, Google is imperfect and won't measure anything perfectly. But a general rule for content of any type is to expect only a certain percentage (usually not very high) to be highly engaged. Graphically, it's an inverse curve.