Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Do Page Views Matter? (ranking factor?)
-
Hi,
I actually asked this a year and a half ago (with a slight variation) but didn't get any real response, and things do change over time.
On my eCommerce website, the main category pages use client-side filtering and sorting. As a result, the number of page views is lower than might be expected.
Do you think having more page views is still a ranking factor? And if so, is it more important than user experience?
Thanks
-
Well said - engagement > page views. Google's smart enough to understand that on more complex sites built with more complex technology (JavaScript, etc.), those aren't always perfect proxies for one another.
-
I think there are elements of both iSTORM's and David's responses that are accurate. Page views in and of themselves are almost certainly not a raw ranking factor, but it could well be that engagement metrics that correlate well with page views (in many cases, at least) do have a direct or indirect positive impact on rankings.
I try not to guess at precisely which elements Google is or isn't using to influence algorithmic rankings (based on what I read about their move to deep learning, it probably doesn't matter much anyway, since the algorithm is becoming a derivative of the interplay of thousands of metrics), but instead worry about the things that will produce the results and user experiences Google wants to reward. That was a lot of what this post was about: http://moz.com/blog/seo-correlation-causation.
-
I agree with Ryan that it's the engagement that matters more than the pageviews.
If you have client-side filtering and sorting, you could use event tracking in Analytics to get a better idea of what visitors are actually doing on your page: each time a user changes the view, you track an event in Analytics. If you have a high bounce rate, this will also give you a better idea of the actual time spent on a page (remember that visit duration isn't measured when a user views only one page and no events are tracked - see also: http://cutroni.com/blog/2012/02/29/understanding-google-analytics-time-calculations/).
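A minimal sketch of what that event tracking could look like, assuming the classic analytics.js `ga('send', 'event', …)` call. The send function is passed in as a parameter so a stub can stand in for the global `ga` queue here; the category and filter names are hypothetical, not anything from the original thread:

```typescript
// Shape of the classic analytics.js event call:
// ga('send', 'event', category, action, label?)
type SendFn = (
  hitType: string,
  eventType: string,
  category: string,
  action: string,
  label?: string
) => void;

// Track a client-side filter/sort change as an Analytics event.
// In production you would pass the global `ga` function as `send`.
function trackFilterEvent(send: SendFn, filterName: string, value: string): void {
  send("send", "event", "Category Filter", filterName, value);
}

// Example: record events with a stub instead of the real `ga`.
const recorded: string[][] = [];
const stub: SendFn = (hitType, eventType, category, action, label) => {
  recorded.push([hitType, eventType, category, action, label ?? ""]);
};

trackFilterEvent(stub, "sort", "price-asc");
trackFilterEvent(stub, "color", "blue");
```

Because each filter change now fires a hit, Analytics can compute time-on-page even for single-page visits that would otherwise count as zero-duration bounces.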
-
I agree with this to a certain degree. Page views and user behavior tell Google everything they need to know. No one at Google is manually looking at your site unless you are doing something horribly wrong.
A large number of page views could signal to Google that the site is popular. Page views combined with long on-site time and low exit rates can tell the bot that the page is not only popular but also very well put together (i.e., engaging).
-
Rand recently did a Whiteboard (beard?) Friday on this, loosely, under the broader scope of "engagement", and I think you have to keep page views lumped into that overall scope - saying X page views per session = Y ranking boost is likely something no one can define precisely.
However, creating an on-site engagement score is loosely feasible. For example, you could take time on site and divide it by your GWT (Google Webmaster Tools) average time spent downloading a page to give yourself an engagement rating. Lower the download time and your score rises if time on site stays the same; increase time on site and the score goes up as well.
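As a quick sketch of that (admittedly rough) ratio, assuming both inputs are in seconds - the sample numbers below are made up for illustration:

```typescript
// Rough on-site engagement score: average time on site divided by the
// GWT-reported average time spent downloading a page. Higher is better.
function engagementScore(avgTimeOnSiteSec: number, avgDownloadTimeSec: number): number {
  if (avgDownloadTimeSec <= 0) {
    throw new Error("download time must be positive");
  }
  return avgTimeOnSiteSec / avgDownloadTimeSec;
}

const baseline = engagementScore(120, 1.5); // 120 / 1.5 = 80
const fasterPages = engagementScore(120, 1.0); // faster downloads raise the score
const stickierVisits = engagementScore(180, 1.5); // longer visits raise it too
```

Either lever - speeding up page delivery or holding visitors longer - moves the score in the direction described above.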
Does the number of page views equate to engagement? Maybe, although a site set up to generate lots of page views (pop culture sites with click lists, news articles, etc.) is going to have more of them than sites that do the bulk of their business via the home page. Alternatively, a page-view engagement metric you could create would be derived from your organic bounce rate: http://moz.com/blog/solving-the-pogo-stick-problem-whiteboard-friday
Hopefully this gives you a little direction in what to improve.
-
Pageviews specifically... no. Popularity... yes. User experience is far more important, though, and Google's approach is based on rewarding sites that give users a great experience and relevant content.