Click Through Rate on Password Protected Pages
-
Hi Moz community,
I have a website with a large database of 800+ important pages, and I want Google to know when people visit and stay on these pages. However, these pages are only accessible once people create an account with a password and sign in.
I know that since these pages are password protected, Google doesn't index them. But when our visitors browse our database and stay on our site for a while, does Google include that data in our CTR and bounce rate? For SEO purposes it's really important that Google knows people are spending time in our database, so I wanted to know whether CTR gets measured even though these pages aren't crawled.
Thanks for the help!!
-
Awesome, thank you for your help. That definitely clears things up.
-
Hey Daniel,
Google will get its insight from your Analytics. Again, the spiders crawling your site are there for indexing; the Analytics code is what captures all of that valuable user data, and you have just as much access to it as Google does.
Consider this: the crawler has absolutely no idea who is on your page or what they are doing; that is not its purpose. So even pages that are being indexed are not providing that behavioral data to Google.
I hope that clears up any confusion!
-
Christopher, thank you for your response! I'm more concerned with whether Google itself tracks our CTR and uses that to enhance our search results overall, even though it can't index the specific database pages. Does that make sense? Do you know about this?
-
As long as your Analytics code is on your password-protected pages, it should be able to track your visitors (I'm not sure if this is possible in your database, but perhaps it could be embedded in your header). Google's crawling is meant to index your pages for search results, but it has nothing to do with your user tracking.
If you have access to the code used on your protected pages, insert your tracking code and let me know if it picks up your visitors! It's easy to test: just have someone log in and check whether any active users show up on the site in real time. If so, your data should be golden.
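A minimal sketch of what that could look like in the shared header template of the protected pages, assuming the standard Google Analytics (gtag.js) tag; the "G-XXXXXXXXXX" measurement ID is a placeholder, not a real one:

<!-- Google Analytics tag, placed in the <head> shared by the protected pages -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX'); // records a pageview for every page a signed-in visitor loads
</script>

Once the tag is live behind the login, the real-time report should show active users within a minute or so of someone signing in and browsing, which is the quickest way to confirm it's picking up visitors.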
Related Questions
-
Too many on page links
Hi, I know it was previously recommended to stick to under 100 links per page, but I've run a crawl and mine are now over this, at 130+. How important is this now? I've read a few articles saying it's not as crucial as it used to be. Thanks!
Intermediate & Advanced SEO | BeckyKey1
-
Location Pages On Website vs Landing pages
We have been having a terrible time in the local search results for 20+ locations. I have Places set up and all, but we decided to create location pages on our site for each location - a brief description and content optimized for our main service. The path would be something like .com/location/example. One option that has come up in question is to create landing pages / "mini websites" that would probably be location-example.url.com. I believe that the latter option, mini sites for each location, would be a bad idea, as those kinds of tactics were considered spammy in the past. What are your thoughts, and what resources can I use to convince my team of the best practice?
Intermediate & Advanced SEO | KJ-Rodgers0
-
When should you 410 pages instead of 404
Hi all, we have approx 6,000 404 pages. These are for categories etc. that we don't do anymore, and there is no near replacement, so basically no reason or benefit to keep them at all. I can see in GWT that these are still being crawled/found and are therefore taking up crawler bandwidth. Our SEO agency said we should 410 these pages. I am wondering what the difference is and how Google treats them differently. Does anyone know when you should 410 pages instead of 404? Thanks, Pete
Intermediate & Advanced SEO | PeteC120
-
What if page exists for desktop but not mobile?
I have a domain (no subdomains) that serves up different dynamic content for mobile and desktop pages - each having the exact same page URL, a kind of semi-responsive design - and I will be using "Vary: User-Agent" to give Google a heads-up on this setup. However, some of the pages are only valid for mobile or only valid for desktop. In the case where a page is valid only for mobile (call it mysite.com/mobile-page-only), Google Webmaster Tools is giving me a soft 404 error under Desktop, saying that the page does not exist. Apparently it is doing that because my program is actually redirecting the user/crawler to the home page. It appears from the info about soft 404 errors that Google is saying that since the page "doesn't exist" I should give the user a 404 page - which I can customize to give the user an option to go to the home page, choose links from a menu, etc. My concern is that if I tell the desktop bot that mysite.com/mobile-page-only is basically a 404 error (i.e. doesn't exist), it could mess up the mobile bot's indexing of that page, since it definitely DOES exist for mobile users. Does anyone here know for sure whether Google will index a page for mobile that is a 404 Not Found for desktop, and vice versa? Obviously it is important not to remove something from an index in which it belongs, so whether Google is careful to differentiate the two is a very important issue. Has anybody here dealt with this or seen anything from Google that addresses it? Might one be better off leaving it as a soft 404 error? EDIT: Also, what about Bing and Yahoo? Can we assume they will handle it the same way? EDIT: A closely related question - in a case like mine, does Google need a separate sitemap for the valid mobile pages and the valid desktop pages, even though most links will be in both? I can't tell from reading several Q&As on this. Thanks, Ted
Intermediate & Advanced SEO | friendoffood0
-
Date of page first indexed or age of a page?
Hi, does anyone know of any ways or tools to find out when a page was first indexed/cached by Google? I remember a while back, around 2009, I had a Firefox plugin which could check this and gave you an exact date. Maybe this has changed since; I don't remember the plugin. Any recommendations on finding the age of a page (not the domain) for a website? This is for competitor research, not my own website. Cheers, Paul
Intermediate & Advanced SEO | MBASydney0
-
Putting "noindex" on a page that's in an iframe... what will that mean for the parent page?
If I've got a page that is being called in an iframe on my homepage, and I don't want that called page to be indexed, so I put a noindex tag on the called page (but not on the homepage), what might that mean for the homepage? Nothing? Will Google, Bing, Yahoo, or anyone else potentially see that as a noindex tag on my homepage?
Intermediate & Advanced SEO | Philip-DiPatrizio0
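For reference, a minimal sketch of the setup described in that question, assuming a plain HTML page framed into the homepage; the file name embedded-page.html is made up for illustration:

<!-- homepage: no noindex here -->
<iframe src="/embedded-page.html"></iframe>

<!-- embedded-page.html: the framed page is the one carrying the noindex tag -->
<head>
  <meta name="robots" content="noindex">
</head>

The noindex directive lives only in the head of the framed document, not anywhere in the homepage's own markup.
-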
301 - should I redirect entire domain or page for page?
Hi, we recently enabled a 301 on our domain from our old website to our new website. On the advice of fellow Mozzers we copied the old site exactly to the new domain, then did the 301, so the sites are identical. The question is, should we be doing the 301 as a whole-domain redirect, i.e. www.oldsite.com now redirects to www.newsite.com, or setting each page individually, i.e. www.oldsite.com/page1 redirects to www.newsite.com/page1, and so on for each page on our site? Remember that both the old and new sites are (for now) identical copies. Also, we set up the 301 about 5 days ago and have verified it's working, but we haven't seen a single change in rank for either the old site or the new one - is this because Google likely hasn't re-indexed yet? Thanks, Anthony
Intermediate & Advanced SEO | Grenadi0
-
Should the sitemap include just menu pages or all pages site wide?
I have a Drupal site that utilizes Solr, with 10 menu pages and about 4,000 pages of content. We're redoing a few things and will need to revamp the sitemap. Typically I'd jam all pages into a single sitemap and leave it at that, but post-Panda, should I do anything different?
Intermediate & Advanced SEO | EricPacifico0