Password Protected Page(s) Indexed
-
Hi,
I am wondering if my website can get a penalty if some password-protected pages are showing up when I search on Google:
site:www.example.com/sub-group/pass-word-protected-page
That shows that my password-protected page was indexed either before or after the password protection was added.
I've seen people suggest noindexing the page. Is that the best way to take care of this? What if we are planning on pushing the page live later on? None of these pages have a title tag, meta description, image alt text, etc. Should I add them for each page? I am wondering what the best step is, especially since we are planning on pushing the page(s) live.
Thanks for any help!
-
Thanks. I ended up noindexing those pages. Someone else told me not to nofollow them, so I changed that back to "follow". I haven't really seen any changes in the SERPs from doing this, so I'm not sure whether these pages were causing any issues.
Thanks again.
-
Hi,
I would "no follow" and "no index" those pages. You really don't want them showing up in the index anyway. They are not helping your seo. Later on when you want those pages to be active and indexed you could just change your tags and update your site maps.
Best Regards.
Related Questions
-
How would you handle these pages? Should they be indexed?
If a site has about 100 pages offering specific discounts for employees at various companies, for example mysite.com/discounts/target, mysite.com/discounts/kohls, and mysite.com/discounts/jcpenney, and all these pages are nearly 100% duplicates, how would you handle them? My recommendation to my client was to use noindex, follow. These pages tend to receive backlinks from the actual companies receiving the discounts, so obviously they are valuable from a linking standpoint. But say the content is nearly identical between each page; should they be indexed? Is there any value for someone at Kohl's, for example, to be able to find this landing page in the search results? Here is a live example of what I am talking about: https://www.google.com/search?num=100&safe=active&rlz=1C1WPZB_enUS735US735&q=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&oq=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&gs_l=serp.3...7812.8453.0.8643.6.6.0.0.0.0.174.646.3j3.6.0....0...1c.1.64.serp..0.5.586...0j35i39k1j0i131k1j0i67k1j0i131i67k1j0i131i46k1j46i131k1j0i20k1j0i10i3k1.RyIhsU0Yz4E
Intermediate & Advanced SEO | FPD_NYC
-
What are the best page titles for sub-domain pages?
Hi Moz community, Let's say a website has multiple sub-domains with hundreds or thousands of pages. Generally we mention the "primary keyword" and "brand name" on every page of the website. Can we do the same on all pages of the sub-domains to increase the website's authority for this primary keyword in Google? Or will it end up having a negative impact if Google considers it duplicate content, since the same keyword and brand name are mentioned on every page of both the main website and all of the sub-domain pages? Thanks
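To make the pattern concrete, a hedged illustration of the two title approaches (the page names and brand below are made up, not from the question): a unique, page-specific phrase first with the brand as a suffix gives each page its own identity, while repeating only the keyword and brand verbatim leaves nothing unique to rank.
```html
<!-- Page-specific title with the brand appended: unique on every page. -->
<title>Blue Widget Installation Guide | Example Brand</title>

<!-- The same primary keyword and brand repeated on every page: offers Google nothing unique. -->
<title>Primary Keyword | Example Brand</title>
```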
Intermediate & Advanced SEO | vtmoz
-
Website does not get indexed for any page?
I created a website, www.astrologersktantrik.com, 4 days ago and fetched it with Google, but my website still does not get indexed. The keywords I target have low competition, yet my website does not appear for any of them. Why not?
Intermediate & Advanced SEO | ramansaab
-
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal web pages (pointing to the https versions). However, many of their non-secure web pages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages.
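For illustration, the canonical tags described here look roughly like this, with example.com and the page path standing in for the client's actual URLs:
```html
<!-- Served in the <head> of the non-secure URL, e.g. http://www.example.com/some-page/ -->
<!-- Points search engines at the secure version; a sitewide 301 redirect from http to https remains the stronger signal. -->
<link rel="canonical" href="https://www.example.com/some-page/">
```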
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure this is going to happen. We thought about requesting in GWMT for Google to remove the non-secure pages; however, we felt this was pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
-
Should you allow an auto dealer's inventory to be indexed?
Due to the way most auto dealership websites populate inventory pages, should you allow inventory to be indexed at all? The main benefit is more content. The problem is that it creates duplicate, or near-duplicate, content. It also creates a ton of crawl errors, since the turnover is so short and fast. I would love some help on this. Thanks!
Intermediate & Advanced SEO | Gauge123
-
Significant drop in Webmaster Tools indexed pages
Has anyone noticed a significant drop in indexed pages within their Google Webmaster Tools sitemap area? We went from 1,300 to 83 between Friday, June 23 and today, June 25, 2012, and no errors or warnings are showing. Please let me know if anyone else is experiencing this, and any suggestions for fixing it.
Intermediate & Advanced SEO | datadirect
-
How do I index these parameter-generated pages?
Hey guys, I've got an issue with a site I'm working on. A big chunk of the content (roughly 500 pages) is delivered using parameters on a dynamically generated page, for example: www.domain.com/specs/product?=example, where "example" is the product name. Currently there is no way to get to these pages unless you enter the product name into the search box and access it from there. Correct me if I'm wrong, but unless we find some other way to link to these pages, they're basically invisible to search engines, right? What I'm struggling with is a method to get them indexed, short of creating a directory-map-type page of all of the links, which I guess wouldn't be a terrible idea as long as it was done well. I've not encountered a situation like this before. Does anyone have any recommendations?
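As a rough sketch of the "directory map" page mentioned above, a plain crawlable HTML page linking to each parameter URL would give search engines a path to them (the product names here are hypothetical); listing the same URLs in an XML sitemap would complement this.
```html
<!-- e.g. /specs/index.html: a crawlable index page linking to each parameter-driven spec page. -->
<ul>
  <li><a href="http://www.domain.com/specs/product?=widget-a">Widget A specs</a></li>
  <li><a href="http://www.domain.com/specs/product?=widget-b">Widget B specs</a></li>
  <li><a href="http://www.domain.com/specs/product?=widget-c">Widget C specs</a></li>
</ul>
```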
Intermediate & Advanced SEO | CodyWheeler
-
Category Pages - Canonical, Robots.txt, Changing Page Attributes
A site has category pages as such: www.domain.com/category.html, www.domain.com/category-page2.html, etc. This is producing duplicate meta descriptions (page titles have page numbers in them, so they are not duplicates). Below are the options we've been thinking about:
a. Keep meta descriptions the same except for adding a page number (this would keep internal juice flowing to products that are listed on subsequent pages). All pages have unique product listings.
b. Use canonical tags on subsequent pages and point them back to the main category page.
c. Robots.txt on subsequent pages.
d. ?
Options b and c will orphan or french fry some of our product pages. Any help on this would be much appreciated. Thank you.
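For clarity, a hedged sketch of what options (a) and (b) would look like in the head of a subsequent page such as www.domain.com/category-page2.html (the description text is made up):
```html
<!-- Option (a): keep the page indexable but make its meta description unique by appending the page number. -->
<meta name="description" content="Category products, specifications, and pricing - Page 2">

<!-- Option (b): canonicalize the page back to the main category page.
     This asks Google to consolidate signals on page 1, which is what risks orphaning products listed only on later pages. -->
<link rel="canonical" href="http://www.domain.com/category.html">
```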
Intermediate & Advanced SEO | Troyville