Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Getting Pages Requiring Login Indexed
-
Somehow certain newspapers' webpages show up in the index but require login. My client has a whole section of the site that requires a login (registration is free), and we'd love to get that content indexed. The developer offered to remove the login requirement for specific user agents (e.g., Googlebot et al.). I am afraid this might get us penalized.
Any insight?
-
My guess: it's possible, but it would be an uphill battle. The reason is that Google would likely see the page as a duplicate of all the other pages on your site with a login form. Not only does Google tend to drop duplicate pages from its index (especially if they have a duplicate title tag - more leeway is given the more unique elements you can place on a page), but you now face a situation where you have lots of duplicate or "thin" pages, which is juicy meat for a Panda-like penalty. Generally, you want to keep these pages out of the index, so it's a catch-22.
-
That makes sense. I am looking into whether any portion of our content can be made public in a way that would still comply with industry regulations. I am betting against it.
Does anyone know whether a page requiring login like this could feasibly rank with a strong backlink profile or a lot of quality social mentions?
-
The reason Google likes the "first click free" method is because they want the user to have a good result. They don't want users to click on a search result, then see something else on that page entirely, such as a login form.
So technically showing one set of pages to Google and another to users is considered cloaking. It's very likely that Google will figure out what's happening - whether through manual review, human search quality raters, bounce rate, etc. - and take appropriate action against your site.
Of course, there's no guarantee this will happen, and you could argue that the cloaking wasn't done to deceive users, but the risk is high enough to warrant major consideration.
Are there any other options for displaying even part of the content, other than "first-click-free"? For example, can you display a snippet or a few paragraphs of the information, then require a login to see the rest? This would at least give Google something to index.
Unfortunately, most other methods for getting anything indexed without actually showing it to users would likely be considered blackhat.
Cyrus
-
I should have read the linked article more closely:
"Subscription designation, snippets only: If First Click Free isn't a feasible option for you, we will display the "subscription" tag next to the publication name of all sources that greet our users with a subscription or registration form. This signals to our users that they may be required to register or subscribe on your site in order to access the article. This setting will only apply to Google News results.
If you prefer this option, please display a snippet of your article that is at least 80 words long and includes either an excerpt or a summary of the specific article. Since we do not permit "cloaking" -- the practice of showing Googlebot a full version of your article while showing users the subscription or registration version -- we will only crawl and display your content based on the article snippets you provide. If you currently cloak for Googlebot-news but not for Googlebot, you do not need to make any changes; Google News crawls with Googlebot and automatically uses the 80-word snippet.
NOTE: If you cloak for Googlebot, your site may be subject to Google Webmaster penalties. Please review Webmaster Guidelines to learn about best practices."
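If we end up going the snippet route, the practical constraint in that guidance is the 80-word minimum for the publicly visible excerpt. A quick sketch of building and checking such a teaser (plain Python; the function names and the 120-word cap are just illustrative assumptions, not anything Google specifies):

```python
# Sketch: take the opening of a gated article as the public teaser and check it
# meets the 80-word minimum from the Google News guidance quoted above.
# Function names and the 120-word cap are illustrative assumptions.
MIN_SNIPPET_WORDS = 80

def build_public_snippet(article_text: str, max_words: int = 120) -> str:
    """Use the opening words of the article as the excerpt shown to logged-out visitors and crawlers."""
    words = article_text.split()
    return " ".join(words[:max_words])

def snippet_is_long_enough(snippet: str) -> bool:
    return len(snippet.split()) >= MIN_SNIPPET_WORDS

if __name__ == "__main__":
    article_body = "Lorem ipsum " * 300  # placeholder article text
    teaser = build_public_snippet(article_body)
    print(len(teaser.split()), snippet_is_long_enough(teaser))  # e.g. 120 True
```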
-
"In order to successfully crawl your site, Google needs to be able to crawl your content without filling out a registration form. The easiest way to do this is to configure your webservers not to serve the registration page to our crawlers (when the user-agent is "Googlebot") so that Googlebot can crawl these pages successfully. You can choose to allow Googlebot access to some restricted pages but not others. More information about technical requirements."
-http://support.google.com/webmasters/bin/answer.py?hl=en&answer=74536
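For what it's worth, here is a minimal sketch of the kind of configuration that support article describes, assuming a Flask-style application (the route, helper, and user-agent list are illustrative): the registration redirect is skipped when the request identifies itself as Googlebot. Outside of First Click Free, this is exactly the behaviour the other replies warn could be treated as cloaking.

```python
# Minimal sketch (Flask; route and helper names are illustrative) of serving a
# restricted page to Googlebot without the registration redirect, per the
# support article quoted above. Real setups usually also verify crawlers via
# reverse DNS rather than trusting the User-Agent string alone.
from flask import Flask, request, redirect, session

app = Flask(__name__)
app.secret_key = "change-me"  # placeholder; required for session support

CRAWLER_TOKENS = ("googlebot", "googlebot-news")  # assumed list, not exhaustive

def is_known_crawler(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

@app.route("/members/<path:page>")
def members_area(page):
    if session.get("logged_in") or is_known_crawler(request.headers.get("User-Agent", "")):
        return f"<h1>Members page: {page}</h1>"          # full content
    return redirect(f"/login?next=/members/{page}")      # everyone else gets the login form
```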
Any harm in doing this while not implementing the rest of First Click Free?
-
What would you guys think about programming the login requirement behavior in such a way that only Google can't execute it--so Google wouldn't know that it is the only one getting through?
Not sure whether this is technically possible, but if it were, would it be theoretically likely to incur a penalty? Or is it foolish for other reasons?
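Purely to illustrate what I'm asking (not something I'd implement without understanding the risk): the server would send the full article to every client, and the login wall would be enforced only by JavaScript in the browser, which a crawler that doesn't execute scripts never sees. A rough sketch, with all names hypothetical:

```python
# Rough sketch of the idea in the question above, shown only for illustration:
# the full article is served to everyone, and the login requirement is applied
# client-side by JavaScript. A crawler that does not run the script would index
# the full text. All routes, cookie names, and helpers here are hypothetical,
# and as the other replies note, this carries the same cloaking risk.
from flask import Flask

app = Flask(__name__)

LOGIN_WALL_JS = """
<script>
  // Runs only in real browsers: anonymous visitors get redirected to the login page.
  if (document.cookie.indexOf('member_session=') === -1) {
    window.location.href = '/login?next=' + encodeURIComponent(location.pathname);
  }
</script>
"""

@app.route("/articles/<slug>")
def article(slug):
    body = f"<p>Full text of article '{slug}' goes here.</p>"  # placeholder content
    return f"<html><body>{LOGIN_WALL_JS}{body}</body></html>"
```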
-
Good idea--I'll have to determine precisely what I can and cannot show publicly and see if there isn't something I can do to leverage that.
I've heard the advice to stay away from agent-specific content, but I wonder what the data show and whether there are any documented successful attempts?
-
First click free unfortunately won't work for us.
How might I go about determining how adult content sites handle this issue?
-
Have you considered allowing only a certain proportion of each page to show to all visitors, including search engines? That way your pages will have some specific content that can be indexed and help you rank in the SERPs.
I have seen it done where publications behind a paywall only show the first paragraph or two - just enough to get them ranked appropriately, but not enough to stop users from wanting to register to access the full articles when they find them through the SERPs, other sites, or directly.
However, whether this works depends on what the regulations you mention require - would showing a proportion of the content to everyone be OK?
I would definitely stay away from serving up different content to different users if I were you, as this is likely to end up causing you trouble with the search engines.
-
I believe newspapers use a feature called "first click free" that enables this to work. I don't know whether that will work with your industry regulations, however. You may also want to look at how sites that deal with adult content, such as liquor sites, restrict viewing yet still allow indexing.
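Roughly, the way "first click free" works is that a visitor arriving from a Google search result gets the full article on that first view, and the registration wall only kicks in on subsequent clicks. A very simplified sketch of that decision (the referrer check and free-click tracking are illustrative; real implementations were more involved):

```python
# Simplified first-click-free access check. Inputs are the request's Referer
# header and whether this visitor has already used their free click (e.g.
# tracked with a cookie). Names and logic are illustrative only.
def allow_full_article(referrer: str, already_used_free_click: bool) -> bool:
    came_from_google = "google." in (referrer or "").lower()
    return came_from_google and not already_used_free_click

# First visit from a Google SERP sees the article; later visits hit the wall.
print(allow_full_article("https://www.google.com/search?q=example", False))  # True
print(allow_full_article("https://www.google.com/search?q=example", True))   # False
print(allow_full_article("https://twitter.com/some-post", False))            # False
```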
-
Understood. The login requirement is necessary for compliance with industry regulations. My question is whether I will be penalized for serving agent-specific content and/or whether there is a better way to get these pages into the index.
-
Search engines aren't good at completing online forms (such as a login), and thus any content contained behind them may remain hidden, so the developer's option sounds like a good solution.
You may want to read:
http://www.seomoz.org/beginners-guide-to-seo/why-search-engine-marketing-is-necessary