Indexing of content from internal (registration-only) pages by Google
-
Hello,
We have quite a large amount of content on internal pages that can only be accessed by registered members.
What are the different options to get this content indexed by Google?
In certain cases we might be able to show a preview to visitors. In other cases this is not possible for legal reasons.
Somebody told me that there is an option to send the content of pages directly to Google for indexing. Unfortunately he couldn't give me more details. I only know that this is possible for URLs (via a sitemap). Is there really a way to do this for the entire content of a page without giving Google access to crawl that page?
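As far as I understand, a sitemap only lists URLs plus optional metadata such as lastmod, not the content of the pages themselves. Roughly something like this, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/members/some-article</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/members/another-article</loc>
  </url>
</urlset>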
Thanks
Ben
-
The issue is that Google won't and shouldn't index pages that are restricted.
This is best for user experience. Most people won't sign in to view the content.
You basically have to create two sections: one that is visible to everyone, including Google, where you show a bit of a preview, and another that is protected.
-
Thanks, I will check whether this meets the legal requirements (see my reply to Brent's answer).
-
As I mentioned, we have two cases.
In the first case, we can show a preview.
In the second case we can only show the content to a certain audience (which is a legal question), so the free registration is a legal requirement. Still, people will be looking for it via Google. The content on those pages is useful for a fairly large audience, so why wouldn't we want Google to index them? Of course, without Google knowing that there is relevant content on those pages, it will neither index nor properly rank them.
-
I found this information for you, but you should definitely check that it doesn't break any of Google's guidelines before incorporating it into your website.
Here is a simple snippet that allows bots to bypass the password on password-protected pages:
// Let the request through if the user is logged in or the user agent contains "Googlebot"
$allow_inside = $is_logged_in || substr_count($_SERVER['HTTP_USER_AGENT'], 'Googlebot');
http://davidwalsh.name/google-password-protected-areas
The referenced post is older, so this code may have been updated since.
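If you do try something like this, keep in mind that the user agent string can be faked by anyone, so the check above would also let in any visitor who claims to be Googlebot. A rough sketch of a safer variant (assuming $is_logged_in is set by your own login code) verifies the crawler with a reverse and forward DNS lookup, which is the method Google itself recommends for identifying Googlebot:

<?php
// Sketch only: don't trust the User-Agent header alone, since anyone can send it.
// $is_logged_in is assumed to come from your existing authentication code.
function is_verified_googlebot() {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (substr_count($ua, 'Googlebot') === 0) {
        return false;
    }
    // Reverse DNS: the crawler's IP should resolve to a googlebot.com or google.com host...
    $ip   = $_SERVER['REMOTE_ADDR'];
    $host = gethostbyaddr($ip);
    if ($host === false || !preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false;
    }
    // ...and the forward lookup of that host should point back to the same IP.
    return gethostbyname($host) === $ip;
}

$allow_inside = $is_logged_in || is_verified_googlebot();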
-
Fetch as Googlebot has a "Submit to index" option. This is why I believe it should work for this.
-
I guess my question is: why would you want Google to index something that is only available to registered users?
In order for it to be indexed, it has to be open to everyone.
You will have to figure out what can be shown as a preview and what can't. If you want something to be indexed, then you will have to create a separate section for your preview content (since Google won't index your protected content).
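A very rough sketch of that approach (the variable names here are just placeholders; $is_logged_in, $article_preview and $article_body would come from your own login and content code):

<?php
// Everyone, including Googlebot, gets an indexable preview;
// only logged-in members get the full article.
echo '<div class="preview">' . $article_preview . '</div>';

if ($is_logged_in) {
    echo '<div class="full-article">' . $article_body . '</div>';
} else {
    echo '<p><a href="/register">Register for free to read the full article.</a></p>';
}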
-
Hi Istvan,
"The Fetch as Googlebot tool lets you see a page as Googlebot sees it."
Since Googlebot has no access to the restricted pages (login required), it will probably not display anything (I just tried it while logged in and it did not display any of the content). How could this tool theoretically help us get the content of the internal pages indexed?
Ben
-
Hi Ben,
Maybe Fetch as Googlebot can be a solution to your issue, but I'm not 100% sure of this.
Regards,
Istvan