Can Google read content/see links on subscription sites?
-
If an article is published on The Times (for example), can Google bypass the subscription sign-in to read the content and index the links in the article?
Example: http://www.thetimes.co.uk/tto/life/property/overseas/article4245346.ece
In the above article there is a link to the resort's website but you can't see this unless you subscribe. I checked the source code of the page with the subscription prompt present and the link isn't there.
Do these sites treat search engines differently from other user agents so the content can be crawled and indexed?
-
Hey Matt,
The best way to tell what a news organization or site is using is to turn off JavaScript, or to view the Google cache, to see how Google "sees" the page.
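One quick way to do that comparison yourself is to diff the links found in the two versions of the page. Here is a minimal sketch using Python's standard-library HTML parser; the HTML fragments and the resort URL are made up for illustration, not taken from the actual Times page:

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags as the parser walks the HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return set(parser.links)


# Hypothetical fragments standing in for the two versions of the page:
visitor_html = '<p>Subscribe to continue reading.</p>'
cached_html = ('<p>Full article text ... see the '
               '<a href="http://resort.example.com/">resort website</a>.</p>')

# Links present in the cached copy but invisible to a non-subscriber:
hidden_links = extract_links(cached_html) - extract_links(visitor_html)
```

If `hidden_links` is non-empty, the cached copy contains links that the ordinary (paywalled) page does not serve.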
This article is using the second option described in the support article I mentioned: snippets. Here is what that article says about it:
"If you prefer this option, please display a snippet of your article that is at least 80 words long and includes either an excerpt or a summary of the specific article." -
Thanks Dan. It doesn't look like the example article is using First Click Free, so I guess the answer is no: Google can't read the hidden content in this example?
-
Great question! Yes, Google has had ways to deal with this since 2007. Publishers have three options: First Click Free, a subscription designation, and disallowing the content entirely. Here is their official support article on it:
https://support.google.com/news/publisher/answer/40543?hl=en
Here is a quote from the help article:
"To summarize, we will crawl and index your site to the extent that you allow Googlebot to access it. In order to provide the best possible user experience and help more users discover your content, we encourage you to try First Click Free. If you prefer to limit access to your site to subscribers only, we will respect your decision and show a “subscription” label next to your links on Google News."Here is what Matt Cutts said about it in an interview with Search Engine Land:
"First Click Free originated with Google News, but you can use the same way of handling content in web search (show the same page to users and Googlebot, then if the user clicks to read a different article, then you can show them the registration or pay page). Because the same page is presented to users and to Googlebot, it’s not cloaking. So First Click Free is a great way if you have premium content to surface it in Google’s web index without cloaking. Hope that makes sense."It is possible to allow the Googlebot to access the content and simultaneously NOT provide it for free to non-subscribers. The above help article above should answer all of your questions. Hope this helps!
-
I would say no. The content of the article, beyond what you can already see, is not in the source code. They could be showing something different to Google, but if they did, it would be against Google's guidelines on cloaking: https://support.google.com/webmasters/answer/66355?hl=en