Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Googlebot on paywall made with cookies and local storage
-
My question is about paywalls made with cookies and local storage. We are changing a website with free content to an open paywall with a five-article weekly view limit.
The paywall works with cookies and local storage. Article views are stored in local storage, but cookies must be enabled to read the free articles. If cookies are disabled, we serve an error page (otherwise the paywall would be easy to bypass).
How does this affect SEO? We would still like Google to index all the article pages it indexes now.
Would it be cloaking if we treated Googlebot differently, so that it could still index the page even when it does not have cookies enabled?
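The metering described in the question can be sketched as follows. This is a minimal illustration, not the asker's actual code: the storage interface is injected so the same logic works against `window.localStorage` in a browser or a Map-backed stub in tests, and the names and week-keying scheme are assumptions.

```typescript
// Minimal sketch of a client-side article meter (illustrative names only).
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const FREE_ARTICLES_PER_WEEK = 5;

// Cookies must be enabled, otherwise the meter could be wiped on every visit
// and the wall trivially bypassed - the site serves an error page instead.
function cookiesEnabled(cookieString: string): boolean {
  // e.g. pass document.cookie after setting a test cookie
  return cookieString.length > 0;
}

function articlesReadThisWeek(store: KeyValueStore, weekKey: string): number {
  return parseInt(store.getItem(`reads:${weekKey}`) ?? "0", 10);
}

// Returns true if the visitor may read one more article, recording the view;
// returns false once the weekly limit is reached (show the paywall).
function recordView(store: KeyValueStore, weekKey: string): boolean {
  const reads = articlesReadThisWeek(store, weekKey);
  if (reads >= FREE_ARTICLES_PER_WEEK) return false;
  store.setItem(`reads:${weekKey}`, String(reads + 1));
  return true;
}
```

In the browser you would call `recordView(window.localStorage, currentWeekKey())` on each article load (`currentWeekKey()` being a hypothetical helper returning a key like `"2024-W50"`) and redirect to the paywall, or to the cookies-disabled error page, when the checks fail.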
-
Thank you for your answer!
Yes, that is exactly the case.
We have been testing this, and Googlebot doesn't seem to hit the wall at all with the normal settings on. Given these results, it seems we don't need to treat Googlebot differently, because it doesn't appear to store any cookie or local storage data.
Tech-savvy users can bypass the paywall by other means as well, so that's not a big concern for us.
-
To make sure I'm understanding your question correctly: you want Google to crawl and index all your content, but you want visitors to see a metered paywall that allows five free articles before blocking further access.
Yes, it would technically be treated as cloaking, but you have a legitimate reason for doing so, and intent matters a great deal. You could check for a search engine user-agent string such as "Googlebot" and serve the full content to it. This would ensure that your content is still crawled and indexed.
The only downside is that any tech-savvy individual can spoof the request by setting their User-Agent header to "Googlebot" and bypass your paywall.
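A common mitigation for that spoofing risk is Google's documented Googlebot verification: reverse-DNS the requesting IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve the hostname and check it maps back to the same IP. A minimal Node sketch (function and variable names are illustrative, not from the thread):

```typescript
import { promises as dns } from "node:dns";

// Googlebot's reverse-DNS hostnames end in one of these official domains.
const GOOGLE_DOMAINS = [".googlebot.com", ".google.com"];

function isGoogleHostname(hostname: string): boolean {
  return GOOGLE_DOMAINS.some((d) => hostname.toLowerCase().endsWith(d));
}

// Full check: reverse DNS, verify the domain, then forward DNS and confirm
// it resolves back to the original IP. Sketch only - production code would
// cache verified IPs and decide how to handle lookup failures.
async function isRealGooglebot(ip: string): Promise<boolean> {
  try {
    const hostnames = await dns.reverse(ip);
    for (const host of hostnames) {
      if (!isGoogleHostname(host)) continue;
      const addresses = await dns.resolve(host);
      if (addresses.includes(ip)) return true;
    }
  } catch {
    return false;
  }
  return false;
}
```

Only `isGoogleHostname` is pure; `isRealGooglebot` performs live DNS lookups, so you would typically run it once per new IP and cache the verdict rather than on every request.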
Related Questions
-
I need help in doing Local SEO
Hey guys, I hope everyone is doing well. I am new to the SEO world and I want to do local SEO for one of my clients. The issue is I do not know how to do local SEO at all, or even where to start. I would appreciate it if anyone could help me or point me to an article or a course on how to do it. Main question: I want my website to show up in the top 3 Google Maps results for different locations (there is one actual location). For example, I want to show up for
online clothing store in new york
online clothing store in los angeles or... Let's assume that we can ship our product to every other city. I hope that conveys what I mean. I'd appreciate it if you could answer with practical solutions.
Intermediate & Advanced SEO | seopack.org.ofici3
Local SEO - two businesses at same address - best course of action?
Hi Mozzers - I'm working with 2 businesses at the moment, at the same address - the only difference between the two is the phone number. I could ask to split the business addresses apart, so that NAP (name, address, phone number) is different for each business (only the postcode will be the same). Or simply carry on as at the moment, with the Ns and Ps different, yet the As the same - the same address for both businesses. I've never experienced this issue before, so I'd value your input. Many thanks, Luke
Intermediate & Advanced SEO | McTaggart
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere looking for the pizza joints nearby, we pick up your current location and share the list of nearby pizza outlets along with ratings, reviews, etc. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page): it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor mapped to any category. We're also dealing in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers. Now we're working on a URL restructure project, and my question to you all SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure at the backend.
Intermediate & Advanced SEO | _nitman
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate - overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment / overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity whatever that capacity is. Questions to Enterprise SEOs: * Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. This indicates there is some upper limit - which we perhaps haven't reached - but which would stabilize once reached. * We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable Enterprise SEO would seek to throttle back? * What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. Thanks
Intermediate & Advanced SEO | lzhao
Why would our server return a 301 status code when Googlebot visits from one IP, but a 200 from a different IP?
I have begun a daily process of analyzing a site's Web server log files and have noticed something that seems odd. There are several IP addresses from which Googlebot crawls that our server returns a 301 status code for every request, consistently, day after day. In nearly all cases, these are not URLs that should 301. When Googlebot visits from other IP addresses, the exact same pages are returned with a 200 status code. Is this normal? If so, why? If not, why not? I am concerned that our server returning an inaccurate status code is interfering with the site being effectively crawled as quickly and as often as it might be if this weren't happening. Thanks guys!
Intermediate & Advanced SEO | danatanseo
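The daily log analysis described in the question above can be scripted. A minimal sketch, assuming an Apache/Nginx combined log format (the regex and all names are illustrative, not the poster's actual tooling), that tallies response status codes per client IP for Googlebot requests:

```typescript
// Tally status codes per client IP for requests whose User-Agent mentions
// Googlebot. Captures: [1] client IP, [2] status code, [3] user-agent.
const LOG_LINE =
  /^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"/;

function statusByGooglebotIp(
  lines: string[],
): Map<string, Map<string, number>> {
  const counts = new Map<string, Map<string, number>>();
  for (const line of lines) {
    const m = LOG_LINE.exec(line);
    if (!m || !m[3].includes("Googlebot")) continue; // skip non-Googlebot hits
    const [, ip, status] = m;
    const perIp = counts.get(ip) ?? new Map<string, number>();
    perIp.set(status, (perIp.get(status) ?? 0) + 1);
    counts.set(ip, perIp);
  }
  return counts;
}
```

A per-IP breakdown like this makes the pattern in the question (301s from some crawler IPs, 200s from others) easy to spot and diff day over day.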
Where is the best place to put a sitemap for a site with local content?
I have a simple site that has cities as subdirectories (so the URL is root/cityname). All of my content is localized for the city. My "root" page simply links to the other cities. I very specifically want to rank for "topic" pages for each city, and I'm trying to figure out where to put the sitemap so Google crawls everything most efficiently. I'm debating the following options; which one is better? (1) Put the sitemap in the footer of "root" and link to all popular pages across cities. The advantage here is obviously that the links are one less click away from root. (2) Put the sitemap in the footer of "city root" (e.g. root/cityname) and include all topics for that city. This is how Yelp does it. The advantage here is that the content is "localized", but the disadvantage is it's further from the root. (3) Put the sitemap in the footer of "city root" and include all topics across all cities. That way, wherever Google enters the site, it will be close to all the topics I want to rank for. Thoughts? Thanks!
Intermediate & Advanced SEO | jcgoodrich
How to perform Local SEO for sites like Angies List/Task Rabbit or Craigslist
I have a new SEO client that has a business model similar to Craigslist, Angie's List, or TaskRabbit, where they offer locally based services nationwide. My first thought was local link building, citation building, etc. But the issue is they are a purely online service company, and they don't have a physical address in every city/state they will be offering their services in. What is the best course of action for providing SEO services for this type of business model? I am pretty much at a standstill on how to rank them locally for the areas they provide services in. It's a business model that involves local businesses and customers looking for services from those local businesses.
Intermediate & Advanced SEO | VITALBGS
Code to change country in URL for locale results
How do I change the code in my URL to search in Google by specific location?
Intermediate & Advanced SEO | theLotter