Does it fall under cloaking in pagination?
-
I am trying to implement rel=next and rel=prev tags on my pages, but because of Firefox's prefetching feature, extra requests are hitting my server for a single page, and it is hurting my page performance.
The solutions I can think of are:
1. Increase my server capacity to handle the extra load smoothly - investing in this change is not possible.
2. Show these tags only when a bot crawls the pages, not when a user visits through a browser.
My question is: does option 2 fall under cloaking?
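To illustrate, this is roughly the markup I am adding - a minimal sketch, with the URL pattern (example.com/products?page=N) invented for this example:

```html
<!-- In the <head> of page 2 of a paginated series -->
<link rel="prev" href="https://www.example.com/products?page=1">
<link rel="next" href="https://www.example.com/products?page=3">
<!-- Firefox's link prefetching may fetch the rel="next" URL in the background,
     which is where the extra requests come from -->
```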
-
This URL contains some advanced tricks for specifically preventing prefetching by Firefox: http://www.petefreitag.com/item/312.cfm. I've only tried the .htaccess mod_rewrite technique, but I modified it to send prefetch attempts to an empty file instead of the normal 404 page (saving resources).
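Roughly, the modified rule looks like this - a minimal sketch assuming Firefox still flags prefetch requests with the "X-moz: prefetch" request header, and that an empty placeholder file exists at /empty.html (a path made up for illustration):

```apache
# .htaccess - serve an empty static file for Firefox prefetch requests
RewriteEngine On

# Firefox marks its prefetch requests with the "X-moz: prefetch" header
RewriteCond %{HTTP:X-moz} ^prefetch$ [NC]

# Hand those requests a tiny empty file instead of the real page
RewriteRule .* /empty.html [L]
```

Because the prefetch hits never reach your application code or database, they cost almost nothing, while real visitors and crawlers still get the normal page, so there is no cloaking concern with this approach.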
I would avoid showing the tags only to Googlebot. It looks a little spammy and cloaking-like, but more than that, adjusting content specifically for Google can be an involved coding process that comes with risks (accidentally showing Google something else entirely, etc.).
Related Questions
-
Help! Is this what is called "cloaking"?
A friend asked me to look at her website. I ran it through Screaming Frog and BAM - instead of the 4 pages I was expecting, it returned HUNDREDS, 99.9% of them for cheap Viagra and pharmaceuticals. I asked her if she was selling Viagra, which is fine, I don't judge, but she swears she isn't. The site is http://janeflahertyesq.com. I ran a Google search for site:janeflahertyesq.com and sure enough, if you click on some of those results, they take you to Canadian pharmacies selling half-priced blue pills. a) Is this cloaking? If not, what is going on? b) More importantly, how do we get those hundreds of pages removed / de-indexed? She's stumped and scared. Any help would be greatly appreciated. Thank you all in advance and for the work you do.
White Hat / Black Hat SEO | TeamPandoraBeauty
-
Mobile Redirect - Cloaking/Sneaky?
A question, since Google is somewhat vague about what they consider "equivalent" mobile content. This is the hand we're dealt due to budget: no m.dot site, and responsive/dynamic serving is on the roadmap but still a couple of quarters away. For now, here's the situation. We have two sets of content and experiences, one for desktop and one for mobile. The problem is that the desktop content does not equal the mobile content. The layout, user experience, images, and copy aren't the same across both versions - they are not dramatically different, but they are not identical, and in many cases no mobile equivalent exists. Dev wants to redirect visitors who find the desktop version in mobile search to the equivalent mobile experience when it exists; when it doesn't, they want to redirect to the mobile homepage - which really isn't a homepage, it's an unfiltered view of the content. We do have pushState in place for the mobile version, etc. My concern is that Google will look at this as cloaking - maybe not in the cases where there's a near-equivalent piece of content, but definitely when we're redirecting to the "homepage". Not to mention this isn't a great user experience and will hurt conversion/engagement metrics, which are likely factors Google's algorithm considers. What does the Moz community say about this? Cloaking or not, and why? Thanks!
White Hat / Black Hat SEO | Jose_R
-
Cloaking - is this still working? And how?
Hello, I recently read about the whole cloaking world. I searched for information about it on the internet and found this service: http://justcloakit.com/. Since I'm pretty new to this whole "cloaking world", I have a few questions for the experts in this field. Does this still work for SEO after all of Google's recent updates? How easy is it for someone who doesn't have much experience or knowledge of PHP and server-side matters? Are there more sites like the example above? In general I have the budget, and I don't think it's very hard to learn the technical part, but I just want to know whether this is something that still works - is it a good investment in your opinion? (It's not exactly cheap.) Cheers and thank you for your help.
White Hat / Black Hat SEO | WayneRooney
-
Cloaking/Malicious Code
Does anybody have any experience with software for identifying this sort of thing? I was informed by a team we are working with that our website may have been compromised and I wanted to know what programs people have used to identify cloaking attempts and/or bad code. Thanks everybody!
White Hat / Black Hat SEO | HashtagHustler
-
Cloaking for better user experience and deeper indexing - grey or black?
I'm working on a directory that has around 800 results (image-rich results) in the top-level view. This will likely grow over time, so it needs to support thousands. The main issue is that it is built in AJAX, so paginated pages are dynamically generated and look like duplicate content to search engines. If we limit the results, then not all of the individual directory listing pages can be found. I have an idea that serves users and search engines what they want but uses cloaking. Is it grey or black? I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply. To allow users to browse through the results (without a single page that has a slow load time), we include pagination links, but these are not shown to search engines. This is a positive user experience. For search engines we display all results on a single page (since there is no limit on the number of links so long as they are not spammy). This requires cloaking, but it is ultimately serving the same content in slightly different ways. 1. Where on the scale of white to black is this? 2. Would you do this for a client's site? 3. Would you do it for your own site?
White Hat / Black Hat SEO | ServiceCrowd_AU
-
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, we are considering using separate servers depending on whether a bot or a human lands on our site, to prevent overloading our servers. Just wondering if this is considered cloaking when the content remains exactly the same for both the bot and the human, just served from different servers. And if it isn't considered cloaking, will it affect the way our site is crawled, or hurt our rankings? Thanks
White Hat / Black Hat SEO | Desiree-CP
-
Dramatic fall in SERPs for all keywords at the end of March 2012 - help!
Hi, our website www.photoworld.co.uk had been improving its SERPs for the last 12 months or so, achieving page 1 rankings for most of our key terms. Then suddenly, around the end of March, we suffered massive drops for nearly all of our key terms (see the attached image for more info). Basically, I wondered if anyone has any clues about what Google has suddenly taken a huge dislike to on our site, and what steps we can put in place to aid rankings recovery ASAP. Thanks! (Attached image: n8taO.jpg)
White Hat / Black Hat SEO | cewe
-
Is showing pre-loaded content cloaking?
Hi everyone, another quick question. We have a number of different resources available to our users that load dynamically as the user scrolls down the page (like Facebook's Timeline), with the aim of improving page load time. Would it be considered cloaking if we had Googlebot index a version of the page containing all of the content that would load for a user who scrolled down to the bottom?
White Hat / Black Hat SEO | CuriosityMedia