Cloaking for better user experience and deeper indexing - grey or black?
-
I'm working on a directory that has around 800 results (image-rich results) in the top-level view. This will likely grow over time, so it needs to support thousands.
The main issue is that it is built with AJAX, so paginated pages are dynamically generated and can look like duplicate content to search engines.
If we limit the results, then not all of the individual directory listing pages can be found.
I have an idea that serves users and search engines what they want but uses cloaking. Is it grey or black?
I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply.
To allow users to browse through the results (without a single page that has a slow load time), we include pagination links, but these are not shown to search engines.
This is a positive user experience.
For search engines, we display all results on a single page (since there is no limit on the number of links, so long as they are not spammy).
This requires cloaking, but is ultimately serving the same content in slightly different ways.
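The scheme described above could be sketched roughly as follows. This is a hypothetical illustration, not code from the site in question: the function names (`is_crawler`, `build_listing`) and the crawler token list are my own assumptions, and real crawler detection would typically use reverse-DNS verification rather than user-agent strings alone.

```python
# Illustrative sketch: serve paginated results to browsers and the full
# result list to known crawlers. Names and thresholds are assumptions.

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def is_crawler(user_agent: str) -> bool:
    """Naive user-agent check; real systems should verify via reverse DNS."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def build_listing(results, user_agent, page=1, per_page=50):
    """Return the slice of results to render and whether to show pagination links."""
    if is_crawler(user_agent):
        # Crawlers get every result on one page, with no pagination links.
        return results, False
    # Humans get one page at a time, with pagination links shown.
    start = (page - 1) * per_page
    return results[start:start + per_page], True
```

Note that the content served is the same in both cases; only the chunking differs, which is the crux of the grey-vs-black question.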
1. Where on the scale of white to black is this?
2. Would you do this for a client's site?
3. Would you do it for your own site?
-
-
I wish I could accurately place this on a scale for you. In my opinion, I would consider this to be white hat. You have no intent of manipulating search results here - this is purely a usability issue, and this is the obvious fix.
-
Yes, I certainly would
-
Yes, I certainly would