Cloaking for better user experience and deeper indexing - grey or black?
-
I'm working on a directory that has around 800 results (image-rich results) in the top-level view. This will likely grow over time, so it needs to support thousands.
The main issue is that it is built with AJAX, so the paginated pages are generated dynamically and look like duplicate content to search engines.
If we limit the number of results shown, then not all of the individual directory listing pages can be discovered.
I have an idea that gives both users and search engines what they want, but it relies on cloaking. Is it grey hat or black hat?
I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply.
To allow users to browse through the results without a single, slow-loading page, we include pagination links, but these are not shown to search engines.
This is a positive user experience.
For search engines, we display all results on a single page (there is no limit on the number of links, so long as they are not spammy).
This requires cloaking, but it ultimately serves the same content in slightly different ways.
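Roughly, the server-side logic would be something like this minimal sketch (assuming a Flask backend for illustration; the route, PAGE_SIZE, bot-detection list, and data-access stub are all hypothetical, not our actual stack):

```python
from flask import Flask, request

app = Flask(__name__)
PAGE_SIZE = 50  # hypothetical page size for human visitors

# Substrings that identify the major crawlers. Real detection would need
# to be more robust (e.g. verifying Googlebot via reverse DNS lookup).
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def load_all_listings():
    # Stand-in for the real data layer: ~800 listings today, thousands later.
    return ['<li><a href="/listing/%d">Listing %d</a></li>' % (i, i) for i in range(800)]

def is_search_engine(user_agent):
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

@app.route("/directory")
def directory():
    items = load_all_listings()
    if is_search_engine(request.headers.get("User-Agent", "")):
        # Crawlers get every listing on one flat page, so each individual
        # listing URL is discoverable in a single crawl, with no pagination.
        return "<ul>" + "".join(items) + "</ul>"

    # Humans get a fast, paginated view; subsequent pages load via AJAX
    # using the same ?page=N parameter.
    page = request.args.get("page", 1, type=int)
    start = (page - 1) * PAGE_SIZE
    return "<ul>" + "".join(items[start:start + PAGE_SIZE]) + "</ul>"
```

The listings served are identical either way; only the chunking differs, which is why I'm unsure whether this counts as deceptive.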
1. Where on the scale of white to black is this?
2. Would you do this for a client's site?
3. Would you do it for your own site?
-
I wish I could place this precisely on the scale for you. In my opinion, this is white hat. There is no intent to manipulate search results here; it is purely a usability issue, and this is the obvious fix.
-
Yes, I certainly would.
-
Yes, I certainly would.