Cloaking for better user experience and deeper indexing - grey or black?
-
I'm working on a directory that has around 800 results (image-rich results) in the top-level view. This will likely grow over time, so it needs to support thousands.
The main issue is that it is built with AJAX, so the paginated pages are generated dynamically and look like duplicate content to search engines.
If we limit the results, then not all of the individual directory listing pages can be found.
I have an idea that serves users and search engines what they want but uses cloaking. Is it grey or black?
I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply.
To allow users to browse through the results (without a single page that is slow to load), we include pagination links, but these links are not shown to search engines.
This is a positive user experience.
For search engines we display all results on a single page (since there is no limit on the number of links, so long as they are not spammy).
This requires cloaking, but ultimately it serves the same content in slightly different ways.
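To make the idea concrete, here is a minimal sketch of the kind of user-agent check being proposed: crawlers get the full result set on one page, while human visitors get a paginated slice. The function names, crawler token list, and page size are illustrative assumptions, not an endorsement of the approach.

```python
# Hypothetical sketch of the proposed cloaking approach.
# CRAWLER_TOKENS and PAGE_SIZE are assumed values for illustration.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")
PAGE_SIZE = 50

def is_crawler(user_agent: str) -> bool:
    """Crude user-agent sniff: does the UA string mention a known crawler?"""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def select_results(results: list, user_agent: str, page: int = 1) -> list:
    """Return the slice of directory results to render for this request."""
    if is_crawler(user_agent):
        # Search engines see every listing on a single page.
        return results
    # Human visitors see one paginated slice at a time.
    start = (page - 1) * PAGE_SIZE
    return results[start:start + PAGE_SIZE]
```

Both branches draw from the same underlying result list, which is the questioner's point: the content is identical, only the layout differs per audience.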
1. Where on the scale of white to black is this?
2. Would you do this for a client's site?
3. Would you do it for your own site?
-
-
I wish I could accurately place this on a scale for you. In my opinion, this is white hat: you have no intent to manipulate search results here - it is purely a usability issue, and this is the obvious fix.
-
Yes, I certainly would.
-
Yes, I certainly would.
-