Cloaking for better user experience and deeper indexing - grey or black?
-
I'm working on a directory that has around 800 results (image-rich results) in the top-level view. This will likely grow over time, so it needs to support thousands.
The main issue is that it is built in AJAX, so paginated pages are dynamically generated and look like duplicate content to search engines.
If we limit the results, then not all of the individual directory listing pages can be found.
I have an idea that gives both users and search engines what they want, but it uses cloaking. Is it grey or black?
I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply.
To allow users to browse through the results (without a single page that has a slow load time), we include pagination links that are not shown to search engines.
This is a positive user experience.
For search engines, we display all results on a single page (since there is no limit on the number of links, so long as they are not spammy).
This requires cloaking, but it is ultimately serving the same content in slightly different ways.
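Roughly, the idea looks like this. This is only a minimal sketch (the Flask app, the bot list, and the helper names are hypothetical, not our actual stack); the point is just that both branches serve the same underlying listings:

```python
# Minimal sketch of the idea (illustrative only; Flask and these helper
# names are hypothetical, not our actual stack).
from flask import Flask, request

app = Flask(__name__)

BOT_TOKENS = ("googlebot", "bingbot", "slurp")
PAGE_SIZE = 50

# Placeholder data standing in for the real directory (~800 listings and growing).
LISTINGS = [{"id": i, "name": f"Listing {i}"} for i in range(800)]


def is_search_bot() -> bool:
    """Very rough user-agent sniff; real bot detection would be more robust."""
    ua = (request.headers.get("User-Agent") or "").lower()
    return any(token in ua for token in BOT_TOKENS)


def render(listings) -> str:
    """Render listings as plain links so each listing page is crawlable."""
    return "".join(
        f'<a href="/listing/{item["id"]}">{item["name"]}</a><br>' for item in listings
    )


@app.route("/directory")
def directory():
    if is_search_bot():
        # Search engines get every result on one page, so every
        # individual listing URL is reachable through plain links.
        return render(LISTINGS)

    # Regular users get a fast, paginated view (enhanced with AJAX client-side).
    page = max(int(request.args.get("page", 1)), 1)
    start = (page - 1) * PAGE_SIZE
    chunk = LISTINGS[start:start + PAGE_SIZE]
    nav = f'<a href="/directory?page={page + 1}">Next page</a>'
    return render(chunk) + nav
```

Either way, every individual listing URL ends up reachable through plain links; users just get them in smaller, faster pages.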
1. Where on the scale of white to black is this?
2. Would you do this for a client's site?
3. Would you do it for your own site?
-
I wish I could accurately place this on the scale for you. In my opinion this is white hat. You have no intent of manipulating search results here; this is entirely a usability issue, and this is the obvious fix.
-
Yes, I certainly would
-
Yes, I certainly would
-
Related Questions
-
Difference between White Hat/Black Hat?
Hey guys, can you elaborate on the difference between White Hat and Black Hat? White Hat: getting backlinks from relevant websites so that the links look natural and the anchor text looks natural too. Black Hat: getting backlinks from a different niche (like Unionwell) on high-DA websites just to get the link. I think this is the difference, but I need some confirmation; maybe I'm wrong, because I'm a newbie to SEO link building.
White Hat / Black Hat SEO | saimkhanna
Disappearing Links: Black Hat?
I have seen reports of black hat spamming with dodgy links, but we have another issue with a client's site. The site had a small number of solid followed links (about 60) which had been in place for years, and in the past few weeks all but those directly under their control have ceased to link. At the same time, a very aggressive competitor, owned by the officers of an SEO company, has entered their market. Could it be that they have somehow disavowed the links to the site to damage it, and how do we find out? There are now just 10 followed links.
White Hat / Black Hat SEO | Eff-Commerce
Seeking Top Notch Marketing Company with experience in growing sites post manual penalty
Does anyone know of a company that has direct experience with growing websites AFTER a manual link penalty has been lifted? Any referrals would be great!
White Hat / Black Hat SEO | WebServiceConsulting.com
Is there such a thing as white hat cloaking?
We are near the end of a site redesign, and it turns out the site is built in JavaScript and is not search-engine friendly. Our IT team's fix is to show crawlable content to Googlebot and other crawlers based on the user agent. I told them this is cloaking and I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same it is an acceptable way to cloak. About 90% of the content will be the same between the "regular user" version and the content served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on this? Thanks!
White Hat / Black Hat SEO | CHECOM
Duplicate user reviews from a hotel-based database?
Hello, I just got a new client who has a hotel comparison site. The problem is that the reviews and the hotel data are all pulled in from a database which is shared and used by other website owners. This obviously raises the issue of duplicate content and Panda. I read this post by Dr. Pete: http://www.seomoz.org/blog/fat-pandas-and-thin-content but am unsure what steps to take. Any feedback would be much appreciated. It's about 200,000 pages. Thanks, Shehzad
White Hat / Black Hat SEO | shehzad
Methods for getting links to my site indexed?
What are the best practices for getting links to my site indexed in search engines? We have been creating content and acquiring backlinks for the last few months, but they are not being found in the backlink checkers or in Open Site Explorer. What are the tricks of the trade for improving the time to indexing of these links? I have read about some RSS methods using WordPress sites, but that seems a little shady and I am sure Google is looking for that now. I look forward to your advice.
White Hat / Black Hat SEO | devonkrusich
Recovering From Black Hat SEO Tactics
A client recently engaged my services to deliver foundational white hat SEO. On auditing the site, I discovered a tremendous number of black hat SEO tactics employed by their former SEO company. I'm concerned that the old company's efforts, including forum spamming, irrelevant backlink development, exploiting code vulnerabilities on bulletin boards, and other messy practices, could negatively influence the target site's campaigns for years to come. The site owner handed over hundreds of pages of paperwork from the old company detailing their black hat SEO efforts; the sheer amount of data is insurmountable. I took just one week of reports and tracked back the links to find that 10% of the accounts were banned, 20% were tagged as abusive, some of the sites were shut down completely, and there were WOT reports of abusive practices and mentions of the site being blacklisted by BB control programs. My question is simple: how does one mitigate the negative effects of old black hat SEO efforts and move forward with white hat solutions when faced with hundreds of hours of black gunk to clean up? Is there a clean way to eliminate the old efforts without contacting every site administrator and requesting removal of content/profiles? This seems daunting, but my client is a wonderful person who got in over her head, paying for a service she did not understand. I'd really like to help her succeed. Craig Cook
http://seoptimization.pro
info@seoptimization.pro
White Hat / Black Hat SEO | SEOptPro
NYT article on JC Penney's black hat campaign
Saw this article on JC Penney receiving a 'manual adjustment' to drop their rankings by 50+ spots: http://www.nytimes.com/2011/02/13/business/13search.html Curious what you guys think they did wrong, and whether or not you are aware of their SEO firm, SearchDex. Was it a simple case of low-quality spam links, or was there more to it? Has anyone studied them in Open Site Explorer?
White Hat / Black Hat SEO | scanlin