Is there such a thing as white hat cloaking?
-
We are near the end of a site redesign and have just found out it's built in JavaScript and not search-engine friendly. Our IT team's fix is to show crawlable content to Googlebot and other crawlers by detecting their user agents. I told them this is cloaking, and I'm not comfortable with it. They said that, after doing research, if the content is essentially the same, it is an acceptable way to cloak. About 90% of the content will be the same between the "regular user" version and the content served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on the subject?
Thanks!
-
We have the same issue with our site HelloCoin. It's pure AJAX/JavaScript, so we make a second, no-JavaScript version of every page for Googlebot to crawl, and we keep it as similar as possible to the original user-facing version. Just don't hide anything and show everything as it is; some functionality might not work, but that isn't an issue. Google just wants to see how the page looks to the user, not how it works.
-
It is acceptable and fairly common. Imagine you had a 100% Flash site. The bots can figure out some of the content, but not much, so they actually need you to serve up a different version of your site so that they know what's there and can index you properly. As long as the content is the same, it shouldn't be an issue.
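As a minimal sketch of the user-agent detection the thread describes (the crawler list and function names here are hypothetical, not an exhaustive or official list), the server-side decision usually looks like this: known crawlers get the pre-rendered, static HTML version, and everyone else gets the full JavaScript app. The key constraint from this thread is that both versions must carry essentially the same content, or the setup crosses into cloaking.

```python
import re

# Substrings identifying major crawlers; hypothetical, not an exhaustive list.
CRAWLER_PATTERN = re.compile(
    r"googlebot|bingbot|slurp|duckduckbot|baiduspider", re.IGNORECASE
)

def choose_version(user_agent: str) -> str:
    """Return which rendering to serve: a static, no-JavaScript version
    for known crawlers, the full AJAX app for regular visitors.

    The content of both versions must be essentially identical; the
    difference should only be in how it is delivered, not what it says.
    """
    if user_agent and CRAWLER_PATTERN.search(user_agent):
        return "static-html"
    return "ajax-app"
```

Whether this counts as acceptable "serving for crawlability" or penalty-worthy cloaking comes down entirely to how faithful the static version is to the user-facing one, which is the 90%-the-same question the original poster raises.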
Related Questions
-
Mobile Redirect - Cloaking/Sneaky?
Question, since Google is somewhat vague on what they consider mobile "equivalent" content. This is the hand we've been dealt due to budget: no m.dot, and responsive/dynamic serving is on the roadmap but still a couple of quarters away. For now, here's the situation.

We have two sets of content and experiences, one for desktop and one for mobile. The problem is that the desktop content does not equal the mobile content. The layout, user experience, images, and copy aren't the same across both versions; they are not dramatically different, but they're not identical. In many cases, no mobile equivalent exists.

Dev wants to redirect visitors who find the desktop version in mobile search to the equivalent mobile experience when it exists; when it doesn't, they want to redirect to the mobile homepage, which really isn't a homepage so much as an unfiltered view of the content. Yes, we have pushState in place for the mobile version, etc.

My concern is that Google will look at this as cloaking: maybe not in the cases where there's a near-equivalent piece of content, but definitely when we're redirecting to the "homepage." Not to mention this isn't a great user experience and will hurt conversion/engagement metrics, which are likely factors Google's algorithm considers. What does the Moz community say about this? Cloaking or not, and why? Thanks!
White Hat / Black Hat SEO | | Jose_R0 -
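One way to frame the redirect decision above as a sketch (the URL map and helper are hypothetical): redirect mobile visitors only when a true equivalent page exists, and otherwise serve the desktop page as-is rather than dumping users on the mobile homepage, which Google documents as a "faulty redirect."

```python
from typing import Optional

# Hypothetical desktop-to-mobile URL map; in a real site this would come
# from the CMS or routing table.
DESKTOP_TO_MOBILE = {
    "/products/widget-a": "/m/products/widget-a",
    "/about": "/m/about",
}

def mobile_redirect_target(desktop_path: str) -> Optional[str]:
    """Return the mobile URL to redirect a mobile visitor to, or None
    when no true equivalent exists.

    Returning None (serve the desktop page unchanged) avoids the
    redirect-to-homepage pattern the question worries about.
    """
    return DESKTOP_TO_MOBILE.get(desktop_path)
```

Pairing this with rel="alternate" on the desktop pages and rel="canonical" on the mobile pages is Google's documented way to declare the two URL sets equivalent, which also addresses the cloaking concern directly.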
Recovering from Black Hat/Negative SEO with a twist
Hey everyone, this is a first for me. I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.

Scenario: In the process of designing a new site for a client, we discovered that his previous site, although it had decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely there was a vulnerability somewhere that allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104, and so on. These dynamic pages ended up being malware with a trojan horse that our servers recognized and subsequently blocked access to. We have since helped them fix the vulnerability and remove the malware that was creating these crappy dynamic pages. Another automated program appears to have recently been blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and it is still actively doing so. Right now we're looking at a small business website with a touch over 500k low-quality, spammy links pointing to malware pages from the previously compromised site.

Important: As of right now, there's been no manual penalty on the site, nor a "This site may have been compromised" marker in the organic search results. We were able to discover this before things got too bad for them.

Next steps? The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and potentially slap a penalty on the site. The main questions I have are: Should we proactively report this to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1) Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a "hacked" snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795) Is submitting a massive disavow file right now, including the 490k-something domains, the only way to escape the wrath of Google when these links are discovered? Or is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit? Would love some input or examples from anyone who can help. Thanks in advance!
White Hat / Black Hat SEO | | Etna0 -
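For the disavow option mentioned above, the file Google's disavow tool accepts is plain text: optional `#` comment lines plus one `domain:` entry per line (disavowing at the domain level covers every link from that host). A minimal sketch of generating one, with a hypothetical helper and placeholder domains:

```python
def build_disavow_file(spam_domains):
    """Build the text of a disavow file in the format Google's tool
    accepts: '#' comment lines plus one 'domain:' entry per line.

    Deduplicates and sorts the input; the domains passed in here are
    placeholders for illustration.
    """
    lines = ["# Spam domains pointing at previously hacked/malware pages"]
    lines += [f"domain:{d}" for d in sorted(set(spam_domains))]
    return "\n".join(lines) + "\n"
```

At the scale described (hundreds of thousands of domains), the list would be exported from a backlink tool rather than typed by hand, but the output format is the same.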
Two plus two equals four! Grey hat alive and well
Rand is unquestionably much smarter than I am; however, his pronouncements concerning link building don't seem to hold true for some sectors of the online marketplace. We sell upholstery leather, and one of our main competitors runs the table on the most important search terms despite a total garbage backlink profile. I don't know if there is some on-site magic they are working, but they don't use brand-name anchor text, their links are not relevant to their products, and most of their links are from high-DA blogs, crap posts to .edu forums, and nofollow sources. The point is, maybe black hat is out, but a lot of what I see being rewarded out there suggests grey hat is alive and well.
White Hat / Black Hat SEO | | leatherhidestore0 -
White Hat/Black Hat: Incentivized SEO Competition?
General idea: Rules: the winner is the person who ranks highest for "Random Easy-to-Rank-for Key Phrase." Prize: some cool prize. White or black hat?
White Hat / Black Hat SEO | | LaunchAStartup0 -
What are the best methods of White Hat SEO?
What are the best methods of White Hat SEO? How can you create good-quality white hat links? For example, how do you convince someone to link to your site?
White Hat / Black Hat SEO | | harrygardiner0 -
Is this a white hat SEO tactic?
Hi, I just noticed this website http://www.knobsandhardware.com hosts pages like http://www.knobsandhardware.com/local/hardware/California-Cabinet-Hardware.html that are filled with permutations of products + cities. These pages rank for these long tail phrases. Is this considered white hat?
White Hat / Black Hat SEO | | anthematic0 -
How is this obvious black hat technique working in Google?
Get ready to have your minds blown. Try a search in Google for any of these: proform tour de france, tour de france trainer, tour de france exercise bike, proform tour de france bike.

In each instance you will notice that Proform.com, the maker of the bike, is not #1. In fact, the same guy is #1 every time, and this is the URL: www.indoorcycleinstructor.com/tour-de-france-indoor-cycling-bike

Here's the fun part. Click on that result and guess where you go? Yup, Proform.com, the exact same page ranking right behind it, in fact. Actually, this URL first redirects to an affiliate link, and that affiliate link redirects to Proform.com.

I want to know two things. First, how on earth did they do this? They got to #1 ahead of Proform's own page. How was it done? Second, how have they not been caught? Are they cloaking? How does Google rank a double 301 redirect in the top spot whose end destination is the #2 result?

PS: I have a site in this industry, and this is how I caught it and why it is of particular interest. I just can't figure out how it was done or why they have not been caught. Not because I plan to copy them, but because I plan to report them to Google and want to have some ammo.
White Hat / Black Hat SEO | | DanDeceuster0 -
What happens if a company only uses black hat techniques for an extended period of time?
Let's say I were to start a company. Of course, I want to be indexed, crawled, and pulled up in the search engines, so I start using black hat SEO techniques: I comment spam, keyword stuff, spin articles, hide text, etc. I publish hundreds of articles per day on well-known sites with excellent PageRank. If I am doing all of these unethical techniques, what is going to happen to my website?
White Hat / Black Hat SEO | | FrontlineMobility0