Is there such a thing as white hat cloaking?
-
We are near the end of a site redesign and have come to find out that it's built in JavaScript and isn't search engine friendly. Our IT team's fix is to show crawlable content to Googlebot and the other bots by detecting their user agents. I told them this is cloaking and I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same it is an acceptable way to cloak. About 90% of the content will be the same between what the regular user sees and what is served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on it?
Thanks!
-
We have the same issue with our site HelloCoin: it's pure Ajax/JavaScript, so we make a second, no-JavaScript version of every page for Googlebot to crawl, and we keep it as similar as possible to the original (user) version. Just don't hide anything and show everything as it is. Some functionality might not work, but that's not an issue; Google just wants to see how the page looks to the user, not how it works.
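For anyone picturing how this works, here's a minimal sketch, assuming a Node/Express front end (the bot patterns, directory names, and port are placeholders, not anything HelloCoin actually uses): known crawlers get a pre-rendered, script-free snapshot of the page, and everyone else gets the normal JavaScript app.

```typescript
import express from "express";
import path from "path";

const app = express();

// Substrings commonly found in crawler user agents (placeholder list).
const BOT_UA = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function isCrawler(userAgent: string | undefined): boolean {
  return userAgent !== undefined && BOT_UA.test(userAgent);
}

app.get("*", (req, res) => {
  // Map "/" to index.html and "/some/page" to some_page.html in the snapshot folder.
  const page =
    req.path === "/" ? "index" : req.path.replace(/^\//, "").replace(/\//g, "_");

  if (isCrawler(req.get("user-agent"))) {
    // Crawlers get the pre-rendered, script-free snapshot of the same page.
    res.sendFile(path.join(__dirname, "snapshots", `${page}.html`));
  } else {
    // Regular visitors get the normal JavaScript application shell.
    res.sendFile(path.join(__dirname, "app", "index.html"));
  }
});

app.listen(3000);
```

The user-agent check only changes the delivery format; the snapshot content itself should mirror what users see, which is the whole point.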
-
It is acceptable and completely common. Imagine you had a 100% Flash site. The bots can figure out some of the content, but not a lot, so they actually need you to serve up a different version of your site so that they know what's there and can index you properly. As long as the content is the same, it shouldn't be an issue.
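One way to keep yourself honest on the "content is the same" part is to spot-check pages by fetching them with a bot user agent and a browser user agent and comparing the visible text. A rough sketch follows (Node 18+ for the built-in fetch; the 90% threshold is arbitrary, and for a JavaScript-only site you would really render the user version in a headless browser rather than doing a plain fetch):

```typescript
// Fetch the same URL with two different User-Agent headers and compare the
// visible text, as a crude parity check between bot and user versions.

const BOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
const BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)";

async function fetchVisibleText(url: string, userAgent: string): Promise<string> {
  const res = await fetch(url, { headers: { "User-Agent": userAgent } });
  const html = await res.text();
  // Crude tag stripping; fine for a sanity check, not for real HTML parsing.
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim()
    .toLowerCase();
}

function sharedWordRatio(a: string, b: string): number {
  const wordsA = new Set(a.split(" "));
  const wordsB = new Set(b.split(" "));
  const shared = [...wordsA].filter((w) => wordsB.has(w)).length;
  return shared / Math.max(wordsA.size, wordsB.size, 1);
}

async function main(): Promise<void> {
  const url = process.argv[2] ?? "https://www.example.com/";
  const [botText, userText] = await Promise.all([
    fetchVisibleText(url, BOT_UA),
    fetchVisibleText(url, BROWSER_UA),
  ]);
  const ratio = sharedWordRatio(botText, userText);
  console.log(`Shared-word ratio for ${url}: ${(ratio * 100).toFixed(1)}%`);
  if (ratio < 0.9) {
    console.warn("Versions differ substantially; review before serving bots separately.");
  }
}

main().catch(console.error);
```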
Related Questions
-
Recovering from Black Hat/Negative SEO with a twist
Hey everyone, this is a first for me. I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.

Scenario: In the process of designing a new site for a client, we discovered that his previous site, although it had decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely there was a vulnerability somewhere that allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104 and so on. These dynamic pages ended up serving malware with a trojan our servers recognized and subsequently blocked access to. We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages. Another automated program appears to have been recently blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and is still actively doing so. Right now we're looking at a small business website with a touch over 500k low-quality, spammy links pointing to malware pages from the previously compromised site.

Important: As of right now, there's been no manual penalty on the site, nor is there a "This site may have been compromised" marker in the organic search results for the site. We were able to discover this before things got too bad for them.

Next steps? The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and then potentially slap a penalty on the site. The main questions I have are:

1. Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
2. Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a "hacked" snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
3. Is submitting a massive disavow file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? (See the sketch after this list.)
4. Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?

Would love some input or examples from anyone who can help. Thanks in advance!
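On question 3: with roughly 490k referring domains, you would be generating the disavow file programmatically anyway. Here's a minimal sketch, assuming you've exported one spammy referring domain per line from your backlink tool (the file names are hypothetical; the domain: syntax is the format Google's disavow tool accepts):

```typescript
import { readFileSync, writeFileSync } from "fs";

// Read one referring domain per line (exported from a backlink tool) and
// write a disavow file that disavows each one at the domain level.
const inputFile = "spam-domains.txt"; // hypothetical export
const outputFile = "disavow.txt";

const domains = readFileSync(inputFile, "utf8")
  .split(/\r?\n/)
  .map((line) => line.trim().toLowerCase())
  .filter((line) => line.length > 0 && !line.startsWith("#"));

const unique = [...new Set(domains)];

const lines = [
  `# Disavow file generated ${new Date().toISOString().slice(0, 10)}`,
  "# Comment spam / directory links pointing at previously hacked pages.",
  ...unique.map((domain) => `domain:${domain}`),
];

writeFileSync(outputFile, lines.join("\n") + "\n", "utf8");
console.log(`Wrote ${unique.length} domain entries to ${outputFile}`);
```

Whether to submit it pre-emptively is the judgment call; the file itself is the easy part.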
White Hat / Black Hat SEO | | Etna0 -
What is left that's ethical? What is working for off-page SEO? Very long write-up in here, and my take on things.
Hello,
White Hat / Black Hat SEO | | MarketingOfAmerica
Please excuse the misspellings and grammar; this was typed quickly, as I'm spending my time researching rather than writing a perfect book on it. My goal is to find ethical, very-hard-to-get links, unlike guest posts, which are now dead according to Matt Cutts's blog here: http://www.mattcutts.com/blog/guest-blogging/. My journey started with a quick message to Rand Fishkin, and he responded with the following: "Hi Matthew - thankfully, there's literally hundreds of link building methodologies that are still completely legit. Check out http://moz.com/blog/category/link-building and you'll find tons and tons of them. The key is that none are easy, none are particularly scalable, and all of them require doing work that will add value for searchers, for your brand, and for your overall marketing - which is exactly what Google wants to count. Wish you all the best," Thanks, Rand Fishkin!

So I started my search for links that are hard to get, other than directories, forum links that are dead and spammy, blog comments that are overused, guest posts, or any type of black hat link. I figured I would start by checking what popular SEO companies were doing, specifically ones that have stayed at the top through many of the updates. After running an analysis on the term "SEO services" I found the following.

Test 1: I analyzed Main Street Host to start with. If you type "SEO services" into Google you can see they rank #1 for it. After a quick analysis, it's easy to see that they have hundreds of footer links on their clients' sites, some with exact match anchors and some without. My question is: why is this a viable tactic? Let's take the following example. If you pull up their http://www.opensiteexplorer.org/ stats and look at the inbound links, you will come to an exact match anchor right away that says "SEO Marketing Company." I went to the Weebly link they have and found that they put their name at the bottom of the page.

Issue 1: Why is it OK for this type of link, but not OK for a template link? Aren't these links supposed to be penalized?

Issue 2: Nothing on the page is even relevant to their link at all. As we have read before, links need to be surrounded by relevant text. Take a look at their backlinks and you will find that almost all of their high-quality links are exact match anchors coming from their clients, surrounded by irrelevant text. Why is this working? How is this different from a network? What stops someone from just starting a network, dedicating one footer link on each site to a target site, and putting up dummy info? Anyone can go to GoDaddy, purchase a DA 40+ site or so, and throw up $20 of content and a footer link.

As I dove deeper into finding what is ethical and working, I discovered that many of the top SEO companies use this. Not just one, but over 20 of them use this same method. Let's use another example. I started to look at what they did for their clients. How did I know who they worked for? Simple: I assume that since they have their link at the bottom of a page and claim they do SEO for that site, they are indeed working for it. So I analyzed the Weebly site we talked about a moment ago that they had their link on (it's the Valley Art Weebly link if you're checking yourself). I quickly found that they are using a network to rank up some of their clients as well. For example, take a look at http://firesidebookshop.com/index.html and the link on that page leading to the art site. At first glance the site doesn't look spammy, but try to buy a book, or even order one. Who has an online bookstore but doesn't sell books, lol? Who also puts interesting links on their home page? This screams network to me.

I am willing to bet the following will happen: Matt Cutts and his spam team will add something like this to the algorithm (or whatever you would like to call it): "ignore a link if the total number of outbound dofollow links on the whole site is X or higher." That is effectively an internal Google disavow tool, and goodbye to guest blogging (a rough sketch of that rule follows this post). So what is everyone going to do? Okay, it's time to figure out what that number is, right? Let's do some tests and say the magic number is 5 to 10 links on a whole site. What does this do? It drives the price of quick SEO up again, evening the playing field for others using ethical SEO like myself. How do I figure this? Let's face it, black hat SEO will never end as long as someone is able to do it. Now that guest posts are gone, the quick link on a quality site surrounded by enough text to count is gone too. This means it will cost extra money, because everyone will be forced to put a maximum of X links per site to be safe and for the links to get noticed. So now they have to purchase an established domain of high enough quality to pass the right link juice through to the client's site they want to rank up. Let's figure a few dollars for a unique IP, another few for hosting, $40 to $100 for the domain if you're lucky on GoDaddy auctions, and then $40 for the content to make it look realistic if you're getting it for $0.01 a word, plus the time it takes to set up the site. The price of that $30 oDesk guest post backlink just went up to a minimum of $100 or so.

Diving deeper into what's working and moving past the networks, because I feel they will only work temporarily even if you are brave enough to use them (and I know I am not): it doesn't seem too ethical to me at the end of the day, even though some may argue you are just creating more relevant websites that can maximize your traffic streams. The problem is that I have stopped here and am stuck. Sure, I have looked at http://moz.com/blog/category/link-building and read the most recent post, which talks about 31 types of links. Most of those don't apply or are outdated, and you shouldn't use them. Some of them talk about forum links, directories, and bookmarks; those have been tactics for years, and sure, you may find 1 out of 1,000 that is good, but the rest are just spam. I have been over to Search Engine Land and a handful of other sites, and I have talked to many other SEOs as well. They are emailing me asking what they should do now that guest posts are gone, because they are unsure.

The question is, what is ethical? Let's say you have a plumber or a roofer: .gov links are nearly impossible for them, and quite frankly it seems spammy to me to even post their links on one. I know what many are going to say: build links as if you're not worried about Google and you will grow. But where are you going to build links if everything is unethical? As we know, clients will walk if they don't see improvements quickly. What's quickly? I would say around the 3 to 6 month mark using ethical SEO. Sure, there is on-page work, a great blog, etc., but what is left that's truly ethical for off-page SEO besides some good press releases, some social profile links like a Pinterest, and the usual? I must be missing something! I am not looking for the easy way; I am not afraid to get my hands dirty and work hard. If anyone can show me a quick example of a truly ethical link, I would be grateful to see it. 
I can't seem to wrap my head around something I can do at this point that will last. If you don't want to share it with the world, please PM me. [edited for formatting by Keri Morgret]0 -
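For what it's worth, the sitewide dofollow threshold rule speculated about in the post above is trivial to express in code; picking the number is the hard part. A toy sketch, with an invented threshold and an invented data shape, just to make the idea concrete:

```typescript
// Toy version of the speculated rule: ignore links from any site whose
// total count of outbound dofollow links (site-wide) reaches a threshold.

interface OutboundLink {
  fromSite: string; // linking site
  toUrl: string;    // link target
  dofollow: boolean;
}

const SITEWIDE_DOFOLLOW_THRESHOLD = 10; // the invented "magic number"

function countableLinks(links: OutboundLink[]): OutboundLink[] {
  // Count dofollow outbound links per linking site.
  const perSite = new Map<string, number>();
  for (const link of links) {
    if (link.dofollow) {
      perSite.set(link.fromSite, (perSite.get(link.fromSite) ?? 0) + 1);
    }
  }
  // Keep only dofollow links from sites that stay under the threshold.
  return links.filter(
    (link) =>
      link.dofollow &&
      (perSite.get(link.fromSite) ?? 0) < SITEWIDE_DOFOLLOW_THRESHOLD
  );
}

// Hypothetical data: a network site blasting footer links everywhere would
// cross the threshold and have all of its links ignored.
const sample: OutboundLink[] = [
  { fromSite: "clientsite.example", toUrl: "https://seo-company.example/", dofollow: true },
  { fromSite: "network.example", toUrl: "https://seo-company.example/", dofollow: true },
];
console.log(countableLinks(sample));
```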
Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
This one is tough, and I've asked it once here: http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel that the response sided with the company. As an SEO or digital marketer, it seems that if we are pulling in our reviews via iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking. I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the source as a user, you'll never see the reviews, because they are only delivered to the search engine bot. What do you think?
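To make the mechanics concrete, the pattern being described looks roughly like the sketch below, assuming an Express-style server (the bot check, markup, and rating values are all placeholders). The cloaking question arises precisely because the noscript block with the review text is only added when the user agent looks like a bot:

```typescript
import express from "express";

const app = express();
const BOT_UA = /googlebot|bingbot/i;

// Hypothetical helper returning plain-HTML review markup with schema.org
// microdata, representing the same reviews the iframe widget shows to users.
function renderReviewsHtml(productId: string): string {
  return `<div data-product="${productId}" itemscope itemtype="https://schema.org/AggregateRating">
            <span itemprop="ratingValue">4.6</span> out of
            <span itemprop="bestRating">5</span>
            (<span itemprop="reviewCount">128</span> reviews)
          </div>`;
}

app.get("/product/:id", (req, res) => {
  const isBot = BOT_UA.test(req.get("user-agent") ?? "");

  // Every visitor gets the iframe review widget...
  const iframe = `<iframe src="https://reviews.example.com/widget/${req.params.id}"></iframe>`;
  // ...but the crawlable <noscript> fallback is only injected for bots,
  // which is exactly what raises the cloaking question.
  const noscript = isBot ? `<noscript>${renderReviewsHtml(req.params.id)}</noscript>` : "";

  res.send(`<!doctype html><html><body><h1>Product ${req.params.id}</h1>${iframe}${noscript}</body></html>`);
});

app.listen(3000);
```

Serving that noscript block to everyone, regardless of user agent, would remove the conditional that makes this feel like cloaking.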
White Hat / Black Hat SEO | | eTundra0 -
Is pulling automated news feeds on my home page a bad thing?
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom. After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds occupy only 20 percent of the content on my domain; the rest of the content is original.
White Hat / Black Hat SEO | | amit20760 -
Is using twiends.com to get Twitter followers considered black hatting?
Hi, I've been struggling to get followers on Google Plus and Twitter, and recently stumbled upon twiends.com. It offers an easy service that lets you get Twitter followers very quickly. Is this considered black hatting? Even if Google doesn't count the followers as valid, am I likely to be punished for using their service? Even if it doesn't help rankings, it is nice to have lots of followers so that they will see my tweets, which has the potential to drive more traffic to my site and give awareness to my business. What are your thoughts?
White Hat / Black Hat SEO | | eugenecomputergeeks0 -
How is this obvious black hat technique working in Google?
Get ready to have your minds blown. Try a search in Google for any of these:
proform tour de france
tour de france trainer
tour de france exercise bike
proform tour de france bike
In each instance you will notice that Proform.com, the maker of the bike, is not #1. In fact, the same guy is #1 every time, and this is the URL: www.indoorcycleinstructor.com/tour-de-france-indoor-cycling-bike. Here's the fun part. Click on that result and guess where you go? Yup, Proform.com. The exact same page ranking right behind it, in fact. Actually, this URL first redirects to an affiliate link, and that affiliate link redirects to Proform.com. I want to know two things. First, how on earth did they do this? They got to #1 ahead of Proform's own page. How was it done? Second, how have they not been caught? Are they cloaking? How does Google rank a double 301 redirect in the top spot when its end destination is the #2 result? PS: I have a site in this industry, which is how I caught this and why it is of particular interest. I just can't figure out how it was done or why they have not been caught. Not because I plan to copy them, but because I plan to report them to Google and want to have some ammo.
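If anyone wants to check a chain like this themselves, you can trace it hop by hop without a browser. A small sketch (assumes Node 18+ so fetch is built in; it just follows Location headers and prints each step, e.g. ranking page, then affiliate link, then Proform.com):

```typescript
// Follow a URL's redirect chain hop by hop and print each step, which is
// handy for spotting patterns like page -> affiliate link -> manufacturer.

async function traceRedirects(startUrl: string, maxHops = 10): Promise<void> {
  let url = startUrl;
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(url, { redirect: "manual" });
    console.log(`${res.status}  ${url}`);

    const location = res.headers.get("location");
    if (res.status < 300 || res.status >= 400 || !location) {
      return; // final destination reached
    }
    // Resolve relative Location headers against the current URL.
    url = new URL(location, url).toString();
  }
  console.warn("Stopped: too many hops (possible redirect loop).");
}

traceRedirects(
  "http://www.indoorcycleinstructor.com/tour-de-france-indoor-cycling-bike"
).catch(console.error);
```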
White Hat / Black Hat SEO | | DanDeceuster0 -
What happens if a company only uses black hat techniques for an extended period of time?
Let's say I were to start a company. Of course, I want to be indexed, crawled, and pulled up in the search engines, so I start using black hat SEO techniques: I comment spam, keyword stuff, spin articles, hide text, etc. I publish hundreds of articles per day on well-known sites with excellent PageRank. If I am doing all of these unethical techniques, what is going to happen to my website?
White Hat / Black Hat SEO | | FrontlineMobility0 -
Turn grey myself or rat on black hat competitors?
When you're being trashed by a less-than-white-hat competitor, what do you find most effective: lying down with your feet in the air and considering a career in gardening, turning grey yourself, or ratting them out to Google? Phil
White Hat / Black Hat SEO | | Phil_1