Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
-
This one is tough. I've asked it once before here — http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO — but I feel that the response sided with the company.
As an SEO or digital marketer, it seems that if we are pulling in our reviews via iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the source, you'll never see the reviews, because they would only be delivered to the search engine bot.
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it; however, I will echo and agree with Takeshi on the other points, as they represent the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP; I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As far as reviews.mysite.com goes, you want to get that de-indexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, this is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could they? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A), but we're going to the company with these questions:
- We need someone on the phone that understands SEO and the BV installation on our site being SEO friendly; i.e. not a developer that knows about implementing BV, but an SEO person that understands the difference between cloaking and duplicate content with product pages.
- We want to know how we can get our product reviews on our product pages in a way that can be seen in the HTML of the page; i.e. I can view source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this where it's automatic?
- Having the reviews sent over via JavaScript only when a bot requests the info seems to be in line with cloaking behavior that the search engines consider bad.
- We don't want to add a ~1.5-second delay to pulling the info over for the bots to see, as this will hurt our page speed. However, this seems to be the next best solution for getting up-to-date reviews into the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle it — don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which is essentially duplicating my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or to keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by both users and bots?
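For what it's worth, the "automatic" version of Takeshi's workaround could be as simple as a script that runs whenever BV's daily snapshot arrives and splices each product's static review HTML into the matching product page. A minimal sketch in Python — the placeholder comment and the sample markup are invented for illustration, not part of any BV format:

```python
def inject_reviews(page_html: str, review_html: str) -> str:
    """Splice static review HTML into a product page template.

    The template is assumed to carry a placeholder comment marking
    where the reviews belong; the marker name here is hypothetical.
    """
    return page_html.replace("<!-- BV_REVIEWS -->", review_html)


page = '<div id="reviews"><!-- BV_REVIEWS --></div>'
reviews = "<p>Great widget, five stars.</p>"
print(inject_reviews(page, reviews))
# -> <div id="reviews"><p>Great widget, five stars.</p></div>
```

Scheduled after each snapshot download (cron or similar), this would keep the in-page HTML reviews fresh without the manual upload step.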
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer, Takeshi! We were worried about doing it manually because the user wouldn't see their reviews instantly, but with how you're doing it, that doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster Tools for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header, as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
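Before using "Remove URLs", it's worth a quick sanity check that the subdomain pages actually carry the noindex directive BV was asked to add. A rough sketch, assuming you fetch the page HTML yourself and the meta attributes appear in the usual name-then-content order:

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page carries a robots noindex meta tag."""
    pattern = (r'<meta[^>]*name=["\']robots["\'][^>]*'
               r'content=["\'][^"\']*noindex')
    return bool(re.search(pattern, html, re.IGNORECASE))


indexed = "<head><title>Reviews</title></head>"
blocked = '<head><meta name="robots" content="noindex, follow"></head>'
print(has_noindex(indexed))   # False
print(has_noindex(blocked))   # True
```

Running this against a handful of reviews-subdomain URLs confirms the tag is live before you request removal.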
-
Great idea for having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages that they create over on the reviews.mysite.com subdomain?
That is actually where we started looking at fixing things. I see that subdomain they created as essentially competing with our product pages. Why would a business ever want to operate their site that way? It doesn't make sense. But all I keep hearing are name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely be double checking before simply guessing on this one! Thanks Alan.
-
We use BazaarVoice reviews for our ecommerce site too. What we do is, right below the iframe reviews, have a link that says "click here to see more reviews". When you click the link, it opens up a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact the user experience much.
BazaarVoice can also do HTML reviews that are not iframed if you have a server that can handle server-side scripting like PHP (which, unfortunately, our legacy Yahoo store does not).
-
Ah, to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice and with that, we are more likely to be safe from the wrath of Google.
SO - for this question let's consider the following:
A "Noscript" version of a site is designed first and foremost for people who have scripts turned off, including those who have browsers set up for either security reasons or for visual impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
Cloaking comes into play when you generate content exclusively for Googlebot (or for Googlebot and Bingbot).
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews from Schema.org, or at the very least RDFa review markup? If not, that would be problematic, because only proper review-specific structured markup should wrap around review content.