Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
-
This one is tough. I've asked it once before here, http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel the response sided with the company.
As an SEO or digital marketer, it seems that if we pull in our reviews via iframe for our users but serve them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the page source, you'll never see the reviews, because they are only delivered to the search engine bot.
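To make the setup concrete, here's a rough sketch of the pattern I'm describing (hypothetical markup; the actual BV embed code differs):

```html
<!-- What every visitor receives: reviews load inside an iframe -->
<iframe src="https://reviews.mysite.com/product-reviews/123"
        title="Product reviews"></iframe>

<!-- Emitted by the server ONLY when the user agent looks like a search
     engine bot; a human viewing the source never sees this block -->
<noscript>
  <div class="reviews">
    <p>"Great product, arrived quickly." - Jane D.</p>
    <p>"Exactly as described." - Sam K.</p>
  </div>
</noscript>
```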
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it; however, I will echo and agree with Takeshi on the other points, as they describe the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP. I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As for reviews.mysite.com, you want to get that de-indexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, this is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could we do that? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A), so we're going to the company with these questions:
- We need someone on the phone who understands SEO and how to make the BV installation on our site SEO-friendly; i.e., not a developer who knows about implementing BV, but an SEO person who understands the difference between cloaking and duplicate content on product pages.
- We want to know how we can get our product reviews onto our product pages so that they're visible in the HTML of the page; i.e., I can view the source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this automatically?
- Having the reviews sent over via JavaScript only when a bot requests the info seems to be in line with cloaking behavior, which the search engines consider bad.
- We don't want to add a ~1.5 second delay to pulling the info over for the bots to see it, as this will slow our page load times. However, this seems to be the next best solution for getting up-to-date reviews into the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle it - don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which essentially duplicates my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or to keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by both users and bots?
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer, Takeshi! We were worried about doing it manually because users wouldn't see their reviews instantly, but with how you're doing it, it doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster Tools for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the review subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header, as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
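For anyone following along, the two tags in the subdomain's head section look roughly like this (the verification value is a placeholder; yours comes from Google Webmaster Tools):

```html
<head>
  <!-- Tells crawlers not to index any page on the review subdomain -->
  <meta name="robots" content="noindex">
  <!-- Google Webmaster Tools site verification (placeholder token) -->
  <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN">
</head>
```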
-
Great idea, having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages that they create over on the review.mysite.com subdomain?
That is actually where we started looking at fixing things. I see that subdomain they created as basically competing with our product pages. Why would that ever be the way a business would want to operate its site? It doesn't make sense. But all I keep hearing is name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely be double-checking before simply guessing on this one! Thanks, Alan.
-
We use BazaarVoice reviews for our ecommerce site too. What we do is, right below the iframe reviews, we have a link that says "click here to see more reviews". When you click the link, it opens up a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact the user experience much.
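A simplified sketch of that pattern (illustrative markup, not our production code):

```html
<!-- BazaarVoice iframe reviews, shown to all visitors as usual -->
<iframe src="https://reviews.mysite.com/product-reviews/123"
        title="Customer reviews"></iframe>

<!-- Crawlable HTML copy of the reviews, revealed on demand -->
<a href="#more-reviews"
   onclick="document.getElementById('more-reviews').style.display = 'block'; return false;">
  Click here to see more reviews
</a>
<div id="more-reviews" style="display: none;">
  <p>"Five stars, would buy again." - A. Shopper</p>
  <p>"Shipping was fast and the fit is perfect." - B. Customer</p>
</div>
```

Since the HTML copy is in the page source for every visitor, bots and users get the same content; it's just collapsed by default.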
BazaarVoice can also do HTML reviews that are not iframed if you have a server that can handle server-side scripting like PHP (which, unfortunately, our legacy Yahoo store does not).
-
Ah, to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice, and with that, we are more likely to be safe from the wrath of Google.
So, for this question, let's consider the following:
A "Noscript" version of a site is designed first and foremost for people who have scripts turned off, including those who have browsers set up for either security reasons or for visual impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
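In other words, a compliant noscript block serves the same fallback to every client with scripts off, regardless of user agent. A minimal sketch (the script path and content are placeholders):

```html
<!-- Scripted path: reviews pulled in client-side for JS-enabled browsers -->
<div id="reviews"></div>
<script src="/js/load-reviews.js"></script>

<!-- Fallback for ANY client with scripts disabled, not just bots -->
<noscript>
  <div class="reviews">
    <p>Rated 4.5 out of 5 from 32 reviews.</p>
    <p>"Does exactly what it promises." - A. Customer</p>
  </div>
</noscript>
```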
Cloaking comes into play when you generate content exclusively for Googlebot (or for Googlebot and Bingbot).
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org or at the very least RDFa reviews markup? If not, that would be problematic because only proper "reviews" specific structured markup should wrap around reviews content.