Would reviews being served to a search engine user agent through a noscript tag (but not shown to other user types) be considered cloaking?
-
This one is tough, and I've asked it once before here: http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel the response sided with the company.
As an SEO or digital marketer, it seems that if we are pulling in our reviews via an iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the source, you'll never see the reviews, because they are only delivered to the search engine bot.
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it; however, I will echo and agree with Takeshi on the other points, as they describe the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP. I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As for reviews.mysite.com, you want to get that de-indexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, here is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could we do that? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A), but we're going to the company with these questions:
- We need someone on the phone who understands SEO and how to make the BV installation on our site SEO-friendly; i.e., not a developer who knows about implementing BV, but an SEO person who understands the difference between cloaking and duplicate content on product pages.
- We want to know how we can get our product reviews onto our product pages so that they can be seen in the HTML of the page; i.e., I can view the source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this where it's automatic?
- Having the reviews sent over via JavaScript only when the bot requests the info seems to be in line with cloaking behavior that the search engines consider bad.
- We don't want to add a ~1.5 second delay to pulling the info over for the bots to see, as this will increase our page load time and hurt our PageSpeed score. However, this seems to be the next best solution for getting up-to-date reviews into the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle this - don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which essentially duplicates my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or to keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by users and bots?
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer, Takeshi! We were worried about doing it manually because the user wouldn't see their reviews instantly, but with how you're doing it, it doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster Tools for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
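Roughly, the two tags being described would look like this in the head of each page on the reviews subdomain (the verification token below is just a placeholder):
```html
<head>
  <!-- Keep the subdomain's pages out of the index while still letting bots follow links -->
  <meta name="robots" content="noindex, follow">
  <!-- Google Webmaster Tools site verification (placeholder token), which lets you
       verify the subdomain and use the Remove URLs tool against all of its pages -->
  <meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN">
</head>
```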
-
Great idea, having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages that they create over on the review.mysite.com subdomain?
That is actually where we started looking at fixing things. I see the subdomain they created as basically competing with our product pages. Why would that ever be the way a business would want to operate its site? It doesn't make sense. But all I keep hearing is name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely be double-checking before simply guessing on this one! Thanks, Alan.
-
We use BazaarVoice reviews for our ecommerce site too. What we do is, right below the iframe reviews, we have a link that says "click here to see more reviews". When you click the link, it opens up a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact the user experience much.
BazaarVoice can also do HTML reviews that are not iframed if you have a server that can handle server-side scripting like PHP (which unfortunately our legacy Yahoo store does not).
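As a rough illustration of that pattern (the iframe URL, element names, and review text here are invented for the example, not actual BazaarVoice output):
```html
<!-- Iframe reviews that users see by default (illustrative URL) -->
<iframe src="https://reviews.example.com/product-123" title="Customer reviews"></iframe>

<!-- Link that reveals a plain-HTML copy of the same reviews -->
<a href="#more-reviews"
   onclick="document.getElementById('more-reviews').style.display='block'; return false;">
  Click here to see more reviews
</a>

<!-- HTML reviews present in the page source for users and bots alike -->
<div id="more-reviews" style="display: none;">
  <div class="review">
    <span class="review-author">Jane D.</span>
    <span class="review-rating">5/5</span>
    <p>Great product, exactly as described.</p>
  </div>
</div>
```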
-
Ah to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice and with that, we are more likely to be safe from the wrath of Google.
So, for this question, let's consider the following:
A noscript version of a site is designed first and foremost for people who have scripts turned off, including those whose browsers are set up that way for security reasons or because of visual-impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely to be in violation of any Google cloaking policy.
Cloaking comes into play when you generate content exclusively for Googlebot (or for Googlebot and Bingbot) that regular visitors never see.
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
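To make that concrete, here is a minimal sketch of the pattern (the widget URL, element names, and review text are invented for illustration): the noscript block simply mirrors what the scripted widget renders for users.
```html
<!-- Script-driven review widget that visitors with JavaScript see -->
<div id="reviews-widget"></div>
<script src="https://reviews.example.com/widget.js"></script>

<!-- Fallback for visitors (and crawlers) with scripts turned off:
     the same review content as plain HTML -->
<noscript>
  <div class="review">
    <span class="review-author">John S.</span>
    <span class="review-rating">4/5</span>
    <p>Sturdy and easy to assemble.</p>
  </div>
</noscript>
```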
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org or at the very least RDFa reviews markup? If not, that would be problematic, because only proper review-specific structured markup should wrap around review content.
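For comparison, proper review markup looks roughly like this; a hand-written Schema.org microdata sketch with placeholder values, not BV's actual output:
```html
<div itemscope itemtype="http://schema.org/Review">
  <div itemprop="itemReviewed" itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Example Product</span>
  </div>
  <span itemprop="author" itemscope itemtype="http://schema.org/Person">
    <span itemprop="name">Jane D.</span>
  </span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">5</span> out of <span itemprop="bestRating">5</span>
  </div>
  <p itemprop="reviewBody">Great product, exactly as described.</p>
</div>
```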
Related Questions
-
[SEO] Star Ratings -> Review -> Category Page
Hello there, Basically, if you put non-natural star ratings on the category page, like in the attached images, you will get a manual ban from Google, right? (I know it for sure, because I had clients in this situation.) The real question is: if I put a form that allows users to write a review of the category products on the category page, for REAL, will Google still ban? Any advice? Any example? With respect, Andrei
White Hat / Black Hat SEO | Shanaki
-
I show different versions of the same page to crawlers and users, but do not want to anymore
Hello, Back when Google could not read JavaScript, I created two versions of the same page: one for humans and another for Google. Now I do not want to serve different content to the search engine, but I am worried I will lose my traffic value. What is the best way to make this change without a loss? Can you help me?
White Hat / Black Hat SEO | kipra
-
Canonical tags pointing to "page=all" pages for an ecommerce website
I find it alarming that my client has canonical tags pointing to "page=all" product gallery pages. Some of these product gallery pages have over 100 products, and I think this could affect load time, especially for mobile. I would like to get some insight from the community on this, thanks!
White Hat / Black Hat SEO | JMSCC
-
Pagination for Search Results Pages: Noindex/Follow, Rel=Canonical, Ajax Best Option?
I have a site with paginated search result pages. What I've done is noindex/follow them, and I've placed the rel=canonical tag on page 2, page 3, page 4, etc., pointing back to the main/first search result page. These paginated search result pages aren't visible to the user (since I'm not technically selling products, just providing different images to the user), and I've added a text link at the bottom of the first/main search result page that says "click here to load more"; once clicked, it automatically lists more images on the page (AJAX). Is this a proper strategy? Also, for a site that does sell products, would simply noindexing/following the search results/paginated pages and placing the canonical tag on the paginated pages pointing back to the main search result page suffice? I would love feedback on whether this is a proper method/strategy to keep Google happy. Side question: when the robots go through a page that is noindexed/followed, are they taking into consideration the text on those pages, page titles, meta tags, etc., or are they only worrying about the actual links within that page and passing link juice through them all?
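The setup described above would look roughly like this on a paginated page (placeholder URLs):
```html
<!-- On /search?page=2, /search?page=3, etc. -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://www.example.com/search">
```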
White Hat / Black Hat SEO | WebServiceConsulting.com
-
SEO backlinking proposal review
Hi guys, below is a proposal I received from someone on freelancer.com for some SEO link building. Is this really all it takes? Obviously it's done over time, but basically is this it, aside from the usual on-site basics: keywords, URLs, articles, content, etc.? This is the proposal for $250 (some are cheaper but have almost the same details as below). This is one of the top SEO people on freelancer.com and they all have good reviews. Is this basically it? Shell out $100 or more a month to someone who will just post stuff all over the internet? It just seems all very simple; what is $100 a month to stay at #1? Are there any real questions I should ask to make sure I am not just throwing my money away?
I would like to recommend the following services for attaining better search results for the website.
1) Press Release Submissions
2) Social bookmarking submissions
3) Drip Feed Article Links - 100 article submissions every day for 25 days
4) Article directory submissions
5) Link directory submissions
6) Blog Post Submissions (all blogs have PR1 to PR6)
7) Wiki Page Submissions (.EDU and .GOV sites included)
The PR of the directories, social bookmarking websites, blogs, wiki pages, and article directories ranges from PR0 to PR8. Most of them are in the range of PR1 to PR4. If you are interested in the above services, then these are the details about those services.
1) Press release submissions - We will write 3 press releases and submit them to 25 press release websites. Submitting press releases gets the news to Google News, Yahoo News, etc. Please note we even submit to paid press release websites like PRBuzz, SBWire, pressdoc, etc.
2) Social bookmarking submissions - I will also submit your website to 150 social bookmarking websites. Examples of social bookmarking websites: www.digg.com, www.furl.net. After we finish submitting to social bookmarking websites, we will then create RSS feeds with approved link URLs and ping them so that the links get indexed.
3) Drip feed article submissions - We will write one article. Every day we will submit the article to 100 different websites. We will be submitting for 25 days. 100 submissions x 25 days = 2,500 submissions. In each article submission we can use 2 links to the website.
4) Article directory submissions - We will write 5 articles. Each article will be around 500 words. Then we will submit them to 300 different article directories. That means 5 articles x 300 article directories = 1,500 article submissions. In each article we can use 2 links to the website (1,500 x 2 links). I have experience in submitting articles to article directories; so far I have submitted more than 1,000 articles. I will also create separate accounts with article directories wherever possible.
5) Link directory submissions - I have a list of 1,300 directories. I will submit your website to these directories. I have experience in submitting to link directories; so far I have submitted more than 2,500 websites. All the submission work is done manually. All these directories provide one-way links.
6) Blog post submissions (700 PR1 to PR6 blogs) - We will write 1 article, then spin it and post it to 700 PR1 to PR5 blogs. We can spin the article, the title of the article, and the links. You will be given a confirmation when complete, and a code to search for the backlinks in the search engines. They are hosted on 650 different C-class IPs!
7) Wiki page submissions - Get 200+ wiki site contextual backlinks (3 per posted article) from a range of PR0 to PR8 wiki sites, including over 30 US .EDU and US .GOV sites. I will also ping them.
White Hat / Black Hat SEO | topclass
-
Is showing pre-loaded content cloaking?
Hi everyone, another quick question. We have a number of different resources available for our users that load dynamically as the user scrolls down the page (like Facebook's Timeline), with the aim of improving page load time. Would it be considered cloaking if we had Googlebot index a version of the page with all of the available content that would load for the user if he/she scrolled down to the bottom?
White Hat / Black Hat SEO | CuriosityMedia
-
Improve CTR with Special Characters in Meta-Description / Title Tags
I've seen this question asked a few times, but I haven't found a definitive answer. I'm quite surprised no one from Google has addressed the question specifically. I ran across this post the other day and it piqued my interest: http://www.datadial.net/blog/index.php/2011/04/13/special-characters-in-meta-descriptions-the-beboisation-of-google/ If you're able to make your result stand out by using stars, smiley faces, TM symbols, etc., it would be a big advantage. This is currently in use if you search for a popular mattress keyword in Google. It really is amazing how the special characters draw your attention to the title. You can also see the TM and copyright symbols if you search for "Logitech Revue". RadioShack is using these characters in their AdWords also. Has anyone found any definitive answers to this? Has anyone tracked CTR and long-term results with special characters in title or description tags? Any chance of getting penalized for using this? As a follow-up, it looks like you could also put check-mark symbols into your meta description tags. That has all kinds of interesting possibilities: http://www.seosmarty.com/special-symbols-wingdings-for-social-media-branding-twitter-linkedin-google-plus/
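A hypothetical example of the kind of tags being discussed (the product and copy are invented):
```html
<title>Example Mattress ★★★★★ | Brand™ Official Store</title>
<meta name="description" content="★ Free shipping ★ 10-year warranty. See why customers give the Brand™ mattress five stars.">
```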
White Hat / Black Hat SEO | inhouseninja
-
Using Canonical Tags to Boost Lead Form Ranking
I presently have a number of whitepapers that bring traffic to our site. If a visitor elects to download a whitepaper, they are taken to a lead form with an abstract of the whitepaper. The abstract is there because the visitor may or may not have come to the lead form directly. I imagine this would be a "no-no," but how do you feel about placing a canonical tag on a whitepaper that points to the lead form with the abstract? The obvious idea is to use the oomph of a whitepaper to direct search visitors straight to the lead form.
White Hat / Black Hat SEO | shoffy