Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
-
This one is tough. I've asked it once here, http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel that the response sided with the company.
As an SEO or digital marketer, it seems that if we are pulling in our reviews via iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the source, you'll never see the reviews, because they would only be delivered to the search engine bot.
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it. However, I'll echo and agree with Takeshi on the other points; they are the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP. I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As for reviews.mysite.com, you want to get that deindexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, this is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could we do that? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A). We're going to the company with these questions:
- We need someone on the phone that understands SEO and the BV installation on our site being SEO friendly; i.e. not a developer that knows about implementing BV, but an SEO person that understands the difference between cloaking and duplicate content with product pages.
- We want to know how we can get our product reviews on our product pages so they can be seen in the HTML of the page; i.e. I can view source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this where it's automatic?
- Having the reviews sent over via javascript when the bot requests the info seems in line with cloaking behavior that the search engines consider bad.
- We don't want to add a ~1.5 second delay to getting the info pulled over for the bots to see it, as this will hurt our PageSpeed score. However, this seems to be the next best solution for getting up-to-date reviews in the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle this - don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which is essentially duplicating my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by users and bots?
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer Takeshi! We were worried about manually doing it because the user wouldn't see their reviews instantly, but with how you're doing it, it doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster Tools for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header, as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
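For reference, the two tags being described in the subdomain's head would look something like this (the verification token is a placeholder; yours comes from the Webmaster Tools site-verification screen):

```html
<head>
  <!-- Keeps every page on the reviews subdomain out of the index -->
  <meta name="robots" content="noindex">
  <!-- Placeholder token: proves ownership of the subdomain in GWT,
       so you can use the Remove URLs tool against it -->
  <meta name="google-site-verification" content="YOUR-TOKEN-HERE">
</head>
```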
-
Great idea, having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages that they create over on the reviews.mysite.com subdomain?
That is actually where we started looking at fixing things. I see that subdomain they created as basically competing with our product pages. Why would a business ever want to operate their site that way? It doesn't make sense. But all I keep hearing is name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely be double checking before simply guessing on this one! Thanks Alan.
-
We use BazaarVoice reviews for our ecommerce site too. Right below the iframe reviews, we have a link that says "click here to see more reviews". When you click the link, it opens up a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact user experience much.
BazaarVoice can also do html reviews that are not iframed if you have a server that can handle server side scripting like PHP (which unfortunately our legacy Yahoo store does not).
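A rough sketch of that pattern (the IDs and review text below are illustrative, not the actual markup): the HTML reviews sit in the page source from the start, hidden until the link is clicked, so crawlers see them without any user-agent detection.

```html
<!-- iframe reviews render here for regular users -->

<a href="#" onclick="document.getElementById('html-reviews').style.display='block'; return false;">
  Click here to see more reviews
</a>

<!-- Hidden, but present in view-source for users and bots alike -->
<div id="html-reviews" style="display:none">
  <p>"Great product, arrived quickly." - Jane D.</p>
</div>
```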
-
Ah to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice and with that, we are more likely to be safe from the wrath of Google.
SO - for this question let's consider the following:
A "noscript" version of a site is designed first and foremost for people who have scripts turned off, including those whose browsers are configured that way for security reasons or for visual-impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
Cloaking comes into play when you generate content exclusively for Googlebot (or for Googlebot and Bingbot).
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
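As a sketch, a policy-safe noscript fallback simply mirrors what the script renders for everyone else (the widget URL and review text here are placeholders, not BV's actual embed code):

```html
<!-- Script-enabled visitors get the widget-rendered reviews -->
<div id="reviews">
  <script src="https://example.com/reviews-widget.js"></script>
</div>

<!-- Visitors (and bots) without JS get the same review content -->
<noscript>
  <p>"Great product, arrived quickly." - Jane D.</p>
</noscript>
```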
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org, or at the very least RDFa reviews markup? If not, that would be problematic, because only proper reviews-specific structured markup should wrap around reviews content.
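For comparison, proper reviews-specific markup would look roughly like this Schema.org microdata (the property names are real Schema.org Review vocabulary; the values are placeholders):

```html
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="author">Jane D.</span> rated it
  <span itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    <span itemprop="ratingValue">5</span> out of
    <span itemprop="bestRating">5</span>
  </span>
  <p itemprop="reviewBody">Great product, arrived quickly.</p>
</div>
```

If the snapshot files wrap the UGC in anything other than this kind of markup, that's the point worth pressing the vendor on.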