Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
-
This one is tough. I've asked it once before here, http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I felt that the response sided with the company.
As an SEO or digital marketer, it seems that if we are pulling in our reviews via iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the page source as a regular user, you'll never see the reviews, because they are only delivered to the search engine bot.
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it; however, I will echo and agree with Takeshi on the other points, as they are the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP. I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As far as reviews.mysite.com goes, you want to get that de-indexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, this is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could we do that? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A), but we're going to the company with these questions:
- We need someone on the phone who understands SEO and whether the BV installation on our site is SEO-friendly; i.e., not a developer who knows how to implement BV, but an SEO person who understands the difference between cloaking and duplicate content on product pages.
- We want to know how we can get our product reviews onto our product pages so they can be seen in the HTML of the page; i.e., I can view source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this where it's automatic?
- Having the reviews sent over via JavaScript only when the bot requests the info seems to be in line with cloaking behavior that the search engines consider bad.
- We don't want to add a ~1.5 second delay to getting the info pulled over for the bots to see it, as this will hurt our PageSpeed score. However, this seems to be the next-best solution for getting up-to-date reviews into the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle it - don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which is essentially duplicating my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by both users and bots?
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer, Takeshi! We were worried about doing it manually because users wouldn't see their reviews instantly, but with how you're doing it, that doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster Tools for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
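For reference, the two tags you'd have BV add to the subdomain's head look something like this (the verification token below is just a placeholder):

    <head>
      <!-- Keep every page on the review subdomain out of the index -->
      <meta name="robots" content="noindex">
      <!-- Google Webmaster Tools site verification (placeholder token) -->
      <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN">
    </head>

With the verification tag in place, you can claim the subdomain as a site in Google Webmaster Tools and run a removal request against the whole subdomain.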
-
Great idea, having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages that they create over on the review.mysite.com subdomain?
That is actually where we started looking at fixing things. I see that subdomain they created as basically competing with our product pages. Why would that ever be the way a business would want to operate its site? It doesn't make sense to do that. But all I keep hearing is name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely double-check before simply guessing on this one! Thanks, Alan.
-
We use BazaarVoice reviews for our ecommerce site too. What we do is, right below the iframe reviews, we have a link that says "click here to see more reviews". When you click the link, it opens up a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact the user experience much.
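A stripped-down sketch of the pattern (simplified; the iframe URL, IDs, and review text are just placeholders, not our actual code):

    <!-- Standard iframed BV review widget for users -->
    <iframe src="https://reviews.example.com/product-123" title="Customer reviews"></iframe>

    <!-- Link that reveals the crawlable HTML copy of the same reviews -->
    <a href="#more-reviews"
       onclick="document.getElementById('more-reviews').style.display='block'; return false;">
      Click here to see more reviews
    </a>

    <!-- HTML version: present in the source for every visitor and every bot, shown on demand -->
    <div id="more-reviews" style="display:none;">
      <p>"Great product, arrived quickly." - Jane D.</p>
    </div>

The key point is that the HTML reviews are in the source for everyone; the link just toggles their visibility.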
BazaarVoice can also do HTML reviews that are not iframed if you have a server that can handle server-side scripting like PHP (which, unfortunately, our legacy Yahoo store does not).
-
Ah, to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice, and with that, we are more likely to be safe from the wrath of Google.
SO - for this question let's consider the following:
A "Noscript" version of a site is designed first and foremost for people who have scripts turned off, including those who have browsers set up for either security reasons or for visual impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
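To make that concrete, a compliant setup serves the same noscript block to every visitor, rather than sniffing user agents - roughly like this (the script path and review text are illustrative):

    <!-- Reviews rendered by script for visitors with JavaScript enabled -->
    <div id="reviews"></div>
    <script src="/js/load-reviews.js"></script>

    <!-- The same review content, mirrored for visitors without JavaScript -->
    <noscript>
      <div class="reviews">
        <p>"Great product, arrived quickly." - Jane D.</p>
      </div>
    </noscript>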
Cloaking comes into play when you generate content exclusively for Googlebot, or for Googlebot and Bingbot.
So if the content provided to you via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org or at the very least RDFa reviews markup? If not, that would be problematic because only proper "reviews" specific structured markup should wrap around reviews content.