Hiding ad code from bots
-
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing data, the company would like to prevent any bots from seeing any ads, and of course that includes Googlebot. This seems like it could be cloaking, and I'd rather not have a different version of the site for bots. However, knowing that this will likely happen anyway, I'm wondering how big of a problem it could be if they do this. The change isn't meant to manipulate Googlebot's understanding of the page (the ads don't affect rankings, etc.), and it will have only a minimal impact on the page overall.
So, if they go down this road and hide ads from bots, I'm trying to determine how big the risk could be. I found some old articles discussing this, some suggesting it was a problem and others saying it might be okay in certain cases (links below), but I couldn't find anything recent. Has anybody seen anything new, or does anyone have a new perspective to share on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads?
https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY -
Hello Mathew Edgar,
To keep it simple, here are a few steps to implement on your client's website that could reduce the chances of a penalty from search engines (especially Google):
1. Keep the content and URL unique from other pages.
2. Avoid Flash or scripts that make the page load slowly.
3. Keep a maximum of 3-5 ads per page, mostly text-based with a small proportion of images.
4. Do not over-optimize the page for keyword rankings, organic results, backlinks, etc.
5. Give images names and ALT text for easier crawling.
Also, bots usually just crawl a page or domain rather than making clicks. Googlebot mainly checks that the page is crawlable under search engine (Google) guidelines.
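For context, the mechanism being proposed usually amounts to a server-side User-Agent check before rendering the ad slot. Below is a minimal illustrative sketch (Python; the bot-token list and the render_page() helper are hypothetical, not any real site's code), shown mainly to make clear that serving different HTML to Googlebot this way is exactly what Google's guidelines describe as cloaking, so filtering bot clicks on the ad-tracking side is the safer route:

```python
# Illustrative only: hiding markup from crawlers based on a check like this
# is what Google's guidelines call cloaking. Shown to clarify the mechanism
# being discussed, not as a recommendation.

# Substrings found in common crawler User-Agent headers (illustrative list).
KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot", "baiduspider")

def looks_like_bot(user_agent):
    """Crude check: does the User-Agent contain a known crawler token?"""
    ua = (user_agent or "").lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def render_page(user_agent, show_ads=True):
    """Hypothetical page renderer that omits the ad slot for detected bots."""
    body = "<main>page content</main>"
    if show_ads and not looks_like_bot(user_agent):
        body += "<aside>ad slot</aside>"
    return body
```

Note that this check is trivial to spot: any crawler fetching the page with two different User-Agent strings sees two different documents, which is the behavior the cloaking guidelines target.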
Related Questions
-
Apparent Bot Queries and Impressions in Webmaster Tools
I've been noticing some strange stats in Google Webmaster Tools for my forum, which has been getting spam queries with impressions and no clicks. See the queries in the attached images. This might be a motive for the spammers or scrapers. I set the date range to just 22 Aug - 22 Nov and I can see very obviously that the spike is due to impressions. Questions: What should/can I do? Is Google doing something about this? How do I avoid it?
White Hat / Black Hat SEO | SameerBhatia -
Malicious bots
I was looking at some recommended keywords and felt sick to my stomach when I saw referrer spam like "ilovevitaly.com search shell" and "resellerclub scam" showing up in my analytics reports.
White Hat / Black Hat SEO | BlueprintMarketing
I believe I have found the multiple IP addresses they're coming from, and when I say many I mean I found 200 or so. They're from different C blocks, so they're very difficult to block without blocking legitimate traffic. I'm using a couple of different web application firewalls with the ability to block pretty much anything. Does anyone have any advice on doing this in a manner that might be more efficient than what I'm doing? I definitely do not want Google to think this is something that I did and penalize the site; that would be horrible. The site is going through Sucuri.net to be cleaned of any possible infection right now. I do not know how this happened, but zero-day attacks are unfortunately a very real reality, and it could have been a million things. Thanks a million, guys. I appreciate your help,
Tom -
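For referrer spam of this kind, matching on the Referer hostname at the application layer tends to be more robust than chasing 200 or so rotating IPs, since the spam domains change far less often than the source addresses. A minimal sketch (Python; the ilovevitaly.com entry comes from the question above, while spam-referrer.example is a made-up placeholder, not a vetted blocklist):

```python
# Sketch: classify requests as referrer spam by hostname instead of by IP.
# A web application firewall rule or middleware could drop requests where
# this returns True.

from urllib.parse import urlparse

# Example entries only; maintain your own list from your analytics reports.
SPAM_REFERRER_DOMAINS = {"ilovevitaly.com", "spam-referrer.example"}

def is_spam_referrer(referer_header):
    """True if the Referer header's hostname matches a blocklisted domain."""
    if not referer_header:
        return False
    host = urlparse(referer_header).hostname or ""
    # Match the domain itself or any subdomain of it.
    return any(host == d or host.endswith("." + d) for d in SPAM_REFERRER_DOMAINS)
```

The subdomain check matters because this kind of spam often arrives from hosts like forum.ilovevitaly.com rather than the bare domain.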
Why have bots (including googlebot) categorized my website as adult?
How do bots decide whether a website is adult? For example, I have a gifting portal, but strangely it is categorized as "Adult". Also, my Google AdSense application to run ads on my site was rejected; I have a feeling this is because Googlebot categorized my site as adult. And there's a good chance that other bots also consider it an adult website rather than a gifting website. Can anyone please go through the site and tell me why this is happening? Thanks in advance.
White Hat / Black Hat SEO | rahulkan -
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says:
White Hat / Black Hat SEO | NurunMTL
"Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device"
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons you sometimes need to hide content or links completely (not accessible at all by the visitor) at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Is this counted as hidden content that could penalize your site, or not? What do you guys do when you create responsive design websites? Thanks! GaB -
Cross-Site Links with different Country Code Domains
I have a question about the Penguin update. I know they are really cracking down on "spam" links, and that they want you to shift from linking keywords to the brand name unless it makes sense in a sentence. We have five sites for one company; in the header they have little flag images that link to different country domains. These domains all have roughly the same domain name apart from the country code. My question is: does linking these sites back and forth to each other in this way hurt you in Penguin? I know they want you to push your identity, but does this cross-site scheme hurt you? In the header of these sites we have something like this. I am assuming the best strategy would probably be to treat them like separate entities, or just focus on one domain. They also have some sites that have links in the footer, but they are set up like:
White Hat / Black Hat SEO | AlliedComputer
For product visit Domain.com
Should nofollows be added on these footer links as well? I am not sure if Penguin finds them spammy too. -
Many sites have added excerpts of my blog posts and are linking back; most of them are spammy sites!
Some are great blogs, but some blogs just copy excerpts and link back, which I never approved. Will it affect my blog? I asked them to remove it, but no use!
White Hat / Black Hat SEO | Esaky -
Access Denied - 2508 Errors - 403 Response code in webmaster tools
Hello fellow members, Since 9 May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs, gets a 403 response code, and reports "Access Denied" errors in GWT. All my indexed pages have been de-indexed. Why am I receiving these errors? My website is working fine, so why is Google not able to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP. On 9 May I also got a message in GWT for "http://www.mysitename.co.uk/ Increase in authorization permission errors": Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors. After this, all the problems started. Kindly tell me what the issue is and how I can solve it.
White Hat / Black Hat SEO | sourabhrana39 -
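One quick diagnostic for a sudden 403 spike like this is to fetch a page with Googlebot's User-Agent string and compare the status code to a normal browser request, since many firewalls and security plugins block by User-Agent. A rough sketch using only the Python standard library (run it against your own URLs; the comparison logic, not the specific site, is the point):

```python
# Sketch: compare the HTTP status a URL returns for a crawler User-Agent
# versus a browser one. A 403 for Googlebot but 200 for a browser points
# at server-side User-Agent blocking (firewall, security plugin, etc.).

import urllib.error
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"

def fetch_status(url, user_agent):
    """Return the HTTP status code for url when requested with user_agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 403 when the server denies access

# Usage idea: if fetch_status(url, GOOGLEBOT_UA) == 403 while
# fetch_status(url, BROWSER_UA) == 200, something server-side is
# singling out crawlers.
```

If both User-Agents get a 403, the block is more likely IP- or rate-based, which points at the host or firewall configuration instead.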
301 Redirect ASP code
Hi, I have a script, detailed below, that 301 redirects based upon different queries:

<% If (Request("offset") = "") Then %>
<% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" Then ' Sector and Location NOT NULL %>
<% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
    Response.End
End If %>
<% End If %>
<% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" Then ' Sector NOT NULL and Location NULL %>
<% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
    Response.End
End If %>
<% End If %>
<% If Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" Then ' Sector NULL and Location NOT NULL %>
<% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
    Response.End
End If %>
<% End If %>
<% End If %>

But this still allows both the www and non-www versions of these pages to render in the browser, which is resulting in duplicate content. On my home page I use:

<% If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & "/"
    Response.End
End If %>

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www versions to the www version? In other words,
White Hat / Black Hat SEO | TwoPints
domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
would redirect to
www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
Thanks in advance
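The two scripts can be combined by computing the full canonical URL (www host plus SEO-friendly path) up front and issuing a single 301 only when the request differs. The questioner's code is classic ASP; the following is a language-neutral sketch of that logic in Python, where seo_friendly(), the sector/location/keywords parameters, and the "uk" default region are stand-ins for SEOFriend(), the s/j/keywords query values, and JBSRegion in the original script:

```python
# Sketch: build the canonical URL first (host rule + path rule), then
# decide once whether a 301 is needed. This avoids chaining a non-www
# redirect into a second path redirect.

def seo_friendly(text):
    """Rough stand-in for the ASP SEOFriend() helper: lowercase, hyphenate."""
    return "-".join(text.lower().split())

def canonical_url(host, path, keywords="", sector="", location="", region="uk"):
    # Rule 1: force the www host (the home-page script's job).
    if not host.startswith("www."):
        host = "www." + host
    # Rule 2: rewrite query-driven URLs to SEO-friendly paths (the first
    # script's job), only when no keyword search is active.
    if keywords == "":
        if sector and location:
            path = "/%s-jobs-in-%s" % (seo_friendly(sector), seo_friendly(location))
        elif sector:
            # Sector only: the ASP falls back to the site region (JBSRegion).
            path = "/%s-jobs-in-%s" % (seo_friendly(sector), seo_friendly(region))
        elif location:
            path = "/jobs-in-%s" % seo_friendly(location)
    return "http://" + host + path

def needs_redirect(requested_url, host, path, **params):
    """Return the 301 target, or None if the request is already canonical."""
    target = canonical_url(host, path, **params)
    return None if requested_url == target else target
```

Translating this shape back into the ASP means running the InStr("www") host check inside the same block as the path checks and concatenating "http://www." & HTTP_HOST onto the Location value, so each request is redirected at most once.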