Hiding ad code from bots
-
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing the data, the company would like to prevent any bots from seeing any ads, and of course that includes Googlebot. This seems like it could be cloaking, and I'd rather not have a different version of the site for bots. However, knowing that this will likely happen anyway, I'm wondering how big a problem it could be if they do this. The change isn't meant to manipulate Googlebot's understanding of the page (the ads don't affect rankings, etc.), and it will have only a very minimal impact on the page overall.
So, if they go down this road and hide ads from bots, I'm trying to determine how big a risk this could be. I found some old articles discussing this, some suggesting it was a problem and others saying it might be okay in certain cases (links below), but I couldn't find anything recent. Has anybody seen anything new, or does anyone have a new perspective to share on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads?
https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY
-
Hello Mathew Edgar,
To keep it simple, here are a few steps to implement on your client's website that could reduce the chance of a penalty from search engines (especially Google):
1. Keep the content and URL unique from other pages.
2. Avoid Flash or scripts that make the page load slowly.
3. Keep a maximum of 3-5 ads on the page (mostly text-based, with few images).
4. Do not over-optimize the page, i.e., for keyword rankings, organic results, backlinks, etc.
5. Give images a name and ALT text for easier crawling.
Also, bots usually just crawl a web page or domain rather than clicking anything. Bots mainly check that the page is crawlable according to search engine (Google) guidelines.
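For context on what "hiding ads from bots" usually means in practice, here is a minimal sketch of server-side user-agent gating (the bot list and ad snippet are illustrative assumptions, not the asker's actual setup). Note that this user-agent-conditional output is exactly the pattern that can be judged as cloaking:

```python
# Sketch of UA-based ad suppression: serve ad markup only when the visitor
# does not look like a known crawler. Hypothetical names throughout.
KNOWN_BOTS = ("googlebot", "bingbot", "slurp", "duckduckbot", "baiduspider")

AD_SNIPPET = '<div class="ad"><!-- ad network tag here --></div>'  # placeholder

def looks_like_bot(user_agent: str) -> bool:
    """Crude substring match against known crawler tokens."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

def render_ads(user_agent: str) -> str:
    """Return the ad markup for humans, an empty string for known bots."""
    return "" if looks_like_bot(user_agent) else AD_SNIPPET
```

Whether this counts as problematic cloaking is exactly the question being asked; the sketch just makes the mechanism concrete.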
Related Questions
-
Malicious links on our site indexed by Google but only visible to bots
We've been suffering from some very nasty black hat SEO. In Google's index, our pages show external links to various pharmaceutical websites, but our actual live pages don't show them. It seems as though only certain user agents see the malicious links. Crawling with Screaming Frog using the Googlebot user agent also reveals the malicious links. Any idea what could have caused this or how it can be stopped? We scanned all files on our web server and couldn't find any malicious links. We've changed our FTP and CMS passwords; is there anything else we can do? Thanks in advance!
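One quick way to confirm this kind of user-agent cloaking is to fetch the same URL with a normal browser user agent and with a Googlebot user agent (e.g. `curl -A "Googlebot/2.1" https://example.com/page`), then diff the links in the two responses. A minimal sketch, with hypothetical helper names:

```python
import re

# Naive href extractor -- fine for a quick diff, not a full HTML parser.
LINK_RE = re.compile(r'href=["\'](https?://[^"\']+)["\']', re.IGNORECASE)

def extract_links(html: str) -> set:
    """Collect absolute links from raw HTML."""
    return set(LINK_RE.findall(html))

def cloaked_links(browser_html: str, bot_html: str) -> set:
    """Links served only to the bot user agent -- a cloaking red flag."""
    return extract_links(bot_html) - extract_links(browser_html)
```

Feed it the two saved responses; any link that appears only in the bot version is a strong signal the server (or injected code on it) is cloaking by user agent.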
White Hat / Black Hat SEO | SEO-Bas
-
Cloaking/Malicious Code
Does anybody have any experience with software for identifying this sort of thing? I was informed by a team we are working with that our website may have been compromised, and I wanted to know what programs people have used to identify cloaking attempts and/or malicious code. Thanks, everybody!
White Hat / Black Hat SEO | HashtagHustler
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that I want 1) a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized above bot traffic.
I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution covers all three of my problems.
Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a portion of the bot traffic. The portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will increase indexing latency, but slow server response times also have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
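The load-based 503 idea described above can be sketched roughly like this (the thresholds, bot list, and function names are illustrative assumptions, not a definitive implementation):

```python
import os
import random

# Hypothetical load thresholds -- tune for your own server.
LOAD_SOFT = 2.0   # above this, start throttling bots
LOAD_HARD = 6.0   # above this, reject (almost) all bot requests

BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot", "crawler", "spider")

def is_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def throttle_probability(load: float) -> float:
    """Fraction of bot requests to answer with a 503 at this load level."""
    if load <= LOAD_SOFT:
        return 0.0
    if load >= LOAD_HARD:
        return 1.0
    return (load - LOAD_SOFT) / (LOAD_HARD - LOAD_SOFT)

def handle_request(user_agent: str, load: float = None, rng=random.random):
    """Return (status, headers): 503 for a load-dependent share of bot hits."""
    if load is None:
        load = os.getloadavg()[0]  # 1-minute load average (POSIX only)
    if is_bot(user_agent) and rng() < throttle_probability(load):
        # Retry-After tells well-behaved bots when to come back.
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

Because the decision is made per request from the whole server's load average, this is centrally managed, server-wide, and bot-agnostic, which addresses all three requirements; user traffic is never throttled.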
White Hat / Black Hat SEO | internetwerkNU
-
Site dropping in rank even though more backlinks are being added
Hello, one of my client's sites is ranking lower than it should. This happened when we took off backlinks (20 little blogs and several site-wide paid links). It really dropped the site, but it had to be done. Since then we've increased his number of linking root domains by 10% through white hat link building in his non-competitive niche, and rankings are still poor. I know that's not much in the way of added backlink value, but we're working on it. My question is, how have the recent (and coming) updates possibly affected us? We want to remove the remaining problem areas right away, but another drop in traffic is not a good idea. Even though the blogs have no backlinks of their own, they cause drops when taken off. He still has:
- 20 little blog backlinks, with a quarter of them using exact-match anchor text
- 1 site-wide paid link: an image with exact-match alt-tag anchor text
- 1 non-site-wide paid link that is an image near the footer, with exact-match alt-tag anchor text
- 3 links on one domain; this one looks fairly editorial, but there are a bunch of paid links on that page (changing to non-exact-match anchor text)
- 2 links on two domains that look completely editorial, with no other paid links on those pages; non-exact-match anchor text
- 70 backlinks total, with about 1/3 being problematic
How does this site look in regard to updates, and when can we take links off without tanking the site even more? Thanks.
White Hat / Black Hat SEO | BobGW
-
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says:
"Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device"
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons, you sometimes need to hide content or links completely (not accessible at all to the visitor) at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Does this count as hidden content that could penalize your site, or not? What do you do when you create responsive design websites? Thanks! GaB
White Hat / Black Hat SEO | NurunMTL
-
How to Get Backlinks to a Coupon Code Website
Hello guys, I run a coupon code website, which by its very nature does not contain the most compelling content. As you can probably understand, not many people want to link to a page that lists coupons for a specific online retailer. I am really struggling to come up with new and innovative ways of attracting links and wondered if anybody is in a similar position or could offer some advice. Would love to get some feedback. Thanks!
White Hat / Black Hat SEO | Marc-FIMA
-
Access Denied - 2508 Errors - 403 Response code in webmaster tools
Hello fellow members, since 9th May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs, getting a 403 response code and reporting "Access Denied" errors in GWT. All my indexed pages have been de-indexed. Why am I receiving these errors? My website is working fine, so why is Google unable to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP. On 9th May I also got a message in GWT for "http://www.mysitename.co.uk/ Increase in authorization permission errors": "Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors." After this, all the problems started. Kindly tell me what the issue is and how I can solve it.
White Hat / Black Hat SEO | sourabhrana39
-
Banner Ads help seo?
I see OSE counting banner ads as incoming links. My question is: has anyone done a study on a non-tagged banner ad link and its effects on SEO? Does Google count it as organic since it has no tagging, or is it ignored since it's in an ad spot?
White Hat / Black Hat SEO | DavidKonigsberg