Hiding ad code from bots
-
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing the data, the company would like to prevent any bots from seeing the ads, and of course that includes Googlebot. This seems like it could be cloaking, and I'd rather not serve a different version of the site to bots. However, knowing that this will likely happen anyway, I'm wondering how big a problem it could be if they do this. The change isn't intended to manipulate Googlebot's understanding of the page (the ads don't affect rankings, etc.) and it will have only a minimal impact on the page overall.
So, if they go down this road and hide ads from bots, I'm trying to determine how big a risk it could be. I found some older articles discussing this, with some suggesting it was a problem and others saying it might be okay in certain cases (links below), but I couldn't find anything recent. Has anybody seen anything new, or does anyone have a fresh perspective on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads?
https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY -
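For context, the change being discussed is usually a small server-side check on the requesting user agent, with only the ad slot left out of the response for known crawlers. Here is a minimal sketch in Express; the bot list, helper name, and markup are illustrative assumptions rather than anything from the question:

```typescript
import express from "express";

const app = express();

// Illustrative list of crawler user-agent substrings; a real deployment would
// use a maintained bot list and/or reverse-DNS verification of Googlebot.
const BOT_SIGNATURES = ["googlebot", "bingbot", "slurp", "duckduckbot", "ahrefsbot"];

function isKnownBot(userAgent: string | undefined): boolean {
  if (!userAgent) return false;
  const ua = userAgent.toLowerCase();
  return BOT_SIGNATURES.some((sig) => ua.includes(sig));
}

app.get("/", (req, res) => {
  const showAds = !isKnownBot(req.get("user-agent"));
  // The rest of the page is identical for bots and users; only the ad slot
  // is conditionally rendered, which is the behavior whose cloaking risk
  // this question is asking about.
  const adSlot = showAds ? '<div class="ad-slot"><!-- ad tag here --></div>' : "";
  res.send(`<html><body><h1>Page content</h1>${adSlot}</body></html>`);
});

app.listen(3000);
```

Whether Google treats that conditional rendering as acceptable is exactly the open question in this thread; the sketch only shows how small the difference between the two versions of the page actually is.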
Hello Mathew Edgar,
To keep it simple, here are a few steps to implement on your client's website that should reduce the chances of a penalty from search engines (especially Google):
1. Keep the content and URL unique from other pages.
2. Avoid Flash or scripts that slow down the page load.
3. Keep to a maximum of 3-5 ads per page, mostly text-based with only a small proportion of image ads.
4. Do not over-optimize the page, i.e. for keyword rankings, organic results, backlinks, etc.
5. Give images descriptive file names and ALT text so they are easier to crawl.
Also, bots usually just crawl a page or domain rather than clicking on anything; they mainly check that the page is crawlable according to search engine (Google) guidelines.
Related Questions
-
Hiding Elements on Mobile. Will this affect SEO?
Hey guys and gals, I am hiding elements with @media queries in the mobile experience for this site: http://prepacademyschools.org/ My question is: when hiding elements on mobile, will this have a negative effect on rankings for mobile and/or desktop? Right now it is a hero banner and a testimonial. I'm asking because I feel responsive design is now working against conversions on mobile: desktop can repeat the same information several times, whereas on mobile it becomes repetitive and is only needed once. Thanks,
White Hat / Black Hat SEO | | brightvessel1 -
Apparent Bot Queries and Impressions in Webmaster Tools
I've been noticing some strange stats in Google Webmaster Tools for my forum, which has been getting spam queries with impressions and no clicks. See the queries in the attached images. There might be some motive for the spammers or scrapers behind this. I set the date range to just 22 Aug - 22 Nov, and the spike is very obviously due to impressions. Questions: What should/can I do? Is Google doing something about this? How do I avoid this?
White Hat / Black Hat SEO | | SameerBhatia0 -
Negative SEO Click Bot Lowering My CTR?
I am questioning whether one of our competitors is using a click bot to do negative SEO on our CTR for our industry's main term. Is there any way to detect this activity? Background: We've previously been hit by DoS attacks from this competitor, so I'm sure their ethics/morals wouldn't prevent them from doing negative SEO. We sell an insurance product that is only offered through broker networks (insurance agents), not directly by the insurance carriers themselves. However, our suspect competitor (another agency) and the insurance carriers are the only ones who rank on the 1st page for our biggest term. I don't think the carrier sites would do very well on their own, since they don't even sell the product directly (they have info-only pages). Our site and one other agency site pop onto the bottom of page one periodically, only to be bumped back to page 2. I fear they are using a click bot that continuously bounces us out of page 1... then we do well relative to the other pages on page 2 and naturally earn our way back to page 1, only to be pushed back to page 2 by the negative click SEO... that's my theory, anyway. Is there anything I can do to research whether my theory is right or if I'm just being paranoid?
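One practical way to test the theory is to check the raw access logs for organic landings on the ranking page and see whether a small set of IPs accounts for an implausible share of them. A rough sketch follows, assuming combined-format access logs and an entirely hypothetical log path and threshold:

```typescript
import { readFileSync } from "node:fs";

// Hypothetical path to a combined-format access log.
const LOG_PATH = "/var/log/nginx/access.log";

// Loose combined-log-format parse: client IP, request path, referrer, user agent.
const LINE_RE =
  /^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "([^"]*)" "([^"]*)"/;

const hitsByIp = new Map<string, number>();

for (const line of readFileSync(LOG_PATH, "utf8").split("\n")) {
  const match = LINE_RE.exec(line);
  if (!match) continue;
  const [, ip, path, referrer] = match;
  // Count only organic landings on the page that ranks for the big term.
  if (path !== "/" || !referrer.includes("google.")) continue;
  hitsByIp.set(ip, (hitsByIp.get(ip) ?? 0) + 1);
}

// IPs with an implausible number of organic landings are worth a closer look
// (reverse DNS, geolocation, whether they ever view a second page, etc.).
const suspects = [...hitsByIp.entries()]
  .filter(([, count]) => count > 50)
  .sort((a, b) => b[1] - a[1]);

console.log(suspects.slice(0, 20));
```

It won't prove intent, but a handful of addresses generating hundreds of one-and-done Google-referred visits would at least make the click-bot theory worth escalating.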
White Hat / Black Hat SEO | | TheDude0 -
Malicious bot attack?
Several of our websites have experienced a major direct-load traffic spike in the last 30 days - roughly 40K new visitors for each site. The bots are emulating IE9 and appear to be hitting our home page and bouncing 100% of the time. The traffic is double our usual volume, or more. Our bounce rates, conversion rate, page views, etc. have suffered accordingly. The volume hasn't affected site performance, yet. Since the traffic is direct load, I can't see this being a negative SEO attack. Plus, our search visibility for everything but our brands is abysmal - there aren't any real rankings to tank. Our engineers are saying that the IP addresses are diverse and they aren't seeing any pattern. I also checked GA for traffic locations, and we aren't seeing anything unusual from overseas. It appears that the attack is US-based. Has anyone seen this before?
White Hat / Black Hat SEO | | AMHC0 -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a way to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. So I came up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
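As a sketch of the idea being described (throttling crawler requests with 503s only while the whole server is busy), something like the following middleware could sit in front of every site on the box. The load threshold, bot detection, and Retry-After value are assumptions, not anything from the original setup:

```typescript
import os from "node:os";
import express from "express";

const app = express();

const BOT_RE = /bot|crawler|spider|slurp|ahrefs/i;
// Treat the box as "busy" when the 1-minute load average exceeds the core count.
const LOAD_LIMIT = os.cpus().length;

app.use((req, res, next) => {
  const isBot = BOT_RE.test(req.get("user-agent") ?? "");
  const overloaded = os.loadavg()[0] > LOAD_LIMIT;

  // User traffic is always served; bots get a 503 only while the server is
  // busy, so the share of throttled bot requests rises and falls with real load.
  if (isBot && overloaded) {
    res.set("Retry-After", "120");
    res.status(503).send("Server busy, please retry later");
    return;
  }
  next();
});

app.get("/", (_req, res) => res.send("normal content"));

app.listen(3000);
```

Google documents 503 as a temporary "come back later" signal, so occasional 503s mainly slow crawling down; serving them to a crawler for long stretches, though, can eventually lead to pages dropping out of the index, so it is worth keeping the throttled share small.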
White Hat / Black Hat SEO | | internetwerkNU1 -
Bot or Virus Creating Bad Links?
Hey everyone, we are getting ready to engage a client for some potential marketing/SEO, so in preparation we ran the site through Open Site Explorer. The site is relatively new and there are only two links under the inbound links section. They are relevant and add value; no issues there. Here is where it gets strange: when I look under the 'Just Discovered' section there are many (hundreds of) new links going back about a month. Virtually all of them have the anchor text 'Louis Vuitton outlet'. Now, the client swears he has not engaged anyone for black hat SEO, so I'm wondering who could possibly be creating these links. They do sell some Louis Vuitton items on the site, so I'm wondering if it is possible that some spam bot has picked up the site and begun to spam the web with links to the client's site. So far today, 50 or so new links have been created with said anchor text and the client's root URL, all on very poor-quality sites, some of them foreign blogs. I would like to find out why this is happening and put a stop to it for obvious reasons. Has anyone experienced something similar? Could this be a bot? Or maybe someone with an axe to grind against the client? Anyone could be doing this on their own, but it just seems strange for it to be happening to a new site that does not even rank highly at the moment. Any advice or info is greatly appreciated; thanks in advance.
White Hat / Black Hat SEO | | Whebb0 -
Do search bots treat SEF and non-SEF URLs as the same?
I've just realized that, pretty much forever, I have coded my websites using non-SEF URLs for internal linking. It's very convenient because whatever the final URL ends up being, the link will always work. ex: website.com/component1/id=1 Before releasing the website, I use extensions to make the URLs user-friendly according to the chosen strategy. ex: website.com/component1/id=1 -> website.com/article1.html But I just wondered whether Google considers both URLs to be the same page, or whether it just treats it as a 301 redirect. What do you think is the best approach?
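For what it's worth, Google treats those as two distinct URLs unless the raw one redirects (or points a canonical) to the friendly one; it only sees a 301 if the extension actually issues one. A minimal sketch of what that mapping could look like, where the ID-to-slug lookup and the query-string form of the internal URL are simplifying assumptions:

```typescript
import express from "express";

const app = express();

// Placeholder for however the SEF extension maps component IDs to slugs.
const SLUG_BY_ID: Record<string, string> = { "1": "article1" };

// Old internal form (simplified here as a query string): /component1?id=1
// gets a permanent redirect to the friendly URL, so search engines can
// consolidate signals on the SEF address.
app.get("/component1", (req, res) => {
  const slug = SLUG_BY_ID[String(req.query.id ?? "")];
  if (!slug) {
    res.sendStatus(404);
    return;
  }
  res.redirect(301, `/${slug}.html`);
});

// The friendly form serves the content and declares itself as canonical.
app.get("/:page", (req, res) => {
  if (!req.params.page.endsWith(".html")) {
    res.sendStatus(404);
    return;
  }
  res.send(
    `<html><head><link rel="canonical" href="https://website.com/${req.params.page}"></head>` +
      "<body>Article body</body></html>"
  );
});

app.listen(3000);
```

If the non-SEF links are never exposed in the released site's HTML (the extension rewrites them before output), the question is mostly moot; the redirect only matters for any raw URLs that leak out or get bookmarked.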
White Hat / Black Hat SEO | | AymanH0 -
Yahoo Slurp Bot 3.0 Going Crazy
On one of our sites, since the summer, the Yahoo Slurp bot has been crawling our pages at about 5 times a minute. We have put a crawl delay on it and it does not respect our robots.txt. The issue now is that it's executing JavaScript (which bots shouldn't), triggering our AdSense, ad server, analytics tags, etc. We've thought of banning the bot altogether, but we get a good amount of Yahoo traffic. We've also thought about programmatically not serving the JavaScript (ad and analytics) tags to it, but are slightly afraid Yahoo might consider this cloaking. What are the best practices for dealing with this bad bot?
White Hat / Black Hat SEO | | tony-755340