How good or bad are exit intent pop-ups? What is Google's perspective?
-
Hi all,
We have launched exit intent pop-ups on our website: a pop-up appears when a visitor is about to leave. It triggers when the mouse moves toward the top of the window, which suggests the visitor is about to close the tab. We have seen a slight ranking drop since launching the pop-up. Because the pop-up appears just before someone leaves the site, could Google interpret this as the user leaving because of the pop-up, and penalise us? What are your thoughts and suggestions on this?
Thanks
-
Google's John Mueller has stated that exit intent pop-ups do not attract a Google penalty:
https://www.youtube.com/watch?time_continue=746&v=gS4_JH-QqSg
"What we're looking for is really interstitials that show up on the interaction between the search click and going through the page and seeing the content. So that's kind of the place we're looking for those interstitials. What you do afterwards, like if someone clicks on stuff within your website or closes the tab or something like that, then that's kind of between you and the user."

Google won't be monitoring the user's full behaviour on your site beyond the initial bounce/non-bounce (at least as far as the search arm is concerned; Analytics is, of course, different!), e.g. they won't see that x happened that caused (or appeared to cause) the user to leave.
Related Questions
-
Why is Google indexing my website slowly?
Hello, I hope you are all doing great. Recently, I published an article on my website and within about 10 minutes it was indexed; I personally confirmed this in Google Search Console. The URL was indexed, but the problem is that it does not appear in Google Search. Sometimes I notice Google showing a result that was published 10-30 minutes ago, but this is not the case with my website. My articles only show up in the Google SERP after 1-2 days. What could be the reason behind this, given that DA and PA are good (28-31)?
White Hat / Black Hat SEO | HansiAliya
Exchanging links between sites in the same Google account
Hi everyone, does anybody have experience with exchanging links between websites that are verified in the same Google Webmaster Tools account? Is that good for the sites? They are hosted on different servers. Thank you so much.
White Hat / Black Hat SEO | Jeepster
Are businesses still hiring SEO consultants whose strategies could lead to a Google penalty?
Is anyone worried that businesses know so little about SEO that they keep hiring SEO consultants whose strategies could land their websites with a Google penalty? I ask because we did some research with businesses and found the results worrying: blog farms, over-optimised anchor text. We will be releasing the data later this week, but wondered if it is something for the SEO community to worry about, and what can be done about it.
White Hat / Black Hat SEO | williamgoodseoagency.com
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance.

The problem is that I want a solution which 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritised above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. So no solution covers all three of my problems.

Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other rule I invent), some requests will be answered with a 503 while others get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times also hurt rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU
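The dynamic-503 idea above can be sketched in a few lines. This is a minimal illustration, not a drop-in implementation: the function name, user-agent tokens, and load thresholds are all illustrative assumptions you would tune for your own server.

```python
import random

# Illustrative list of crawler user-agent substrings to throttle.
BOT_UA_TOKENS = ("googlebot", "bingbot", "ahrefsbot")

def should_serve_503(user_agent: str, load_1min: float, cpu_count: int) -> bool:
    """Decide whether to answer a request with 503 Service Unavailable.

    Only bot traffic is ever throttled; human traffic always gets content.
    The probability of a 503 scales with how far the 1-minute load
    average exceeds the number of CPU cores, so throttling responds to
    total server load, not to any single site's traffic.
    """
    ua = user_agent.lower()
    if not any(token in ua for token in BOT_UA_TOKENS):
        return False  # never throttle real users
    overload = load_1min / cpu_count
    if overload <= 1.0:
        return False  # server is healthy, let bots crawl freely
    # Map overload in the range 1.0..2.0 onto a 0..100% chance of a 503.
    reject_probability = min(overload - 1.0, 1.0)
    return random.random() < reject_probability
```

When you do return a 503, it is worth sending a `Retry-After` header so well-behaved crawlers know when to come back; a 503 signals a temporary condition, which is exactly the message you want to send here.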
Do trademark symbols in URLs matter to Google?
Hello community! We are planning to clean up the TM and R symbols in the URLs on our website. Google has indexed these pages, but for some TM pages the SERP displays stray quote characters in the URL instead of the symbol. What are your thoughts on a "spring cleaning" effort to remove all TM, R, and other unsafe characters from URLs? Will this impact indexed pages, rankings, etc.? Thank you! b.dig
White Hat / Black Hat SEO | b.digi
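The "spring cleaning" described above amounts to stripping the symbols and percent-encoding whatever remains. A minimal sketch (the function name is hypothetical, and the symbol list only covers ™ and ®):

```python
from urllib.parse import quote

def clean_url_path(path: str) -> str:
    """Strip trademark symbols and percent-encode remaining unsafe characters."""
    for symbol in ("\u2122", "\u00ae"):  # TM (™) and R (®)
        path = path.replace(symbol, "")
    # quote() leaves letters, digits, and /-_.~ alone and encodes the rest.
    return quote(path, safe="/-_.~")
```

Any URL that changes in this cleanup should get a 301 redirect from the old form to the new one, so the indexed pages and their signals carry over rather than dropping out of the index.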
How will Google deal with the cross-links on my multi-domain site?
Hi, I can't find a good answer to this question, so I thought: why not ask Moz.com ;-)! I have a site, let's call it webshop.xx, for a few languages/markets: German, Dutch & Belgian, English, and French. I use a different TLD with a different IP for each of these languages, so I end up with: webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr. They all link to each other, and every subpage that is translated from another site gets a link as well, so webshop.com/stuff links to webshop.de/stuff. My main website, webshop.com, gets links from every other of these domains, which Open Site Explorer as well as Majestic SEO see as external links (this is happening). My question: how will Google deal with the cross-links coming from these domains in the long run? Some guesses I made:
- I get full external-link juice (the content is translated, so unique?)
- I get a bit of the juice of an external link
- they are actually seen as internal links
- I'll get a penalty

Thanks in advance, guys!
White Hat / Black Hat SEO | pimarketing
Our site has too many backlinks! How can we audit for bad backlinks?
Webmaster Tools says we have close to 24 million links to our site. The site has been around since the mid-90s and has accumulated these links ever since. We also have our own network of sites whose templates link to our main site. I'm fighting to get these links nofollowed, but upper management seems scared to alter this practice. This past year our rankings have dropped significantly, and we suspect it's due to spammy backlinks, or to being penalised for an accidental link-scheme network. 24 million links are too many to check manually for use with the disavow tool, and the bulk backlink-checking services out there can't even come close. What's an SEO to do?
White Hat / Black Hat SEO | seoninjaz
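At this scale the practical move is to audit referring domains rather than individual links: 24 million URLs usually collapse to a few thousand domains, and the disavow file format accepts whole domains via `domain:` lines. A minimal sketch of that bucketing step (function names and the threshold are illustrative assumptions):

```python
from collections import Counter
from urllib.parse import urlparse

def domains_by_link_count(backlink_urls):
    """Collapse a huge backlink export into per-domain counts, largest first.

    Auditing a few thousand domain rows is tractable where auditing
    24 million individual link rows is not.
    """
    counts = Counter(urlparse(u).netloc.lower() for u in backlink_urls)
    return counts.most_common()

def disavow_lines(domain_counts, min_links=1000):
    """Emit disavow-file lines for domains exceeding a (hypothetical) threshold.

    A high link count alone doesn't prove spam -- treat this as a shortlist
    for manual review, not an automatic disavow.
    """
    return [f"domain:{d}" for d, n in domain_counts if n >= min_links]
```

The internal network of template links would show up immediately at the top of this list, which also gives you concrete numbers to put in front of management.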
Is Best of the Web a good directory to pay to be listed on?
We are currently paying to have a listing in the directory Best of the Web. Should I be paying to renew our listing in this directory?
White Hat / Black Hat SEO | djlittman