January 10, 2017 - Intrusive interstitials Google Update
-
Hi all,
As everyone is most likely aware, Google has recently announced that sites with intrusive interstitials that push the main content below the fold will be downgraded in the SERPs from January 10th.
At the moment we have a range of international sites (.ca, .com.au, .co.uk, .fr, etc.). If a user with a UK IP address visits the .ca site, a country-switcher dialog appears.
I am aware that this may affect our sites' performance in mobile search when the update rolls out. However, if we block Google from seeing this dialog, will they still pick it up?
Thanks.
-
Well, if you were to hide it from Google only, that would be cloaking to a certain extent. But in this case I think you can easily get away with it, as it will improve the user experience.
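For what it's worth, the least risky route is to make the switcher itself compliant rather than hide it: a small dismissible banner does not push content below the fold, so there is nothing to block from Googlebot in the first place. Below is a minimal browser-side sketch of that idea; the /api/geo endpoint, the SITE_COUNTRY value, and the markup are illustrative assumptions, not details from this thread.

```typescript
// Sketch: country suggestion as a small dismissible banner instead of a
// full-screen interstitial, so main content is never pushed below the fold.
// NOTE: /api/geo is a hypothetical geo-IP endpoint, not a real service.

interface GeoResponse {
  countryCode: string; // e.g. "GB", "CA"
}

const SITE_COUNTRY = "CA"; // country this ccTLD (.ca site) targets
const DISMISS_KEY = "countrySwitcherDismissed";

async function maybeShowCountrySwitcher(): Promise<void> {
  // Respect a previous dismissal so returning visitors are not nagged.
  if (localStorage.getItem(DISMISS_KEY)) return;

  const res = await fetch("/api/geo"); // hypothetical endpoint
  const geo: GeoResponse = await res.json();
  if (geo.countryCode === SITE_COUNTRY) return;

  // Fixed bar at the bottom of the viewport; content above stays visible.
  const banner = document.createElement("div");
  banner.style.cssText =
    "position:fixed;bottom:0;left:0;right:0;padding:8px;background:#eee;";
  banner.innerHTML =
    `Looks like you are browsing from ${geo.countryCode}. ` +
    `<a href="/choose-country">Switch site?</a> <button id="dismiss">Close</button>`;
  document.body.appendChild(banner);

  banner.querySelector("#dismiss")!.addEventListener("click", () => {
    localStorage.setItem(DISMISS_KEY, "1"); // remember the choice
    banner.remove();
  });
}

void maybeShowCountrySwitcher();
```

Because the banner is small and easily dismissible, it should stay within Google's stated exceptions, and the same markup can be served to users and crawlers alike.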
-
Related Questions
-
Google inaccurate results: Common or error?
Hi all, while searching for our primary keyword, I can see two websites in the second page of results that are unrelated to the keyword or industry, but whose company name is this keyword. For example, if I wanted to rank for "SEO" and searched for it, there are two websites called "seo trucks" and "seo paints". I wonder how Google is ranking these websites for a high-competition keyword with 1 million searches per month. Does the keyword in the URL, and the keyword mentioned across the website as their brand name, take over other potential ranking factors like backlinks, relevant content, user clicks, etc.? Thanks
Algorithm Updates | vtmoz
-
404s in Google Search Console and javascript
At the end of April, we made the switch from http to https, and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he/she is now attempting to read more JavaScript, and will occasionally try to parse out and "read" a URL in a string of JavaScript code where no URL is actually present. So, I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc., etc.... But Google is also returning hundreds of otherwise correct URLs with a .html extension, when our CMS generates URLs with a .uts extension. For example:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab, it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions!
Liz Micik
Algorithm Updates | LizMicik
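A pattern-based rule can cover every phantom .html variant at once, instead of one redirect per URL as they surface in GSC. Here is a minimal sketch; the Express server and the route shape are assumptions for illustration (not the site's actual stack), simply mirroring the example URL above.

```typescript
// Sketch: one regex-based 301 that maps any stray .html request onto the
// .uts URL the CMS actually generates, e.g.
//   /thumbnail/.../8335.html  ->  /thumbnail/.../8335.uts
// Express and the /thumbnail prefix are illustrative assumptions.

import express from "express";

const app = express();

// Match any category-style path ending in .html and 301 to its .uts twin.
app.get(/^\/thumbnail\/.+\.html$/, (req, res) => {
  const target = req.path.replace(/\.html$/, ".uts");
  res.redirect(301, target);
});

app.listen(3000);
```
-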
Removing an old Google places listing for a newer version?
Hey there, I was wondering whether you could help me out with the following. One of our clients has a Google Places listing that we created for their business, but it appears to be blocked by, or at least conflicting with, an old listing. As such, Google appears to be showing the old listing, with an outdated URL and company name, rather than the new one. Does anyone know how I can go about removing the old listing, or signalling that the newer one is now more relevant? Unfortunately, I don't have the logins for the old Places listing.
Old listing: https://plus.google.com/105224923085379238289
New listing: https://plus.google.com/b/114641937407677713536/114641937407677713536
Algorithm Updates | Webrevolve
-
Page details in Google Search
I noticed this morning a drop in the SERPs for a couple of my main keywords. And even though this is a little annoying, the more pressing matter is that Google is not displaying the meta title I have specified for the majority of my site's pages, despite one being specified and knowing my site has them in place. Could this sudden change to not using my specified titles be the cause of the drop? And why would Google not be using them in the first place, when they are there to be used? The title currently being displayed in the SERPs is not anything that has been specified in the past or picked up from the latest crawl. Any insight would be appreciated. Tim
Algorithm Updates | TimHolmes
-
Google Multiple Results
With Google's penchant for listing, at times, many results from the same domain one on top of the other, is it now advisable not to worry about having multiple pages on the same site targeting the same or very similar keywords? Is this internal keyword/page competition one less thing I have to worry about, or at least worry about less? Thanks! Best... Jane
Algorithm Updates | 94501
-
Algorithmic Update?
Does anyone know if there's been an algorithmic update for google.co.uk in the last couple of days? My site has dropped 15 places or more for all of its terms and I'm trying to work out the cause!!! Thanks.
Algorithm Updates | PeterAlexLeigh
-
How To Rank High In Google Places?
Hello SEOmoz, this question has been hounding me for a long time, and I've never seen a single piece of reliable information on the web that answers it. Anyway, here's my question: supposing there are three Google Places listings for three different websites with the same categories, almost the same keywords, and the same district/city/IP, how does Google rank one higher than the others? Or, simply put, if you owned one of those websites and wanted to rank higher than your competitors in Google Places search results, how would you do it? A number of theories were brought up by some of my colleagues:
1. The age of the listing
2. The number of links pointing to the listing (supposing that one can build links to one's listing)
3. The name/URL of the listing, tags, description, etc.
4. The address of the listing
5. Authority of the domain (linked website)
You see, some listings have no description and only one category, and yet they rank number one for a specific term/keyword, whereas others have complete categories, descriptions, etc. If you could please give me a definite answer I will surely appreciate it. Thank you very much and more power!
Algorithm Updates | LeeAnn30
-
Is this a possible Google penalty scenario?
In January we were banned from Google due to duplicate websites caused by a server configuration error by our previous webmaster. Around 100 of our previously inactive domain names were defaulted to the directory of our company website during a server migration, thus showing the exact same site 100 times... obviously Google was not game and banned us. At the end of February we were allowed back into the SERPs after fixing the issue, and we have since steadily regained long-tail keyword phrase rankings, but in Google we are still missing our main keyword phrase. This keyword phrase brings in the bulk of our best traffic, so obviously it's an issue. We've been unable to get above position 21 for this keyword, but in Yahoo, Bing, and Yandex (a Russian search engine) we're in positions 3, 3, and 7 respectively. It seems to me there has to be a penalty in effect, as this keyword gets between 10 and 100 times as much traffic in Google as any of the ones we're ranked for. What do you think? EDIT: I should mention that in the 4-5 years prior to the ban we had been ranked between 15th and 4th in Google, 80% of the time on the first page.
Algorithm Updates | ACann
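For readers hitting the same duplicate-domain problem, the usual fix is a single host-level rule rather than per-domain work: anything arriving on a non-canonical hostname gets one 301 to the real site. A minimal sketch, assuming an Express server and a placeholder hostname (not the poster's actual domain):

```typescript
// Sketch: collapse all parked/secondary domains onto the canonical host
// with a single 301, so no duplicate copies of the site are ever served.
// The hostname below is a placeholder, not the poster's real domain.

import express from "express";

const CANONICAL_HOST = "www.example.com";

const app = express();

app.use((req, res, next) => {
  if (req.hostname !== CANONICAL_HOST) {
    // Preserve path and query string while switching hosts.
    return res.redirect(301, `https://${CANONICAL_HOST}${req.originalUrl}`);
  }
  next();
});

app.listen(3000);
```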