Fetch as Google temporarily lifting a penalty?
-
Hi, I was wondering if anyone has seen this behaviour before? I haven't!
We have around 20 sites, and every one of them has lost all of its rankings (not in the index at all) since the Medic update, except when a location is added to the end of a keyword.
I set to work trying to identify a common issue across the sites, and began by fixing the speed issues flagged in PageSpeed Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds.
I did the same for a different site and got exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax.
Unfortunately the relief only lasted 6-12 hours, and then the rankings went again. It looks to me as if the sites are all suffering from some kind of on-page penalty which is lifted until the page can be assessed again, at which point the penalty is reapplied.
Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten one site, reduced keyword over-use, and added over 2,000 words to the homepage. I clicked Fetch as Google and the site came back - for six hours. So then I gave the site a completely fresh redesign, clicked Fetch as Google again, and got the same result. Since doing all that I have switched over to HTTPS with 301 redirects, and now the site is completely gone and won't come back after fetching as Google. Ugh!
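For what it's worth, here is the kind of quick check I'd run to confirm the HTTP-to-HTTPS move is answering with a single 301 rather than a chain of redirects or a stray 302 - the URL below is just a placeholder, and it assumes the Python requests package is installed.

```python
import requests

def check_redirect(url):
    """Follow redirects and print each hop with its status code."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(f"{hop.status_code}  {hop.url}")
    print(f"{response.status_code}  {response.url}  (final)")
    # Ideally there is exactly one 301 hop from http:// to the https:// page.
    if len(response.history) > 1:
        print("Warning: redirect chain detected")

# Placeholder URL - swap in a real page from the affected site.
check_redirect("http://www.example.com/")
```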
So before I dig myself even deeper, has anyone any ideas?
Thanks.
-
Unfortunately it's going to be difficult to dig deeper into this without knowing the site - are you able to share the details?
I'm with Martijn that there should be no connection between these features. The only thing I can come up with that could plausibly cause anything like what you are seeing is something related to JavaScript execution (and this would not be a feature working as intended). We know that there is a delay between initial indexing and JavaScript indexing. It seems plausible that, if there were a serious enough issue with JS execution or indexing, either that step could fail or the rendered site could look spammy enough to get penalised - which could conceivably produce the behaviour you describe, where the site ranks until Google executes the JS.
I guess my first step in investigating this would be to look at the JS requirements on your site and compare the page with and without JS rendering (and check whether there is any issue with the Chrome version that we know executes the JS render on Google's side).
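One rough way to eyeball that difference is to pull the raw HTML and the JS-rendered DOM for the same URL and compare them. The sketch below is only illustrative: the URL is a placeholder, and it assumes the requests and Playwright packages are installed (with a Chromium browser available to Playwright).

```python
import requests
from playwright.sync_api import sync_playwright

def raw_html(url):
    """Fetch the HTML as a non-rendering crawler would see it."""
    return requests.get(url, timeout=10).text

def rendered_html(url):
    """Fetch the DOM after JavaScript has executed, via headless Chromium."""
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        html = page.content()
        browser.close()
        return html

url = "https://www.example.com/"  # placeholder - use the affected page
raw, rendered = raw_html(url), rendered_html(url)
print(f"Raw HTML: {len(raw)} chars, rendered DOM: {len(rendered)} chars")
# A large gap suggests the important content only appears after JS execution,
# which is worth weighing against Google's delayed JS indexing.
```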
Interested to hear if you discover anything more.
-
Hey, it's good to have a fresh pair of eyes on it. There may be something just as simple that I've missed.
I check rankings on my Mac, on two web servers via Remote Desktop in different locations, and with two independent rank checkers, and they all show the same thing... recovery for 12-ish hours, along with traffic and conversions. Then nothing.
When the rankings disappear I can hit Fetch as Google again straight away and they come straight back. If I leave it for days, they're gone for days, until I hit fetch and then they're straight back.
cheers
-
OK, that still doesn't mean that they're not personalized, but I'll skip that part for now.
In the end, the changes that you're seeing aren't triggered by what you're doing with Fetch as Google. I'll leave it to others to see if they can shine a light on the situation.
-
Hi,
Thanks,
They're not personalised, as my rank checkers don't show personalised results.
-
Hi,
I'm afraid I have to end this dream: there is no connection whatsoever between the rankings and the Fetch as Google feature within Google Search Console. What is likely happening is that you're already getting personalized results - within a certain timeframe the ads won't be shown and the results will be different, as Google thinks that you've already seen the first results on the page the first time you Googled this.
Fetch as Google doesn't provide any signal to the regular ranking engines to say: "Hey, we've fetched something new and now it's going to make an impact on this". Definitely not at the speed that you're describing (within seconds).
Martijn.
Related Questions
-
What about a no-index backlink in the eyes of Google
I have a doubt: I built a backlink as part of SEO on a site, and when I reviewed it a couple of days later it hadn't been indexed, so I checked the site's robots.txt. It shows: User-agent: Mediapartners-Google Disallow: User-agent: * Disallow: So does a backlink like this give any support, or does it only mean the page won't be indexed in Google? To put it simply: "Does this sort of backlink to my site My Music Goals support my SEO activity or not?", given it sits on a no-index site.
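A quick way to see what those robots.txt lines actually permit is Python's built-in robotparser; the sketch below parses the rules quoted above (an empty Disallow allows everything). Note that robots.txt governs crawling, which is a separate question from whether a link passes any value.

```python
from urllib import robotparser

# The robots.txt rules quoted in the question:
# an empty Disallow means "allow everything" for that user-agent.
rules = """User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

for agent in ("Googlebot", "Mediapartners-Google", "*"):
    print(agent, "can fetch /some-page:", rp.can_fetch(agent, "/some-page"))
```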
-
Google Gives me different results
Hello everybody, my website is giving me two results for my pages... If I search on google.com for "domain.com", it shows results without my current page titles, but if I search on google.com for "site:domain.com", it shows all my website pages with the current meta titles. Why am I seeing these two different results, and how do I solve it? I have correctly redirected domain.com to www.domain.com, but it is still the same. Please suggest what I should do.
-
Should I use the Google disavow tool?
Hi, I'm a bit new to SEO and am looking for some guidance. Although there is no indication in Webmaster Tools that my site is being penalised for bad links, I have noticed over 200 spammy "payday loans" links pointing to my site (due to a hack on my site several years ago). So my question is twofold: firstly, is it normal to have spammy links pointing to your site, and secondly, should I bother to do anything about it? I did some research into the Disavow tool in Webmaster Tools and wonder whether I should use it to block all these links. Thanks
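If it does come to disavowing, the file Google accepts is plain text with optional # comment lines and one domain: or URL entry per line; a minimal sketch for generating it (the domains listed are made up) might look like this.

```python
# Hypothetical list of spammy referring domains pulled from a backlink report.
spammy_domains = ["cheap-payday-loans.example", "spam-links.example"]

with open("disavow.txt", "w") as f:
    f.write("# Spammy payday-loan links from the old hack\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")
```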
-
How to activate Page Fetching and Rendering
We have a coupons and deals website. Coupons are added to and removed from the site on a daily basis, but the crawler isn't crawling it that often. Lately we have started fetching and rendering the pages manually, but that is time-consuming as we have more than 500 store pages with coupons. So I am looking for an API or method to get the crawler to crawl the site on a defined schedule. Suppose store "x" should be crawled every other day because we update its coupons daily, whereas store "y" coupons are updated fortnightly, so it could be crawled weekly. Can somebody suggest something?
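There is no API that forces Google onto a crawl schedule, but an XML sitemap with a per-URL lastmod (and optionally changefreq) at least signals how often each store page changes. A rough sketch of generating one, with made-up store URLs and frequencies, is below.

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical store pages and how often their coupons change.
stores = [
    ("https://www.example.com/stores/x", "daily"),
    ("https://www.example.com/stores/y", "weekly"),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url, freq in stores:
    lines += ["  <url>",
              f"    <loc>{escape(url)}</loc>",
              f"    <lastmod>{date.today().isoformat()}</lastmod>",
              f"    <changefreq>{freq}</changefreq>",
              "  </url>"]
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))
```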
-
About how Google works
Hey guys, I want to ask a basic question. If I search for "Larry Page", say, I think Google looks in its index for the words "larry" and "page" separately and then combines them. But here is the question: can Google show a result in which only "Larry" exists on the page, with no synonym or stem of "Page" present? And if that can happen, how would that page be shown for the "larry page" query? Thank you.
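To make that mental model concrete, here is a toy sketch (entirely made-up documents, not how Google actually stores anything) of an inverted index with AND semantics: each term maps to the documents containing it, and a multi-word query intersects those sets, so a page containing only "larry" would not match both terms.

```python
# Toy documents - purely illustrative.
docs = {
    1: "larry page co-founded google",
    2: "larry the cat lives in london",
    3: "home page design tips",
}

# Build the inverted index: term -> set of document IDs containing it.
index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

query = ["larry", "page"]
# AND semantics: only documents containing every query term.
matches = set.intersection(*(index.get(term, set()) for term in query))
print(matches)  # {1} - doc 2 has only "larry", doc 3 has only "page"
```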
-
My whole directory dropped from Google
Hi there, I have lost a whole directory in the Google rankings, and this is where my traffic was coming from. Do you know why this would happen? It is the eCommerce part of the site. If I do site:www.domain.com the site is there, but site:www.domain.com/directory is not there any more. Thanks for your help.
-
How to improve the ranking of a website again, after being penalized by Google?
The ranking of our website has gone down in the past two months. The reason, I believe, is that we had more than 300,000 spammy comments posted on it (the website is built on WordPress), so Google treated it as an un-monitored forum and penalized it. We have deleted the old comments and new comments can no longer be posted. We need suggestions on what else we should do to rank better. Any advice would be very welcome.
-
Sitemap.xml problem in Google webmaster
Hi, my sitemap.xml is not submitting correctly in Google Webmaster Tools. There are 697 URLs submitted but only 56 are in Google's index. At the top of Webmaster Tools it says "http://www.example.com/sitemap.xml has been resubmitted", but when I click the status button a red X appears. Any suggestions about this? Thanks...
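Before digging further, it is worth ruling out problems with the sitemap itself: that it parses as valid XML and that the URLs it lists return 200 without redirecting, since either issue can keep submitted URLs out of the index. A rough sketch (the sitemap URL is a placeholder, and it assumes the requests package is installed) is below.

```python
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_url = "http://www.example.com/sitemap.xml"  # placeholder

# Parse the sitemap and pull out every <loc> entry.
root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")

# Spot-check a handful of them for non-200 responses or redirects.
for url in urls[:10]:
    r = requests.get(url, allow_redirects=False, timeout=10)
    if r.status_code != 200:
        print(r.status_code, url)
```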