Fetch as Google temporarily lifting a penalty?
-
Hi, I was wondering if anyone has seen this behaviour before? I haven't!
We have around 20 sites and each one has lost all of its rankings (not in the index at all) since the Medic update, except when a location is added to the end of a keyword.
I set to work trying to identify a common issue on each site, and began by fixing the speed issues flagged in PageSpeed Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds.
I did the same for a different site and exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax.
Unfortunately this relief only lasted 6-12 hours, and then the rankings went again. It seems as though the sites are all suffering from some kind of on-page penalty that is lifted until the page can be assessed again, at which point the penalty is reapplied.
Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten one site, reduced keyword over-use, and added over 2,000 words to the homepage. I clicked Fetch as Google and the site came back - for 6 hours. So then I gave the site a completely fresh redesign, again clicked Fetch as Google, and got the same result. Since doing all that I have swapped over to HTTPS, 301 redirected everything, and now the site is completely gone and won't come back after fetching as Google. Ugh!
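As a sanity check on the HTTPS move, it's worth confirming that each old URL answers with a single permanent 301 hop to its HTTPS twin rather than a chain of temporary redirects. A rough sketch of that check, assuming you've already captured each hop's status code and Location header (e.g. with `curl -sI`); the helper name is made up for illustration:

```python
# Sketch: validate that the redirect from an HTTP URL is a single
# permanent (301) hop to its HTTPS equivalent. "hops" is a list of
# (status_code, location) pairs captured from each response in turn.

def check_https_migration(start_url, hops):
    """Return a list of warnings; an empty list means the chain looks clean."""
    warnings = []
    if len(hops) > 1:
        warnings.append(
            f"{start_url}: redirect chain of {len(hops)} hops (dilutes signals)")
    for status, location in hops:
        if status != 301:
            warnings.append(
                f"{start_url}: hop uses {status}, expected a permanent 301")
    final = hops[-1][1] if hops else None
    expected = start_url.replace("http://", "https://", 1)
    if final != expected:
        warnings.append(
            f"{start_url}: chain ends at {final!r}, expected {expected!r}")
    return warnings
```

For example, `check_https_migration("http://example.com/page", [(301, "https://example.com/page")])` comes back empty, while a 302 or a multi-hop chain gets flagged.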
So before I dig myself even deeper, has anyone any ideas?
Thanks.
-
Unfortunately it's going to be difficult to dig deeper into this without knowing the site - are you able to share the details?
I'm with Martijn that there should be no connection between these features. The only thing I can come up with that could plausibly cause anything like what you're seeing is something related to JavaScript execution (and that would not be a feature working as intended). We know there is a delay between initial indexing and JavaScript indexing. It seems plausible that a serious enough issue with JS execution or indexing, whether that step failing outright or making the site look spammy enough to get penalised, could produce the behaviour you describe: the site ranks until Google executes the JS.
I guess my first step in investigating this would be to look at the JS requirements on your site and compare the page with and without JS rendering (and check whether there is any issue with the Chrome version that we know Google uses to execute the JS render on its side).
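A rough way to quantify that with/without-JS difference, assuming you can grab the rendered DOM from a headless Chrome session yourself (these function names are just a sketch, not any Google or Moz tooling), using only the Python standard library:

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words.extend(data.split())

def js_dependent_words(raw_html, rendered_html):
    """Words visible only after rendering, i.e. content Google can only
    see once it executes the JavaScript."""
    def words(html):
        parser = _TextExtractor()
        parser.feed(html)
        return set(parser.words)
    return words(rendered_html) - words(raw_html)
```

Feeding it the raw HTML from the server and the rendered DOM shows exactly which words depend on JS execution; if most of your ranking copy turns up in that set, the delayed-JS-indexing theory gets a lot more plausible.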
Interested to hear if you discover anything more.
-
Hey, it’s good to have a fresh pair of eyes on it. There may be something simple that I’ve missed.
I check rankings on my Mac, on 2 webservers via Remote Desktop in different locations, and with 2 independent rank checkers, and they all show the same thing... recovery for 12-ish hours, along with traffic and conversions. Then nothing.
When the rankings disappear I can hit Fetch as Google again straight away and they come straight back. If I leave it for days, they’re gone for days, until I hit fetch and then they’re straight back.
cheers
-
Ok, that still doesn't mean that they're not personalized. But I'll skip on that part for now.
In the end, the changes that you're seeing aren't triggered by what you're doing with Fetch as Google. I'll leave it to others to see if they can shine a light on the situation.
-
Hi,
Thanks,
They’re not personalised, as my rank checkers don’t show personalised results.
-
Hi,
I'm afraid I have to crush this dream: there is no connection whatsoever between the rankings and the Fetch as Google feature within Google Search Console. What is likely happening is that you're getting personalized results; within a certain timeframe the results will be different because Google thinks you've already seen the first results on the page the first time you Googled this.
Fetch as Google doesn't provide any signal to the regular ranking engines to say: "Hey, we've fetched something new and now it's going to make an impact on this". Definitely not at the speed that you're describing (within seconds).
Martijn.
Related Questions
-
Google Indexing - what did I miss?
Hello, all SEOers~ I renewed my website about 3 weeks ago, and in order to preserve SEO value as much as possible I set up 301 redirects, an XML sitemap and so on to minimize possible losses. But the problem is that about a week after the site renewal, my team somehow made a mistake and removed all the 301 redirects. So now my old site URLs are all gone from Google's index, my new site is not getting indexed by Google, and my traffic and rankings are also gone... OMG. I checked Google Webmaster Tools, but it didn't show any special message other than Googlebot finding an increase in 404 errors, which is obvious. I also used "Fetch as Googlebot" from Webmaster Tools to improve the chances of indexing, but it doesn't seem to be doing much. I am re-doing the 301 redirects today, but I am not sure it means anything anymore. Any advice or opinions? Thanks in advance~!
Technical SEO | Yunhee.Choi
-
Should I remove these pages from the Google index?
Hi there, Please have a look at the following URL http://www.elefant-tours.com/index.php?callback=imagerotator&gid=65&483. It's a "sitemap" generated by a Wordpress plug-in called NextGen Gallery, and it maps all the images that have been added to the site through this plugin, which is quite a lot in this case. I can see that these "sitemap" pages have been indexed by Google and I'm wondering whether I should remove them or not. In my opinion these are pages that a search engine would never want to serve as a search result and that a visitor would never want to see. Attracting any traffic through Google Images is irrelevant in this case. What is your advice? Block them, leave them indexed, or something else?
Technical SEO | Robbern
-
Determining the Cause of a Penalty
I received a link removal request from a site who said that they were penalized. I confirmed that they were #1 for the competitive keyword phrase that is also their domain name and now they are #10. Here are some things I noticed about the site: Over 2,500 linking domains. Dozens of high quality linking domains like Huffington Post and Mashable. Some off topic guest post links, e.g. on a SEO site. Guest post anchor text was usually their site name which is an exact match domain. Lots of top 100 resource pages that received good organic links. Infographics with links using their domain name as the anchor text. Relatively few spammy links according to Open Site Explorer. Overall their site's links were engineered but using tactics that most would consider "white hat." I don't think they violated any Google Webmaster Guidelines. Why were they penalized? What do you think?
Technical SEO | ProjectLabs
-
How to Stop Google from Indexing Old Pages
We moved from a .php site to a java site on April 10th. It's almost 2 months later and Google continues to crawl old pages that no longer exist (225,430 Not Found Errors to be exact). These pages no longer exist on the site and there are no internal or external links pointing to these pages. Google has crawled the site since the go live, but continues to try and crawl these pages. What are my next steps?
Technical SEO | rhoadesjohn
-
Google authorship syntax, plus no follow
I have seen two forms of rel=author syntax. Are they both valid? (1) (2) Second, does nofollow take away authorship? Is there any point in doing rel=author for a link that is rel="nofollow"? Like this:
Technical SEO | scanlin
-
One Keyword Penalty
Hi There, Quick question for everyone. Is it possible to get penalized at keyword level rather than page level? I have a site that only seems to be penalized for one keyword, which is currently on page 22, whilst the rest are on page 1 or page 2. I came to the site late, so I have no idea when the site lost its ranking for this keyword after a site redesign, but the on-page content is almost the same. Kind Regards, Neil
Technical SEO | nezona
-
Dramatic Decrease in Google Organic Traffic Indicates a Penalty But None Found
So we've been having some difficulty with one of our websites since we split it in half and moved one section of content to a new domain with a new name at the end of May. http://www.dialtosave.co.uk/mobile/ was moved to http://www.somobile.co.uk, and in the following 6 weeks Google organic traffic has fallen to minuscule levels that seem to indicate a more serious issue than just low rankings. Initially, when the site was moved, the 301s transferred the authority very quickly and the new website pages ranked well. Now some of them simply won't rank at all unless you include the name of the website, "somobile". Here is one of the current rankings that indicates an issue:
"somobile" - 1
"somobile mobile phones" - not in top 50
These are some of the terms we used to rank in the top 10 for on Google UK, and still do on Bing UK, but which don't rank in the top 50 on Google UK now:
samsung galaxy ace
apple iphone 5 deals
samsung tocco icon
Our Webmaster Central account says that only 30% of the pages in our sitemap are in the index. It seems like a penalty has been imposed, but our reconsideration request (just submitted because it seemed like a sensible next step) came back saying there were no manual actions taken. Can you see what might be causing the problem for us? I would have thought it was the new domain (with fewer direct links and less brand credibility), or content issues, but then I would have expected the rankings to drop by a few pages rather than the pages disappearing altogether.
Technical SEO | purpleindigo