Google: site gone from SERPs, back in 1 day, then gone again?
-
Last December we fumbled our 404 error page with a misconfigured server and broken links on the page.
Needless to say, our site dropped into an abyss - for 4 months. Yesterday we appeared again in our regular placement (actually stronger placement). Our site has been around since 09-Jan-1999 and has been a highly regarded site with good link structure and solid content for engineers.
Then today we're gone again. Yesterday morning we had 125,000 pages indexed with Google which grew to 187,000 by late afternoon. Then this morning we're nowhere to be found and only 21,800 pages are now indexed.
We've been working with Bruce Clay Inc. on an SEO site audit, making several updates to bring the site in line with good SEO practices. We haven't made any changes to the site since last Friday, April 13th (maybe the unlucky number has something to do with it...).
Any ideas, insight, or suggestions? Thanks!
-
Nathan please see my response to this question in the Private Q&A section. In short, I don't think this has to do with Google's goof-up a couple weeks ago, or with some of the other shifts, but I do think you are in violation of some of their guidelines on a pretty massive scale, although unintentionally and with good reason. I'll go ahead and paste my answer below, but the Private version includes links and other information that could identify the site.
Hello Nathan,
There were some pretty major shifts on the 16th, 19th and 24th-26th so the time-frame raises a red flag. However, with the little information I have (basically your indexation count) I think you may have a problem that isn't related to any of the recent algorithm shifts.
Assuming you are talking about ####.com, I show 462,000 indexed URLs from that domain. Nearly half a million URLs is quite a bit to index and you can imagine that Google might want to thin that out a bit and focus only on the ones that are important and original.
If you have 460k + pages, but most of them are duplicate content or significantly duplicate content, and most of them have no external links, this would put you in danger of being affected by several different algorithms/filters/penalties put in place by Google to keep such pages from bloating their index and outranking what they think to be "better" content.
That can be a hard pill to swallow because you know your content is good and people like what they find there. But let's look at this from the perspective of an impartial machine...
The following EXACT phrase appears, word-for-word, on about 22,800 different pages, most of them from within your site: "PHRASE REMOVED FOR PUBLIC VIEWING"
The following is typical of the "Related Terminology" section of your pages, which could be interpreted by Google as being keyword spam: "You may have searched any one of these terms to find this product: Keyword1, keyword2, keyword3, keyword4, keyword5, keyword6, keyword7, keyword8, so forth and so-on for a few dozen keywords".
A lot of these pages, possibly most of them, have very little unique or exclusive content. Instead, they list out features and uses from a database. Because of this you have many thousands of pages all drawing on the same potential pool of words, each choosing to show more or less the same words in various orders and combinations. Furthermore, it is obvious that the pages are generated by a machine. Looking at this example, a "better" page would be one that has an introduction telling the visitor, in paragraph form, what Polybenzimidazole (PBI) is and what it's used for, in addition to the list of features and uses: http://www.###.com/###/ . I'm sure most of your users will already know what PBI is for if they search the site for it, but remember we're dealing with machine algorithms designed to detect spamming attempts such as article spinning, which uses pretty much the same technique of switching around the order of words to generate hundreds or thousands of "articles" from a single original.
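To make the machine's-eye view concrete, here is a minimal sketch (all page text invented for illustration, not taken from the site in question) of how a near-duplicate filter might compare pages: break each page into word shingles and measure Jaccard overlap. Reordering the same pool of feature words barely lowers the score, while genuinely original prose scores near zero.

```python
# Hypothetical sketch of near-duplicate scoring via word shingles.
# The page texts below are invented examples, not real site content.

def shingles(text, k=2):
    """Return the set of k-word shingles (overlapping n-grams) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two datasheet pages built from the same pool of feature words,
# just reordered -- the pattern that resembles article spinning.
page_a = "high heat resistance excellent chemical resistance low creep good wear"
page_b = "excellent chemical resistance good wear high heat resistance low creep"
# A page with original prose content.
page_c = "polybenzimidazole is a synthetic fiber with extremely high melting point"

print(round(jaccard(shingles(page_a), shingles(page_b)), 2))  # high overlap
print(jaccard(shingles(page_a), shingles(page_c)))            # no overlap
```

Real duplicate-detection systems are far more sophisticated (e.g. MinHash over much larger shingle sets), but the underlying idea is the same: shuffled word order does little to make two pages look distinct.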
I wouldn't venture to provide specific advice on how to fix these issues without knowing more about your business. My suggestion is to look for a reputable outside SEO agency who can help you overcome these issues, which may involve removing a lot of pages from the index, allowing more content to be seen on each datasheet, or some other measures.
Good luck!
-
I hope they make the same mistake again - our website finally returned to normal positions in Google SERPs that we were at prior to our 404 error page mistake back in Dec. 2011.
Then yesterday, once again, we went missing, and we're baffled as to why and what to do... We did notice our site:www.ides.com count rose from 21,800 pages yesterday to 124,000 today.
-
No, it's Google that has work to do. They admitted it was their fault: something went wrong in their classifier.
-
If my site was mistaken for a parked domain.....then I have a lot of work to do. lol
-
It might actually have been Google's fault this time. They misclassified some sites as parked domains and they dropped out of the index. See the Search Engine Land post at http://searchengineland.com/dropped-in-rankings-google-mistake-over-parked-domains-118979.
-
Our representative at Bruce Clay Inc just replied with the following info, so something's up...
"We’re pretty clueless too. The whole Internet went a little crazy yesterday because of a big shakeup in Google’s results, and there are a few theories as to what exactly happened. From the forums it looked like a lot of sites lost all their rankings."
They are looking into our Google Webmaster Tools and Analytics accounts to see if they can pick up any clues.
-
We disappeared completely from the SERPs for about 16 hours yesterday; then last night the site popped back up and we were ranking higher for all our keywords. I deleted one duplicate page in a directory, but I'm not sure what transpired to cause these events. Hopefully someone else will have some input and can let me know if this is a common occurrence.
Got a burning SEO question?
Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.
Related Questions
-
Google indexing .com and .co.uk site
Hi, I am working on a site that is experiencing indexation problems. To give you an idea: the website should be www.example.com; however, Google seems to index www.example.co.uk as well, and doesn't seem to honour the 301 redirect on the .co.uk site. This is causing quite a few reporting and tracking issues. This first happened in November 2016, when an issue was identified in the DDoS protection which meant we had to point www.example.co.uk to the same DNS as www.example.com. This was implemented and made no difference. I cleaned up the .htaccess file and this made no difference either. In June 2017 Google finally indexed the correct URL, but I can't be sure what changed it. I have now migrated the site onto HTTPS, and www.example.co.uk has been reindexed in Google alongside www.example.com. I have been advised that HTTP needs to be removed from the DDoS protection, which is in motion. I have also redirected http://www.example.co.uk straight to https://www.example.com to prevent chained redirects. I can't block the site via robots.txt unless I take the redirects off, which could mean I lose my rankings. I should also mention that I haven't actually lost any rankings; some URLs have simply been replaced with .co.uk versions while others have remained the same. Could you please advise what further steps I should take to ensure the correct URLs are indexed in Google?
Technical SEO | | Niki_10 -
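The single-hop redirect behaviour described in the question above can be sketched as server-side logic. This is a hypothetical illustration using the placeholder domains from the question, not a drop-in fix: every alias host and scheme variant should 301 directly to the one canonical URL, so crawlers never follow a chain.

```python
# Hypothetical sketch: map any requested URL variant to a single
# 301 hop toward the canonical host. Domains are the placeholders
# from the question (example.com / example.co.uk), not real targets.
from urllib.parse import urlsplit, urlunsplit

CANONICAL_SCHEME = "https"
CANONICAL_HOST = "www.example.com"
ALIAS_HOSTS = {"www.example.co.uk", "example.co.uk", "example.com"}

def canonical_redirect(url):
    """Return (status, target): 200 if already canonical, 301 otherwise."""
    parts = urlsplit(url)
    if parts.scheme == CANONICAL_SCHEME and parts.netloc == CANONICAL_HOST:
        return 200, url
    if parts.netloc in ALIAS_HOSTS or parts.netloc == CANONICAL_HOST:
        # Preserve path and query; rewrite scheme and host in one hop.
        target = urlunsplit(
            (CANONICAL_SCHEME, CANONICAL_HOST, parts.path, parts.query, "")
        )
        return 301, target
    return 404, url

print(canonical_redirect("http://www.example.co.uk/widgets?page=2"))
print(canonical_redirect("https://www.example.com/widgets"))
```

In practice this logic would live in the web server or CDN configuration rather than application code; the point is simply that each variant resolves in one 301, never via an intermediate hop.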
Hi! I'm wondering whether for keyword SEO - a url should be www.salshoes.com/shoes/mens/day-wear (so with a few parent categories) or www.salshoes.com/shoes-mens-day-wear is ok for on page optimization?
Technical SEO | | SalSantaCruz0 -
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). The 'Crawl Stats' section also shows 186 pages crawled on the 26th. It then lists sub-sitemaps, all of which are HTTP rather than HTTPS, which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are: http://www.domain.com/marketing-sitemap.xml, http://www.domain.com/page-sitemap.xml and http://www.domain.com/post-sitemap.xml. There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Technical SEO | | Dan-Lawrence
There is also this warning for the sitemap URLs below: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page", for: http://domain.com/en/post-sitemap.xml, https://www.domain.com/page-sitemap.xml and https://www.domain.com/post-sitemap.xml. I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted or removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
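One way to sanity-check the mixed HTTP/HTTPS symptom described in this question is to parse the sitemap index and flag any child sitemaps that don't match the canonical scheme and host. A minimal sketch, using a made-up index modelled on the placeholder URLs in the question:

```python
# Hedged sketch: audit a sitemap index for child sitemaps that don't
# start with the canonical URL prefix. The index XML below is a
# fabricated example modelled on the question's placeholder domains.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def non_canonical_sitemaps(index_xml, canonical_prefix="https://www.domain.com/"):
    """Return child sitemap URLs that don't begin with the canonical prefix."""
    root = ET.fromstring(index_xml)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return [u for u in locs if not u.startswith(canonical_prefix)]

index_xml = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.domain.com/page-sitemap.xml</loc></sitemap>
  <sitemap><loc>http://www.domain.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>http://domain.com/en/post-sitemap.xml</loc></sitemap>
</sitemapindex>"""

print(non_canonical_sitemaps(index_xml))
```

Any URLs this flags are candidates for regenerating the sitemap index at the source (e.g. the CMS or sitemap plugin emitting HTTP URLs) before resubmitting it in Search Console.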
How could you make a URL/Breadcrumb structure appear different in Google than when you click into site?
I'm seeing a competitor make their URL/breadcrumb structure appear different in Google than on the site. Google shows a 3-4 category silo for the page, but once clicked the page is off root. How could you do this?
Technical SEO | | TicketCity0 -
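For context on the question above: per Google's structured-data documentation, breadcrumb trails shown in search results can be driven by BreadcrumbList markup rather than the URL itself, so a page whose URL lives off the root can still declare a category silo. A hedged sketch that generates such JSON-LD for a hypothetical site (all names and URLs are invented):

```python
# Hypothetical sketch: build schema.org BreadcrumbList JSON-LD from a
# list of (name, url) pairs. The site and categories are invented.
import json

def breadcrumb_jsonld(trail):
    """Build a schema.org BreadcrumbList from ordered (name, url) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

# Page served at e.g. /some-event-12345, but marked up with a
# three-level category trail for the SERP display.
trail = [
    ("Events", "https://www.example.com/events/"),
    ("Concerts", "https://www.example.com/events/concerts/"),
    ("Rock", "https://www.example.com/events/concerts/rock/"),
]
print(breadcrumb_jsonld(trail))
```

The generated object would be embedded in the page inside a script tag of type application/ld+json; Google treats it as a hint, not a guarantee, of how the breadcrumb appears in results.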
How to display the full structure of website on Google serps
I have been searching around but have been unable to find information on how we can control or list the top pages of a website on Google's first page. For example, if we type "seomoz" into Google, we see the main listing with six sublistings (sitelinks), which link to Blog, SEO Tools, Beginner's SEO Guide, Learn SEO, Pricing & Plans and Login. My question is: can we control these listings, i.e. what to display and what not? And if yes, how can we get this type of visibility on the first page, by using HTML or XML sitemaps, or is there something most websites are missing? This type of display comes up for very few websites; most websites appear with single URLs.
Technical SEO | | ngupta10 -
Redirecting a old aged site to a new exact match site?
Hi All, I have a question. I have two sites in the same sector and want some help. Site 1 is an old site, started back in 2003, with a fair number of links to it; it has a PR of 3 and some good links, but doesn't rank much for any keywords at the moment. Site 2 is an aged domain, newly developed with unique content, and is a good exact-match .com domain. So would there be any benefit in redirecting site 1 to site 2 to pass on the SEO benefits and give it a head start for link building? Or is it better to develop and work on each site separately? The sector is health insurance. Thanks
Technical SEO | | macky71 -
Does Google use the Wayback Machine to determine the age of a site?
I have a site that I had removed from the Wayback Machine because I didn't want old versions to show. However, I noticed that many SEO tools now show a domain age of zero instead of the six years since I registered it. My question is: what do the actual search engines use to determine age when they factor it into the ranking algorithm? By having the site removed from the Wayback Machine, have I made the search engines think it is brand new? Thanks
Technical SEO | | FastLearner0