Home page suddenly dropped from index!!
-
A client's home page, which has always done very well, has just dropped out of Google's index overnight!
Webmaster Tools does not show any problem. The page doesn't even show up if we Google the company name.

The robots.txt contains:

```
# Default Flywheel robots file
User-agent: *
Disallow: /calendar/action~posterboard/
Disallow: /events/action~posterboard/
```

The only unusual thing I'm aware of is some A/B testing of the page done with Optimizely - it redirects visitors to a test page, but it's not a 'real' redirect, in that redirect-checker tools still see the page as a 200. Also, other pages that are being tested this way are not having the same problem.
Other recent activity over the last few weeks/months includes linking to the page from some of our blog posts using the page topic as anchor text.
Any thoughts would be appreciated.
Caro
-
Woot! So glad to see it wasn't a penalty!
-
Michael,
Duplicate content wasn't the issue in the end, but your response prompted me to analyse their home page text more closely and I discovered that there was room for improvement - too much of the home page content was also present on other pages of the site. Thanks for that!
-
Everyone, this has been resolved! The problem turned out to be a code error in the canonical tag for the page. There was an extra space and slash. Ironically, the canonical tag was one of the first things we looked at, yet we all overlooked that error.
Thank you all so much for your input and assistance.
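For anyone who finds this thread later: the actual markup wasn't posted, but an error of the kind described (a stray space and an extra slash in the href) might look something like this (hypothetical URL):

```html
<!-- Broken: leading space and doubled trailing slash in the href -->
<link rel="canonical" href=" https://www.example.com//">

<!-- Fixed -->
<link rel="canonical" href="https://www.example.com/">
```

A malformed href like that can make Google ignore the tag or treat the canonical as a different (non-existent) URL, which would explain the page dropping out of the index.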
-
Thank you Michael...I'll do that.
-
I've seen a client have an internal page just suddenly be de-indexed. What appears to have happened is that Google saw it as a near duplicate of another page on their site, and dropped it from the index for that reason. Then, magically, it reappeared a week later.
You may be seeing something like this here. See what Moz Pro thinks in terms of duplicate content on your site, and if the home page gets called out along with another page.
-
Thanks so much for that info. I had not heard of Kerboo...I'll definitely check that out right away. Your input has been extremely helpful Kristina.
Caro
-
I would be incredibly surprised if internal links to the homepage caused the issue. Google expects you to have a bunch of internal links to the homepage.
What you're going to need to do now is do a thorough review of all of the external links pointing to your homepage. I would do this with a tool - I recommend Kerboo, although I'm sure there are others that could do the same thing. Otherwise, you can look through all of the links yourself and look for spam indications (steps outlined in this handy Moz article).
Either way, make sure that you pull your list of links from Ahrefs or Majestic. Ideally both, and merge the lists. Moz doesn't crawl nearly as many links.
Since you haven't gotten a manual penalty warning, you're going to have to take down as many of the spammy links you find as you can and disavow the rest. For speed, I'd recommend that you immediately upload a list of spammy links with Google's disavow tool, then start asking for actual removals.
Keep in mind that you're probably going to disavow links that were helping rankings, so expect that your homepage won't come back ranking as well for nonbranded search terms as it used to. You'll probably want to start out uploading a very conservative set of URLs to the disavow tool, wait a couple of days to see if that fixes the problem, upload a bigger set, check, etc.
Good luck!
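For reference, the disavow file Google expects is plain text, one entry per line, with `#` for comments; a minimal sketch (hypothetical domains and URLs):

```
# Spammy directory links - removal requested, no response
domain:spammy-directory.example
domain:link-farm.example

# Individual bad URLs
http://blog-comments.example/page1.html
```

Using a `domain:` entry disavows every link from that domain, which is usually safer than chasing individual URLs on an obviously spammy site.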
-
No luck, Kristina.
I'm wondering if it's an algorithmic penalty in response to backlinks. We've never done shady linking, but over the years the site has gathered some strange links. Or is there some chance that about two dozen anchor-text links from their blog to the home page could have done it? I deleted them, but I can't request reconsideration if the penalty isn't manual.
-
Any luck so far? Usually it only takes a few hours for Google to crawl new pages after you submit them in GSC, in my experience.
-
I see no serious crawl issues. Mostly things we're already addressing, like duplicate content caused by blog tags and categories, missing meta descriptions (mostly in our knowledge base, so not an issue) and stuff like that.
When I checked the home page alone it said zero high, medium or low priority issues.
The page was only de-indexed very recently. Maybe the next crawl will catch something. Same with GSC - it looks like the last two days of info isn't available yet.
I should mention the home page Optimizely test had been running for at least a week before the page got dropped (I'll get the actual date from the client), plus they have had a product page running a test for weeks with no problem. But I still think your suggestion to pause the test is a good one, as I don't want anything to hinder the process of fixing this.
Update: Optimizely has been paused, code removed, home page submitted in GSC.
-
Okay, I ran some tests, and can't see anything that could've gone wrong. That does make it seem like a penalty, but given that this coincided with setting up Optimizely, let's go down that path first.
While your team is taking down the test - have you checked Moz to see if its crawler sees anything that could be causing an issue? I set up my Moz crawler to look into it, but it'll take a few days.
-
Thanks Kristina,
We have not tried pausing the test, but I can request they do that. It may be a good idea to do it regardless of whether it's causing the problem or not, while we get this issue sorted out.
Fetch as Google gave this result: HTTP/1.1 200 OK - so that looks OK. I understand this also submits your page to Google as an actual indexing request?
site:https://website.com shows all our pages except the home page.
So it looks like Google has dropped it from the index entirely, rather than just deciding not to rank it.
I deleted some links from the blog to the home page - they had a keyword phrase as the anchor text. There were about 20 links that had accumulated over a few months. Not sure if that's the issue.
Still no manual penalty notice from Google.
-
Hm, I've done a lot with Optimizely in the past, and it's never caused an SEO problem, but it's completely possible something went wrong. Since that's your first inkling, have you tried pausing that test and removing the Optimizely code from the homepage? Then you can determine whether or not it's an Optimizely problem.
Another thing you can do is use the Fetch as Googlebot feature in GSC. Does GSC say it can fetch the page properly?
If it says it can, try searching for "site:www.yourcompanysite.com". This will show if Google's got your URL in its index. If nothing comes up, it's not there; if it comes up, Google's decided not to rank it for some reason.
After those steps, get back to us so we can figure out where to go from there!
Good luck,
Kristina
-
Jordan, not on the original version of the home page, but there is on the B test version.
The way I understand it, the B version is a JavaScript page that is noindexed. Their redirect system seems to leave the original page looking like there is no redirect. Are you suggesting we use a 302 instead?
-
Also, Google recommends you 302 those URLs instead of returning a 200 HTTP code. You can read more in their best practices for A/B testing.
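As a sketch of what that guidance means in practice (hypothetical URLs): in a redirect-based test, a request for the original page answers with a temporary redirect to the variant, so Google keeps the original URL in its index rather than the test URL:

```
GET / HTTP/1.1
Host: www.example.com

HTTP/1.1 302 Found
Location: https://www.example.com/home-variant-b
```

A 302 signals the move is temporary; a 301 (or a variant page returning its own 200) risks Google swapping out the original URL.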
-
Is there a meta 'noindex, nofollow' tag implemented by chance?
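That is, something like this in the page's `<head>`:

```html
<meta name="robots" content="noindex, nofollow">
```

If a tag like this got copied onto the live home page (e.g. from a staging template or a test variant), it alone would explain the page dropping out of the index.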