Home page suddenly dropped from index!!
-
A client's home page, which has always done very well, has just dropped out of Google's index overnight!
Webmaster Tools does not show any problem. The page doesn't even show up if we Google the company name. The robots.txt contains:

# Default Flywheel robots file
User-agent: *
Disallow: /calendar/action:posterboard/
Disallow: /events/action~posterboard/

The only unusual thing I'm aware of is some A/B testing of the page done with Optimizely - it redirects visitors to a test page, but it's not a 'real' redirect, in that redirect-checker tools still see the page as a 200. Also, other pages that are being tested this way are not having the same problem.
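For what it's worth, those Disallow rules shouldn't touch the home page. A quick check with Python's standard-library robots parser (rules copied from the file above; example.com is just a placeholder for the real domain) seems to confirm it:

```python
from urllib.robotparser import RobotFileParser

# Rules copied from the robots.txt quoted above; example.com is a placeholder.
rules = """\
User-agent: *
Disallow: /calendar/action:posterboard/
Disallow: /events/action~posterboard/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/"))                           # True - home page is crawlable
print(rp.can_fetch("*", "https://example.com/events/action~posterboard/")) # False - matches a Disallow rule
```

So whatever dropped the page, it doesn't look like robots.txt is the culprit.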
Other recent activity over the last few weeks/months includes linking to the page from some of our blog posts using the page topic as anchor text.
Any thoughts would be appreciated.
Caro -
Woot! So glad to see it wasn't a penalty!
-
Michael,
Duplicate content wasn't the issue in the end, but your response prompted me to analyse their home page text more closely and I discovered that there was room for improvement - too much of the home page content was also present on other pages of the site. Thanks for that!
-
Everyone, this has been resolved! The problem turned out to be a code error in the canonical tag for the page: there was an extra space and slash. Ironically, the canonical tag was one of the first things we looked at, yet we all overlooked that error.
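In case it helps anyone else hunting the same kind of bug: a stray space or slash in a canonical href is surprisingly easy to miss by eye, and a rough script can catch it. This is just a sketch - the sample tags below are made up to illustrate the sort of error we had, not the client's actual markup:

```python
import re

def check_canonical(html):
    """Return (href, problems) for the first rel=canonical tag found.

    A rough regex-based check (assumes rel comes before href in the tag);
    a real audit would use an HTML parser.
    """
    m = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']*)["\']',
                  html, re.IGNORECASE)
    if not m:
        return None, ["no canonical tag found"]
    href = m.group(1)
    problems = []
    if href != href.strip():
        problems.append("leading/trailing whitespace in href")
    if re.search(r'(?<!:)//', href):
        problems.append("doubled slash in path")
    if not href.startswith(("http://", "https://")):
        problems.append("not an absolute URL")
    return href, problems

# Made-up examples: a broken tag like the one described, and a clean one.
bad = '<link rel="canonical" href=" https://example.com//" />'
good = '<link rel="canonical" href="https://example.com/" />'
print(check_canonical(bad))   # flags the whitespace and the doubled slash
print(check_canonical(good))  # no problems
```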
Thank you all so much for your input and assistance.
-
Thank you Michael...I'll do that.
-
I've seen a client have an internal page just suddenly be de-indexed. What appears to have happened is that Google saw it as a near duplicate of another page on their site, and dropped it from the index for that reason. Then, magically, it reappeared a week later.
You may be seeing something like this here. See what Moz Pro thinks in terms of duplicate content on your site, and if the home page gets called out along with another page.
-
Thanks so much for that info. I had not heard of Kerboo...I'll definitely check that out right away. Your input has been extremely helpful Kristina.
Caro
-
I would be incredibly surprised if internal links to the homepage caused the issue. Google expects you to have a bunch of internal links to the homepage.
What you're going to need to do now is do a thorough review of all of the external links pointing to your homepage. I would do this with a tool - I recommend Kerboo, although I'm sure there are others that could do the same thing. Otherwise, you can look through all of the links yourself and look for spam indications (steps outlined in this handy Moz article).
Either way, make sure that you pull your list of links from Ahrefs or Majestic. Ideally both, and merge the lists. Moz doesn't crawl nearly as many links.
Since you haven't gotten a manual penalty warning, you're going to have to take down as many of the spammy links you find as you can and disavow the others. For speed, I'd recommend that you immediately upload a list of spammy links with Google's disavow tool, then start asking for actual removals.
Keep in mind that you're probably going to disavow links that were helping rankings, so expect that your homepage won't come back ranking as well for nonbranded search terms as it used to. You'll probably want to start out uploading a very conservative set of URLs to the disavow tool, wait a couple of days to see if that fixes the problem, upload a bigger set, check, etc.
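If it helps, the disavow file itself is just plain text: one `domain:host` entry or full URL per line, with `#` comments. Here's a rough Python sketch that turns a list of spammy URLs into domain-level entries (the URLs below are invented for illustration):

```python
from urllib.parse import urlparse

def build_disavow(spammy_urls):
    """Build the text of a Google disavow file, one domain: entry per host.

    Domain-level entries are broader than per-URL ones, which suits a
    conservative-then-expanding approach: start with the worst domains,
    upload, wait, and add more if needed.
    """
    domains = sorted({urlparse(u).netloc for u in spammy_urls})
    lines = ["# Links identified as spammy during link audit"]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines) + "\n"

# Invented example URLs for illustration.
urls = [
    "http://spam-directory.example/listing/42",
    "http://spam-directory.example/listing/99",
    "http://link-farm.example/page.html",
]
print(build_disavow(urls))
```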
Good luck!
-
No luck Kristina
I'm wondering if it's an algorithmic penalty in response to backlinks. We've never done shady link building, but over the years the site has gathered some strange links. Or is there some chance that about two dozen anchor-text links from their blog to the home page could have done it? I deleted them. But I can't request reconsideration if the penalty isn't manual.
-
Any luck so far? Usually it only takes a few hours for Google to crawl new pages after you submit them in GSC, in my experience.
-
I see no serious crawl issues. Mostly things we're already addressing, like duplicate content caused by blog tags and categories, missing meta descriptions (mostly in our knowledge base, so not an issue) and stuff like that.
When I checked the home page alone it said zero high, medium or low priority issues.
The page only de-indexed very recently. Maybe the next crawl will catch something. Same with GSC...it looks like the last 2 days of info are not available yet.
I should mention the home page Optimizely test had been running for at least a week before the page got dropped (will get the actual date from the client), plus they have had a product page running a test for weeks with no problem. But I still think your suggestion to pause the test is a good one, as I don't want anything to hinder the process of fixing this.
Update: Optimizely has been paused, code removed, home page submitted in GSC.
-
Okay, I ran some tests, and can't see anything that could've gone wrong. That does make it seem like a penalty, but given that this coincided with setting up Optimizely, let's go down that path first.
While your team is taking down the test - have you checked Moz to see if its crawler sees anything that could be causing an issue? I set up my Moz crawler to look into it, but it'll take a few days.
-
Thanks Kristina,
We have not tried pausing the test, but I can request they do that. It may be a good idea to do it regardless of whether it's causing the problem or not, while we get this issue sorted out.
Fetch as Google gave this result: HTTP/1.1 200 OK - so it looks ok. I understand this also submits your page to Google as an actual indexing request?
site:https://website.com shows all our pages except the home page.
So, it looks like it's decided not to rank it for some reason.
I deleted some links from the blog to the home page - they had a keyword phrase as the anchor text. There were about 20 links that had accumulated over a few months. Not sure if that's the issue.
Still no manual penalty notice from Google.
-
Hm, I've done a lot with Optimizely in the past, and it's never caused an SEO problem, but it's completely possible something went wrong. Since that's your first inkling, have you tried pausing that test and removing the Optimizely code from the homepage? Then you can determine whether or not it's an Optimizely problem.
Another thing you can do is use the Fetch as Google feature in GSC. Does GSC say it can fetch the page properly?
If it says it can, try searching for "site:www.yourcompanysite.com". This will show if Google's got your URL in its index. If nothing comes up, it's not there; if it comes up, Google's decided not to rank it for some reason.
After those steps, get back to us so we can figure out where to go from there!
Good luck,
Kristina
-
Jordan, not on the original version of the home page, but there is on the B test version.
The way I understand it, the B version is a JavaScript page that is noindexed. Their redirect system seems to leave the original page looking like there is no redirect. Are you suggesting we use a 302 instead?
-
Also, Google recommends you 302 those URLs instead of returning a 200 HTTP code. You can read more in their best practices for A/B testing.
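To show what I mean by the status code, here's a minimal hypothetical server-side sketch in Python (WSGI) that 302s bucketed visitors to a variant. Optimizely's redirect is client-side JavaScript, so this isn't how their tool works - it just illustrates the HTTP behavior Google describes:

```python
def ab_test_app(environ, start_response):
    """Minimal WSGI app: 302 bucketed home-page visitors to a test variant.

    A 302 (temporary) tells Google the move isn't permanent, which is what
    you want for an A/B test; a 301 would suggest the variant is the new
    canonical URL. Paths and the bucketing key are invented for illustration.
    """
    if environ.get("PATH_INFO") == "/" and environ.get("AB_BUCKET") == "b":
        start_response("302 Found", [("Location", "/variant-b")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>original home page</html>"]

# Calling the app directly with a fake environ shows the 302 for bucket "b".
captured = {}
def fake_start_response(status, headers):
    captured["status"], captured["headers"] = status, headers

body = ab_test_app({"PATH_INFO": "/", "AB_BUCKET": "b"}, fake_start_response)
print(captured["status"])  # 302 Found
```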
-
Is there a meta 'noindex, nofollow' tag implemented by chance?