[wtf] Mysterious Homepage De-Indexing
-
Our homepage, as well as several similar landing pages, has vanished from the index. Could you guys review the pages below to make sure I'm not missing something really obvious?!
URLs: http://www.grammarly.com and http://www.grammarly.com/plagiarism-checker
- It's been four days, so it's not just a temporary fluctuation
- The pages don't have a "noindex" tag on them and aren't being excluded in our robots.txt (examples of both below)
- There's no notification about a penalty in WMT
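For reference, here's a quick sketch of what those two exclusion signals would look like if they existed (hypothetical examples; neither is present on our pages):

```html
<!-- A "noindex" tag would look like this in the <head> (not present): -->
<meta name="robots" content="noindex">

<!-- And a robots.txt exclusion would look like this (also not present):
User-agent: *
Disallow: /
-->
```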
Clues:
- WMT returns an "HTTP 200 OK" for Fetch, but for Fetch and Render it shows a redirect to grammarly.com/1 (an alternate version of the homepage, which contains a rel=canonical back to the homepage). Could this be causing a circular redirect?
- Some pages on our domain are ranking fine, e.g. https://www.google.com/search?q=grammarly+answers
- A month ago, we redesigned the pages in question. The new versions are pretty script-heavy, as you can see.
- We don't have a sitemap set up yet.
Any ideas? Thanks in advance, friends!
-
Did this get resolved? I'm seeing your homepage indexed and ranking now.
I'm not seeing any kind of redirect to an alternate URL at this point (either as a browser or as Googlebot). If you 301'ed to an alternate URL and then rel=canonical'ed back to the source of the 301, that could definitely cause problems; it sends a pretty strong mixed signal. In that case you'd probably want to 302 or use some alternate method. Redirects for the homepage are best avoided in most cases.
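To illustrate the mixed signal with a hypothetical sketch (the header values here are illustrative, not captured from your site):

```html
<!-- Step 1: the homepage 301s to the alternate URL:
     GET http://www.grammarly.com/   ->   301 Moved Permanently
     Location: http://www.grammarly.com/1
-->
<!-- Step 2: /1 then declares the source of that 301 as canonical: -->
<link rel="canonical" href="http://www.grammarly.com/">
<!-- The 301 says "index /1 instead of /"; the canonical says "index / instead
     of /1". The two signals point at each other, so Google has to guess. -->
```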
-
Are you sure it was missing for a time? Ultimately, I wouldn't use a third party (Google) as a tool to diagnose problems (faulty on-site code) that I already know are problems and need to be fixed. I'd fix the known issues and then go from there, or hire someone capable of fixing them.
-
Thanks, Ryan. I'll get to work on the issues you mentioned.
I do have one question for you: grammarly.com/proofreading (significantly fewer links, identical codebase) is now back in the index. If the issue were too many scripts or HTML errors, wouldn't both pages still be de-indexed?
-
Here are some issues, just going down the first few lines of code...
- There's a height attribute in your tag.
- Your cookie on the home page is set to expire in the past, not the future.
- Your doctype declaration conflicts with your script, among other code issues (http://stackoverflow.com/questions/21363090/doctype-html-ruins-my-script).
- Your Google Site Verification meta tag is different from the one on your other pages.
- Your link to the Optimizely CDN is incorrect: it's missing 'http:', so the browser looks for the script on your own site (see the snippet after this list).
- You have many other markup issues.
And that's before getting into the hundreds of lines of code preceding the start of your page at the <body> tag: 300 lines or so on your other indexed pages, 1,100+ on your home page. So not only are you not following the best practices outlined by Google, but you have broken stuff too.
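For the Optimizely line specifically, here's a hypothetical before/after (the script path is invented for illustration):

```html
<!-- Broken: with no scheme, the browser treats the src as a relative URL and
     requests something like http://www.grammarly.com/cdn.optimizely.com/js/12345.js -->
<script src="cdn.optimizely.com/js/12345.js"></script>

<!-- Fixed: a fully qualified URL actually fetches from the CDN: -->
<script src="http://cdn.optimizely.com/js/12345.js"></script>
```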
-
The saga continues...
According to WMT, there are no issues with grammarly.com. The page is fetched and rendered correctly.
Google! Y u no index? Any ideas?
-
As Lynn mentioned below, if redirection is taking place across several portions of the site, that could cause the spikes, and a big increase in total download time is worrying if it crosses the patience threshold at which most visitors bounce.
Here's the Google PageSpeed Insights take on it: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fgrammarly.com&tab=desktop. It covers both desktop and mobile.
-
Hmm, was something done to fix the Googlebot redirect issue, or did it just fix itself? Googlebot will often identify itself as Mozilla, and your Fetch/Render originally seemed to indicate that at least some of the time, the redirect was the page Google was getting. Technically it's a bit murky what exactly is going on there, but if Google is being redirected some of the time then, as you said, you're in a circular situation between the redirect and the canonical, where it's difficult to predict what will happen. If that is 100% fixed now and Google sees the main page all the time, I would wait a day or two to see if the page comes back into the index (but be 100% sure that you know it's fixed!). I still think that is the most likely source of your troubles...
-
Excellent question, Lynn. Thank you for chiming in here. There's a user-agent-based JavaScript redirect that keeps Chrome visitors on grammarly.com (the Chrome browser extension) and sends other browsers to grammarly.com/1 (the web app, which works in all browsers).
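For clarity, a simplified sketch of that kind of check (an assumption, not our exact code):

```html
<script>
// Chrome visitors stay on the homepage; everyone else goes to the web app.
if (navigator.userAgent.indexOf('Chrome') === -1) {
  window.location.replace('http://www.grammarly.com/1');
}
// Caveat: Googlebot's UA string starts with "Mozilla/5.0" and may or may not
// contain "Chrome" depending on the crawler, so a check like this can redirect
// the crawler some of the time, which would match the inconsistent Fetch results.
</script>
```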
UPDATE: According to WMT Fetch and Render, the Googlebot redirection issue has been fixed. Googlebot is no longer being redirected anywhere and is getting a 200 OK for grammarly.com.
Kelly, if that was causing the problem, how long should I hold my breath for re-indexing after re-submitting the homepage?
-
Yup, definitely. Whether you're completely removed or simply dropped doesn't matter: if you're not there anymore, Google has determined for some reason that you're no longer an authority for that keyword, so you need to find out why. Since you just redesigned, the best way is to backtrack: double-check all the old tags and compare them to the new site, check the text and keyword usage on the website, and look for anything that changed that could contribute to the drop. If you don't find anything, tools like Majestic SEO are handy for checking whether your backlinks are still healthy.
-
Hi Alex, thank you for your response. The pages didn't just suffer in ranking; they were completely removed from the index. Based on that, do you still think it could be a keyword issue?
-
That's actually a great point. I suppose Google could have been holding on to a pre-redesign cached version of the pages.
There has been a 50-100% increase in page download times, as well as some weird 5x spikes in pages crawled. I know there could be a million different reasons, but do any of them stick out at you as potential sources of the problem?
-
How does that second version of the homepage work, and how long has it been around? I get one version of the homepage in one browser and the second in another; what decides which version is served, and what kind of redirect is it? I think that is the most likely source of your troubles.
-
Yes, but the pages were indexed prior to the redesign, no? Can you look up your crawl stats in GWT to see if there's been a dramatic uptick in page download times and a downtrend in pages crawled? That will at least give you a starting point as to the differences between now and then: https://www.google.com/webmasters/tools/crawl-stats
-
The logo definitely needs to be made clickable, linking to the homepage.
Did you compare the old design's text with the new design's to make sure you're still covering the same keywords? In many cases a redesign is more "streamlined," which means less text, or involves a rewrite, which will impact the keywords your site is relevant for.
-
Thanks, Ryan. Improving our code-to-text ratio is on our roadmap, but could that really be the issue here? The pages were all fully indexed without problems for a full month after our redesign, and we haven't added any scripts. Was there an algorithm update on Monday that could explain the sudden de-indexing?
-
VERY script-heavy. Google recently (Q4 2014) released updates on a lot of this: http://googlewebmastercentral.blogspot.mx/2014/10/updating-our-technical-webmaster.html, with further guidance here: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/optimize-encoding-and-transfer. Without doing a deep dive, that's the most glaring issue and the most obvious difference between the pages that are still being indexed and those that are not.
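As a generic illustration (a sketch assuming render-blocking scripts are part of the weight, not your actual markup), getting scripts out of the critical rendering path is usually the first step:

```html
<!-- Scripts loaded with defer or async stay out of the critical rendering path: -->
<script src="/js/app.js" defer></script>      <!-- executes after the document is parsed -->
<script src="/js/vendor.js" async></script>   <!-- executes as soon as it downloads -->
```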