Google dropping pages from SERPs even though they are indexed and cached (shift over to HTTPS suspected)
-
Anybody know why pages that have previously been indexed - and that are still present in Google's cache - are now not appearing in Google SERPs?
All the usual suspects - noindex, robots, duplication filter, 301s - have been ruled out. We shifted our site over from http to https last week and it appears to have started then, although we have also been playing around with our navigation structure a bit too.
Here are a few examples...
Example 1:
- Live URL: https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
- Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
- SERP (1): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place
- SERP (2): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place+site%3Awww.normanrecords.com
Example 2:
- SERP: https://www.google.co.uk/search?q=deaf+center+recount+site%3Awww.normanrecords.com
- Live URL: https://www.normanrecords.com/records/149001-deaf-center-recount-
- Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149001-deaf-center-recount-
These are pages that have been linked to from our homepage (Moz PA of 68) prominently for days, are present and correct in our sitemap (https://www.normanrecords.com/catalogue_sitemap.xml), have unique content, have decent on-page optimisation, etc. etc.
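For anyone wanting to sanity-check the same thing on their own site, here's a minimal sketch (mine, not part of the original setup) that parses a sitemap and confirms a given URL is actually listed. The inline XML is a stand-in for the real catalogue_sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Standard namespace used by sitemap.org <urlset> documents.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_in_sitemap(sitemap_xml):
    """Return the set of <loc> URLs declared in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

# Stand-in for the real catalogue_sitemap.xml content.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place</loc></url>
</urlset>"""

urls = urls_in_sitemap(sample)
print("https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place" in urls)
```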
We moved over to https on 11 Aug. There were some initial wobbles (e.g. 301s from normanrecords.com to www.normanrecords.com got caught up in a nasty loop due to the conflicting 301 from http to https) but these were quickly sorted (i.e. spotted and resolved within minutes). There have been some other changes made to the structure of the site (e.g. a reduction in the navigation options) but nothing I know of that would cause pages to drop like this.
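For what it's worth, the kind of loop we hit is easy to reproduce in the abstract: if the scheme rule and the host rule each "fix" one thing while undoing the other, the redirects chase each other forever. A little sketch (schematic only, not our actual server config) that follows a table of exact-URL 301 rules and flags loops:

```python
def follow_redirects(url, rules, max_hops=10):
    """Follow a mapping of exact-URL 301 rules, returning the hop chain.
    Raises RuntimeError if the chain revisits a URL (a redirect loop)."""
    chain = [url]
    seen = {url}
    while url in rules and len(chain) <= max_hops:
        url = rules[url]
        if url in seen:
            raise RuntimeError("redirect loop: " + " -> ".join(chain + [url]))
        seen.add(url)
        chain.append(url)
    return chain

# Broken setup: the scheme rule sends www traffic to the bare https host,
# while the host rule sends bare traffic back to http on www.
broken = {
    "http://www.normanrecords.com/": "https://normanrecords.com/",
    "https://normanrecords.com/": "http://www.normanrecords.com/",
}

# Fixed setup: every variant goes straight to the https www canonical.
fixed = {
    "http://normanrecords.com/": "https://www.normanrecords.com/",
    "https://normanrecords.com/": "https://www.normanrecords.com/",
    "http://www.normanrecords.com/": "https://www.normanrecords.com/",
}

print(follow_redirects("http://normanrecords.com/", fixed))
```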
For the first example (Memory Drawings) we were ranking on the first page right up until this morning and have been receiving Google traffic for it ever since it was added to the site on 4 Aug.
Any help very much appreciated! At the very end of my tether / understanding here...
Cheers,
Nathon
-
Sorry denverish, I have been really busy lol
Point number 3 refers to "pinging" your website, to let the search engines know your site has been updated. There are a few spammy ones out there, but I would try using:
http://pingomatic.com/
or
https://pingler.com/

Pinging is a process by which you can inform major search engines and RSS directories (Google, Bing, Yahoo, Technorati, etc.) that you have updated content/URLs on your blog or website. Pinging can also help get website changes indexed quickly.
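Under the hood, services like Ping-O-Matic conventionally accept an XML-RPC call named weblogUpdates.ping. Here's a quick sketch that just builds (doesn't send) that request body, so you can see what a ping actually contains; the site name and URL are placeholders, and a real ping would POST this payload to the service's endpoint:

```python
import xmlrpc.client

def build_ping_payload(site_name, site_url):
    """Serialize a weblogUpdates.ping XML-RPC request body."""
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

payload = build_ping_payload("Norman Records", "https://www.normanrecords.com/")
print(payload)
```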
-
Hi David, did you see Nathon's most recent response? Just checking.
-
Hi David,
Thanks for your response!
Yup, our sitemaps are recreated and resubmitted every day, and those pages have both been fetched (and rendered) in GWT. Not sure exactly what you mean by point 3, though?
-
Even though the 301s are all in place, have you notified Google of the changes? That might be why you are seeing them disappear:
1. Sitemap recreation and resubmission
2. Resubmit via Webmaster tools using "fetch as Google"
3. Ping website for new pages
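Step 1 above can be as simple as a script run on a daily cron. A minimal sketch (illustrative only; the URL list here is made up, and a real script would pull URLs from the catalogue database):

```python
import xml.etree.ElementTree as ET

def write_sitemap(urls):
    """Serialize a list of page URLs as a minimal sitemap.org <urlset>."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

xml = write_sitemap([
    "https://www.normanrecords.com/records/149001-deaf-center-recount-",
])
print(xml)
```

After regenerating the file, resubmitting it in Webmaster Tools (or pinging the sitemap URL) tells Google to recrawl it.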