Indexed pages dropped by 20 at once since yesterday
-
Hi community,
I just realized that my indexed pages dropped from 95 to 75, and I don't know why. I made some title tag changes because we are launching our first product (before that it was just a blog). I made those changes a week ago and used Fetch as Google on the homepage and a few subpages.
Thanks for your help.
Kind regards
Marco
-
Excellent news - glad it has all returned
-Andy
-
Everything is back to normal.
Thanks again, Andy.
-
No problem at all
-
Andy, thanks so much for that piece of quality content!
-
I think this might explain your issue...
Gary Illyes @methode
Bad news: we might have a problem with reporting the number of indexed URLs in the SC Sitemaps feature. Good news: we're looking into it
Soooo, it looks like the number of indexed pages in Webmaster Tools (Search Console) is being reported incorrectly.
That would explain what you are seeing
-Andy
-
Thanks a lot Andy,
Maybe it has something to do with Fetch as Google, which I used around a week ago on the main homepage, as I changed some major keywords (title tags) and moved them to other pages (blog categories etc.) when I added our first product/collection.
Never mind - time will tell.
Thanks again!
-
You aren't disturbing me, Marco.
It really isn't uncommon to see discrepancies like this - I see them every day! A drop of this size suggests to me that Google is perhaps doing a little bit of reshuffling.
I would wait just a bit to see if the number of pages starts to increase, as there isn't an awful lot else you can do - it sounds like it has all been done already.
-Andy
-
That is what I did around 7 days ago - before the index drop.
-
Try to identify the pages that have the most internal links, then use Fetch as Google and choose to submit those URLs for re-crawling.
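If you want to find those pages programmatically, here is a rough sketch of a small crawler that counts internal links per page. It assumes a site small enough to crawl in one pass and uses the third-party requests and beautifulsoup4 packages; the start URL is just the site from this thread.

# Rough sketch: crawl a small site and count internal links per page.
# Assumes the site fits in one crawl; requires requests and beautifulsoup4.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.paleotogo.de/"  # the site from this thread
HOST = urlparse(START).netloc

seen = set()
queue = [START]
inbound = Counter()

while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        target = urljoin(url, a["href"]).split("#")[0]
        if urlparse(target).netloc == HOST:
            inbound[target] += 1
            if target not in seen:
                queue.append(target)

# Pages with the most internal links are the strongest re-crawl candidates.
for page, count in inbound.most_common(10):
    print(count, page)

The pages at the top of that list are the ones worth re-fetching first.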
-
Thanks for the tip - I will do that. But I don't understand what this has to do with the issue. At the moment, all search engines show my pages ranking (so they are probably indexed); only Search Console is giving me what seems to be wrong information. I don't think Search Console would show de-indexing ahead of the Google SERPs.
Cheers
-
Did you also verify your site under Bing/Yahoo Webmaster Tools? If not, I suggest you do - you will be surprised how fast and effective Bing/Yahoo indexation is. Wait a couple of days, then perform a site:www.domain.com search and you will see how their index differs from Google's.
-
Hi,
thanks for your help. A site:domain check in Bing and Yahoo shows exactly the same results as Google, so all the pages are currently in the SERPs.
-
Do a similar check on Bing and Yahoo. They are often faster and more responsive than Google - the problem is the market share that Google has.
-
Hi Andy,
sorry to disturb you, but I just did a check with site:paleotogo.de in Google search and it found all the pages.
Search Console tells me my blog has 54 out of 75 pages indexed (this is the sudden drop I mentioned earlier), but when I look into the blog sitemap itself and count the entries, all 75 pages are there.
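If you want to count them yourself, a quick script like this tallies the <loc> entries in the sitemap - I'm assuming the standard /sitemap.xml location here, which may not be where yours lives.

# Quick sketch: count the URLs listed in an XML sitemap.
# The sitemap location below is an assumption, not a confirmed path.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://www.paleotogo.de/sitemap.xml"  # assumed location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP) as resp:
    root = ET.fromstring(resp.read())

urls = root.findall(".//sm:loc", NS)
print(len(urls), "URLs in the sitemap")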
What is happening here? I really don't understand
Cheers, Marco
-
No problem at all. Just post an update here if it hasn't rectified itself soon and we can take another look.
-Andy
-
Thanks for your help, Andy - really appreciate it. Hope everything will turn out well.
Just realized I forgot to mention the site this is about: www.paleotogo.de
-
Ah sorry, I missed that bit, Marco.
When Google drops pages from the index, it can be for a whole host of reasons. However, Google almost never indexes 100% of a site's pages. If you were at 95 indexed pages and are now at 75, that suggests to me that Google has either lost some level of trust in those pages, or you simply have to wait until the pages are re-crawled and Google has decided what to do with them again.
I would be tempted to wait a little, as over time you should see pages being re-indexed. If you have already re-submitted the sitemap, just make sure there is no problem with it - rebuild it and re-submit if you haven't already, just to be sure.
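If you'd rather script that re-submission, one option is Google's sitemap ping endpoint - a minimal sketch, assuming your sitemap lives at the standard /sitemap.xml (that location is a guess on my part):

# Minimal sketch: ping Google after rebuilding a sitemap.
# The sitemap URL is an assumption - point it at your real file.
import urllib.parse
import urllib.request

sitemap = "https://www.paleotogo.de/sitemap.xml"  # assumed location
ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")

with urllib.request.urlopen(ping) as resp:
    print(resp.getcode())  # 200 means the ping was received

Re-submitting through the Sitemaps section of Search Console does the same job if you prefer the UI.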
-Andy
-
Hi Andy,
it's not about a drop for a specific keyword - it's about the number of indexed pages dropping from 100% to around 75%.
Cheers
Marco
-
Hi Moosa,
thanks for your help! I resubmitted the sitemap 2 hours ago and just checked again, but it still hasn't indexed all of the pages (the same 20 are missing).
Strange.
-
Hi Marco,
It all depends on which phrases you were tracking before and after you made the changes. If you were tracking the phrase "Red Sneakers" and changed the title to "Blue Sneakers", you would expect to see a drop for the original phrase. Have you updated your tracking to compensate for the changes?
-Andy
-
Ideally, changing title tags alone should not cause pages to drop out of the index, but since this is something you have noticed, you can always update your sitemap.xml and resubmit it to Google; on the next crawl the pages should be indexed again.
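If you don't have a generator handy, a sitemap is simple enough to build yourself - here is a minimal sketch that writes a valid sitemap.xml (the URLs below are placeholders, swap in your real page list):

# Minimal sketch: write a valid sitemap.xml for a list of URLs.
# The page list is a placeholder - use your real URLs.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)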
Hope this helps!