Index dropped 20 pages at once since yesterday
-
Hi community,
I just realized that my indexed page count dropped from 95 to 75 and I don't know why. I made some title tag changes because we are launching our first product (before that it was just a blog). I made these changes 1 week ago and used Fetch as Google on the homepage and some subdomains.
Thanks for your help.
Kind regards
Marco
-
Excellent news - glad it has all returned
-Andy
-
Everything is back to normal.
Thanks again, Andy.
-
No problem at all
-
Andy, thanks so much for that piece of quality content!
-
I think this might explain your issue...
Gary Illyes @methode
Bad news: we might have a problem with reporting the number of indexed URLs in the SC Sitemaps feature. Good news: we're looking into it
Soooo, it looks like the number of indexed pages in Webmaster Tools (Search Console) is being reported incorrectly.
That would explain what you are seeing
-Andy
-
Thanks a lot Andy,
maybe it has something to do with "Fetch as Google", which I did around 1 week ago on the main homepage, as I changed some major keywords (title tags) and moved them to other pages (blog categories etc.) when I added our first product/collection.
Never mind - time will tell
Thanks again!
-
You aren't disturbing, Marco
It really isn't uncommon to see discrepancies like this. I see them every day! A 25% drop like this suggests to me that Google is perhaps doing a little bit of a reshuffling.
I would wait just a bit to see if the number of pages starts to increase as there isn't an awful lot else you can do - it sounds like it has all been done.
-Andy
-
That is what I did around 7 days ago, so before the drop in indexed pages.
-
Try to identify the pages that have more links, use "Fetch as Google" on them, and choose to re-crawl the URL.
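In case it helps with picking which URLs to re-crawl first, here is a rough sketch of counting internal links per sitemap URL. It is only an illustration and rests on assumptions not mentioned in the thread: Python 3 with the requests and beautifulsoup4 packages installed, and a plain urlset sitemap at the placeholder URL below.

```python
# Rough sketch: count how many internal links point at each URL in a sitemap,
# so the most-linked pages can be prioritized for "Fetch as Google" requests.
# Assumptions (not from the thread): Python 3, `requests` and `beautifulsoup4`
# installed, and a plain urlset sitemap at SITEMAP_URL (no sitemap index).
from collections import Counter
from urllib.parse import urljoin
from xml.etree import ElementTree

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def sitemap_urls(sitemap_url):
    """Return the <loc> entries of a simple urlset sitemap."""
    root = ElementTree.fromstring(requests.get(sitemap_url, timeout=10).content)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]


def internal_link_counts(urls):
    """Fetch every sitemap URL and count links pointing at other sitemap URLs."""
    known = set(urls)
    counts = Counter()
    for page in urls:
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to download
        for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(page, anchor["href"]).split("#")[0]
            if target in known:
                counts[target] += 1
    return counts


if __name__ == "__main__":
    for url, links in internal_link_counts(sitemap_urls(SITEMAP_URL)).most_common(20):
        print(f"{links:4d}  {url}")
```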
-
Thanks for the tip - I will do that. But I don't understand what this has to do with the issue. At the moment all search engines show my pages ranking (so they are probably indexed). It's just Search Console that is giving me some wrong(?) information. I don't think there can be a discrepancy where Google Search Console shows me de-indexing before it shows up in the Google SERPs.
Cheers
-
Did you also verify your site under Bing/Yahoo Webmaster Tools? If not yet, I suggest you do - you will be surprised how fast and effective Bing/Yahoo indexation is. Wait for a couple of days and then perform a site:www.domain.com search; you will see how the index differs from Google's.
-
Hi,
thanks for your help. "site:domain"-check in bing and yahoo show the exact same as google. So all the pages are currently in serps.
-
Do a similar check on Bing and Yahoo. They are much more effective and speedy than Google, but the problem is the market share that Google has.
-
Hi Andy,
sorry for disturbing, but I just did a check with "site:paleotogo.de" in Google search and it found all the pages.
In Search Console it tells me my blog has 54 out of 75 pages indexed (this was the sudden drop I spoke about earlier). But when I look further into the blog sitemap, I can count all 75 pages there.
What is happening here? I really don't understand it.
Cheers, Marco
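For anyone wanting to double-check the sitemap count rather than counting by hand, a minimal sketch along these lines could do it. Assumptions not taken from the thread: Python 3 with requests installed, and the sitemap path below is only a placeholder.

```python
# Minimal sketch: print how many URLs a sitemap actually lists, so the number
# can be compared against the "indexed" figure Search Console reports.
# Assumptions (not from the thread): Python 3, `requests` installed, and the
# sitemap path below is only a placeholder.
from xml.etree import ElementTree

import requests

SITEMAP_URL = "https://www.paleotogo.de/sitemap.xml"  # placeholder path
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ElementTree.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.iter(NS + "loc")]
print(f"{len(urls)} URLs listed in {SITEMAP_URL}")
```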
-
No problem at all. Just update here if it hasn't rectified itself soon and we can take another look
-Andy
-
Thanks for your help, Andy - really appreciate it. Hope everything will turn out well.
Just realized I forgot to mention the page it is about: www.paleotogo.de
-
Ah sorry, I missed that bit, Marco.
When Google drops pages from the index, it can be for a whole host of reasons. However, Google never indexes 100% of pages (or only very, very rarely). If you were at 95% and are now at 75%, this suggests to me that Google has either lost some level of trust in those pages, or you will just have to wait until the pages are re-indexed and Google has decided what to do with them again.
I would be tempted to wait a little bit, as over time you should see pages being re-indexed again. If you have already re-submitted the sitemap, just make sure there is no problem with it - rebuild it and then re-submit if you haven't already, just to be sure.
-Andy
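If it's useful, here is a rough sketch of the kind of spot-check described above, run against each URL in the sitemap before resubmitting. The details are assumptions rather than anything from the thread: Python 3 with requests and beautifulsoup4 installed, and a placeholder sitemap URL.

```python
# Sketch: before resubmitting a rebuilt sitemap, spot-check each listed URL for
# obvious reasons a page might drop out of the index (non-200 status, a
# noindex robots meta tag, or an X-Robots-Tag header).
# Assumptions (not from the thread): Python 3 with `requests` and
# `beautifulsoup4` installed; the sitemap URL below is a placeholder.
from xml.etree import ElementTree

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.paleotogo.de/sitemap.xml"  # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ElementTree.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.iter(NS + "loc"):
    url = loc.text.strip()
    resp = requests.get(url, timeout=10, allow_redirects=False)
    problems = []
    if resp.status_code != 200:
        problems.append(f"status {resp.status_code}")
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        problems.append("X-Robots-Tag noindex")
    robots_meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    if robots_meta and "noindex" in robots_meta.get("content", "").lower():
        problems.append("meta robots noindex")
    print(url, "->", ", ".join(problems) if problems else "looks ok")
```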
-
Hi Andy,
it's not about a drop for a specific keyword; it's about the number of indexed pages going from 100% indexed to around 75% indexed.
Cheers
Marco
-
Hi Moosa,
thanks for your help! I did a resubmit 2 hours ago and just checked again, but it still hasn't indexed all of the pages (20 missing again).
Strange.
-
Hi Marco,
It all depends on which phrases you were tracking before, and which you are tracking now that you have made the changes. If you were tracking the phrase "Red Sneakers" and changed the title to "Blue Sneakers", then you would expect to see a drop for your original phrase. Have you updated your tracking to compensate for the changes?
-Andy
-
Ideally, just changing the title tags should not cause pages to drop out of the index, but if this is something you have noticed, you can always update your sitemap.xml and resubmit it to Google; in the next crawl the pages should be indexed again.
Hope this helps!
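For completeness, if the sitemap has to be rebuilt by hand rather than by a CMS plugin, a minimal sketch of writing a sitemaps.org-style sitemap.xml from a list of URLs might look like this. The URLs, lastmod date and filename are placeholders, not details from the thread (Python 3, standard library only).

```python
# Minimal sketch: write a sitemaps.org-compliant sitemap.xml from a list of
# URLs, ready to be resubmitted through Search Console.
# Assumptions (not from the thread): Python 3; the URLs, lastmod date and
# output filename below are placeholders.
from datetime import date
from xml.etree import ElementTree as ET

PAGE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/first-product/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGE_URLS:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = page
    ET.SubElement(entry, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```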