Webpages & Images Index Counts in Google Search Console Have Dropped Sharply. Why?
-
Hello All,
What is going on with the sitemap index status in Google Search Console?
Webpages: 35,000 submitted, now showing 21,000 indexed, whereas previously approx. 34,500 were indexed.
Images: 85,000 submitted, now showing only 11,000 indexed, whereas previously approx. 80,000 were indexed.
Meanwhile, a site:abcd.com search in Google shows approx. 27,000 indexed webpages, and there has been no penalty or warning message from Google. Please help.
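As a sanity check on the "submitted" figure, you can count the <loc> entries in a sitemap file yourself with a few lines of Python (the inline XML below is just a tiny illustrative sample; in practice you'd read your real sitemap file):

```python
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace for <urlset>/<url>/<loc>.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_urls(sitemap_xml):
    """Count <loc> entries in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(".//sm:loc", NS))

# Tiny inline example; in practice: count_urls(open("sitemap.xml").read())
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://abcd.com/page-1</loc></url>
  <url><loc>https://abcd.com/page-2</loc></url>
</urlset>"""

print(count_urls(sample))  # 2
```

If the count from your actual file doesn't match what Search Console says was submitted, that points at a sitemap generation problem rather than an indexing one.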
-
-
Hi Blue Corona,
I didn't do anything. So you mean there is no issue on Google's end, as per the given Search Engine Land link?
Thanks!
-
Hi there!
I have a few questions for you to get a better idea of the situation. Did you redesign the site? Did you change your hosting company? Did you change who has access to the back-end of the site? Did you add canonical tags to any pages?
Any of these could be behind the changes you're noticing in your indexed pages!
-
Hi All,
I think the problem is on Google's side only, right? As per this post: http://searchengineland.com/google-says-google-index-status-search-console-report-broken-257111
Thanks!
-
Hi Andy,
In Search Console the figures have now updated again: the indexed counts increased from 21,000 to 30,000 and from 11,000 to 19,000.
So do you think it is a Google issue, as per this: https://support.google.com/webmasters/answer/6211453#search_analytics ?
The same thing is happening to me as described here: http://searchengineland.com/bug-google-sitemap-index-counts-drop-across-search-console-reports-249472 but that post is from May 2016.
-
Hi Andy,
No big changes. The only thing is that in June 2016 I moved my site from HTTP to HTTPS, but everything was fine until August. The problem started in September.
Is everyone facing the same issue, or have I done something wrong somewhere?
Thanks!
Wrights
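Since the drop started a couple of months after the HTTP-to-HTTPS move, it's worth confirming that every old http:// URL permanently (301) redirects straight to its exact https:// twin; a 302 or a redirect that lands somewhere else can muddy indexing. A minimal sketch of that check, assuming you've already collected each URL's status code and Location header (e.g. with curl -sI; the URLs below are hypothetical):

```python
def is_clean_https_redirect(url, status, location):
    """True if an http:// URL 301-redirects to its exact https:// counterpart."""
    if status != 301 or location is None:
        return False
    return location == url.replace("http://", "https://", 1)

# Hypothetical (url, status, Location header) triples you might have collected:
checks = [
    ("http://abcd.com/page-1", 301, "https://abcd.com/page-1"),  # clean 301
    ("http://abcd.com/page-2", 302, "https://abcd.com/page-2"),  # temporary redirect
    ("http://abcd.com/page-3", 301, "https://abcd.com/"),        # redirects to home
]

for url, status, location in checks:
    verdict = "OK" if is_clean_https_redirect(url, status, location) else "CHECK"
    print(url, verdict)
```

Anything flagged "CHECK" is a URL where Google may be seeing a temporary redirect or the wrong destination.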
-
Have you made any changes on your website, like adding noindex to certain sections, or removed an internal link to a section so that Google can no longer reach it? I would first investigate what changes have been made on the site.
It might also be worth checking your server logs to see which pages Google is crawling; this can also help you work out which pages aren't being indexed.
The reason you might still see more pages in a site: search is that de-indexed pages can take a few days to disappear from search results.
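That log check can be scripted. A rough sketch (the log lines below are made-up samples in combined log format; the real path, e.g. /var/log/nginx/access.log, and format depend on your server) that tallies which URLs Googlebot has been fetching:

```python
from collections import Counter

def googlebot_hits(log_lines):
    """Tally request paths from lines whose user-agent mentions Googlebot."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        # Combined log format: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
        try:
            path = line.split('"')[1].split()[1]
        except IndexError:
            continue
        hits[path] += 1
    return hits

# Made-up sample lines; in practice: googlebot_hits(open("/var/log/nginx/access.log"))
sample = [
    '66.249.66.1 - - [01/Oct/2016:10:00:00 +0000] "GET /page-1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/Oct/2016:10:01:00 +0000] "GET /page-1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [01/Oct/2016:10:02:00 +0000] "GET /page-2 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

print(googlebot_hits(sample).most_common())
```

Sections of the site that never appear in the tally are the ones to investigate first: if Googlebot isn't fetching them at all, that's a crawl problem rather than an indexing one. (Note that anyone can spoof the Googlebot user-agent, so for a rigorous audit you'd also verify the requesting IPs.)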