Google Indexing
-
Hi
We have roughly 8,500 pages on our website. Google had indexed almost 6,000 of them, but now I suddenly see that the number of indexed pages has dropped to 45.
Are there any possible explanations for why this might be happening, and what can be done about it?
Thanks,
Priyam
-
Hi,
I am also facing a similar issue.
My website is https://infinitelabz.com. When I try to have the site crawled and indexed, it says it is not able to crawl it.
-
Have you checked your robots.txt for disallow rules, and your pages for noindex/nofollow meta tags?
Also, is your hosting reliable? I have had websites go down, which causes crawl errors and results in pages dropping out of the index.
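If you want a quick sanity check on your robots.txt rules, Python's built-in `urllib.robotparser` can tell you whether a given URL is blocked for a given crawler. This is just a sketch with made-up inline rules and example.com URLs; point it at your own domain's robots.txt in practice.

```python
from urllib import robotparser

# Hypothetical rules, supplied inline so the check is self-contained;
# normally you would fetch https://yourdomain.com/robots.txt instead.
rules = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under the wildcard (*) group here.
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/area"))     # False
```

If a URL you expect to rank comes back as not fetchable, that is the first thing to fix before looking anywhere else.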
-
I have already done all of that, and there is nothing unusual. So it's confusing.
-
I have tried that and it fetches them correctly.
-
The only time it's ever hit me hard and fast like that is on Tumblr, with adult content. Once they find out about it, they flip the robots.txt hide switch and you're burnt, lol.
But yeah, as Taryn suggested, go into Google Webmaster Tools and have a look around the property and all the options, starting with the messages/mailbox within Webmasters.
-
Have you checked your traffic and rankings? This could just be an issue with index reporting. If traffic and rankings are stable, then there is no need to worry.
If, however, traffic has declined, there is certainly an issue. Check:
- Have you received a penalty?
- Use the site: command to see which URLs are actually showing in Google's index.
- Use Fetch and Render to see how Google sees your pages.
- Run a crawl of your site using Screaming Frog or a similar tool.
- Are there issues with 404, 500, or no-response pages?
- Has dev deployed anything, like moving to HTTPS without applying 301 redirects?
I think the first step is to determine whether this is an actual issue or not. Then, if it is, run some serious analysis to determine the cause and apply a fix.
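The 404/500/no-response check above can be automated with a few lines of Python's standard library. A minimal sketch, assuming you have a list of your own URLs to test; the bucket names are illustrative:

```python
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code for url (following redirects),
    or None if the server gives no response at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code          # 404, 500, etc. still carry a status code
    except urllib.error.URLError:
        return None            # DNS failure or timeout: no response

def classify(status):
    """Bucket a status code the way the checklist above does."""
    if status is None:
        return "no response"
    if status >= 500:
        return "server error"
    if status >= 400:
        return "client error (e.g. 404)"
    if status >= 300:
        return "redirect"
    return "ok"

# Usage (network call, so commented out here):
# for url in ["https://example.com/", "https://example.com/old-page"]:
#     print(url, classify(fetch_status(url)))
```

Feed it the URLs from your sitemap and look for anything that is not "ok"; a pile of server errors or no-response pages lines up exactly with the hosting-reliability explanation given earlier in the thread.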
Many thanks,
-
Hi,
Have you tried fetching your pages to see if Google can read them?
See: https://support.google.com/webmasters/answer/6066468?hl=en
It is a good place to start.
BTW: I have seen something similar with a home-built CMS that Google for some reason didn't like.