Google Indexing
-
Hi
We have roughly 8,500 pages on our website. Google had indexed almost 6,000 of them, but now I see that the number of indexed pages has suddenly dropped to 45.
Are there any possible explanations for why this might be happening, and what can be done about it?
Thanks,
Priyam
-
Hi,
I am also facing a similar issue.
My website is https://infinitelabz.com. When I try to get the site crawled and indexed, it says it is not able to crawl it.
-
Have you checked your robots.txt for Disallow rules (and your pages for noindex/nofollow meta tags)?
Also, is your hosting reliable? I have had websites go down, which caused crawl errors and, in turn, poor indexing.
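A quick way to sanity-check robots.txt rules is Python's built-in robotparser. This is a minimal sketch; the rules and example.com URLs are illustrative, not your site's actual file:

```python
from urllib import robotparser

# Illustrative robots.txt body -- substitute the real file fetched from
# https://yourdomain.com/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# True: Googlebot may crawl this URL; False: it is blocked
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))     # False
```

If a rule is blocking pages you want indexed, that alone can explain a large drop in indexed counts.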
-
I have already done all of that, actually, and there is nothing unusual. So it's confusing.
-
I have tried that and it fetches them correctly.
-
The only time it's ever really hit me hard and fast like that is on Tumblr, with adult content. Once they find out about it, they flip the robots.txt hide switch and you're burnt, lol.
But yeah, like taryn suggested, go into Google Webmaster Tools and have a look around the property and all the options, starting with the messages/mailbox section.
-
Have you checked your traffic and rankings? This could just be an issue with index reporting. If traffic and rankings are stable, there is no need to worry.
If, however, traffic has declined, there is certainly an issue. Check:
- Have you received a penalty?
- Use the site: command to see which URLs are actually showing in Google's index.
- Use Fetch and Render to see how Google sees your pages.
- Run a crawl of your site using Screaming Frog or a similar tool.
- Are there issues with 404, 500, or no-response pages?
- Has dev deployed anything like a move to HTTPS without applying 301 redirects?
I think the first step is to determine whether this is an actual issue or not. Then, if it is, run some serious analysis to determine the cause and apply a fix.
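For the status-code checks in the list above, a rough stand-in for a crawler run can be sketched in a few lines of Python. The URL list is hypothetical, and a real audit tool like Screaming Frog also checks redirect chains, canonicals, and more:

```python
import urllib.error
import urllib.request

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a URL, or 0 for no response."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code            # 4xx / 5xx responses raise HTTPError
    except (urllib.error.URLError, OSError):
        return 0                   # DNS failure, timeout, connection refused

def triage(status: int) -> str:
    """Bucket a status code the way the checklist above does."""
    if status == 0:
        return "no response"
    if status >= 500:
        return "server error"
    if status >= 400:
        return "client error"
    if status >= 300:
        return "redirect"
    return "ok"

# urls = [...]  # e.g. every URL pulled from your sitemap
# report = {u: triage(fetch_status(u)) for u in urls}
```

Anything landing in the "no response", "server error", or "client error" buckets on pages you expect indexed is worth investigating first.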
Many thanks,
-
Hi,
Have you tried to fetch your pages to see if Google can read them?
See: https://support.google.com/webmasters/answer/6066468?hl=en
It is a good place to start.
BTW: I have run into something similar with a home-built CMS that Google, for some reason, didn't like.
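Beyond whether Google can fetch a page, it's also worth checking the fetched HTML for a stray robots noindex meta tag, which a CMS can emit without you noticing. A minimal sketch; the regex assumes the name attribute appears before content, which covers the common case:

```python
import re

NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """True if the page carries a robots noindex meta tag."""
    return bool(NOINDEX_RE.search(html))

print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True -- blocked
print(has_noindex('<meta name="robots" content="index,follow">'))    # False
```

A single template change that adds noindex site-wide would produce exactly the kind of sudden index collapse described above.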