Google Index Constantly Decreases Week over Week (for over 1 year now)
-
Hi,
I recently started working with two products (one is community-driven content, the other is editorial content), and I've seen a strange pattern in both of them.
The number of indexed pages in Google decreases constantly, week over week, and has been doing so for at least a year. Yes, the decline accelerated when the new mobile version of Google came out, but it was already shrinking before that.
Has it ever happened to you? How did you find out what was wrong? How did you solve it?
What I want to do is take the sitemap and look its URLs up in the index, to first determine which URLs are missing. The problem, though, is that the sitemap is huge (6 million pages). Have you found a solution for dealing with such big index changes?
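Here is roughly what I have in mind, as a minimal sketch - it assumes the 6M URLs sit behind a standard sitemap index whose child sitemaps may be gzipped; the index URL and sample sizes below are placeholders, not our real setup:

```python
# Minimal sketch: sample URLs from a very large sitemap index so indexation can be
# spot-checked on a small, repeatable panel instead of all 6M pages at once.
# Assumptions: a standard sitemap index whose child sitemaps may be gzipped;
# the SITEMAP_INDEX URL below is a placeholder.
import gzip
import io
import random
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
SITEMAP_INDEX = "https://www.example.com/sitemap_index.xml"  # placeholder URL


def fetch(url: str) -> bytes:
    """Download a sitemap file, transparently decompressing .gz files."""
    raw = urllib.request.urlopen(url).read()
    if url.endswith(".gz"):
        raw = gzip.GzipFile(fileobj=io.BytesIO(raw)).read()
    return raw


def child_sitemaps(index_url: str) -> list:
    """Return the <loc> of every child sitemap listed in a sitemap index."""
    root = ET.fromstring(fetch(index_url))
    return [loc.text.strip() for loc in root.findall(".//sm:sitemap/sm:loc", NS)]


def urls_in_sitemap(sitemap_url: str) -> list:
    """Return the page URLs listed in a single child sitemap."""
    root = ET.fromstring(fetch(sitemap_url))
    return [loc.text.strip() for loc in root.findall(".//sm:url/sm:loc", NS)]


if __name__ == "__main__":
    random.seed(42)  # fixed seed so the same panel can be re-checked next week
    sitemaps = child_sitemaps(SITEMAP_INDEX)
    panel = []
    for sm in random.sample(sitemaps, k=min(5, len(sitemaps))):
        urls = urls_in_sitemap(sm)
        panel.extend(random.sample(urls, k=min(20, len(urls))))
    for url in panel:
        print(url)  # check each one by hand (site: query or URL Inspection in Search Console)
```

The idea would be to track the same small panel week over week instead of trying to diff six million URLs in one go.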
Cheers,
Andrei
-
Thanks EGOL. Makes sense.
-
My response to this is to build small, compact websites where multiple related items are sold on the same page.
If products are grouped wisely by keywords, you can still have a broad keyword reach from a smaller site. Those longer pages have more words, more images, more buy buttons, and are better respected by Google. Thus they rank better than short, skimpy pages.
Big websites that struggle to hold pages in the index do not compete well.
A smaller site does not spread your power out amongst a large number of pages. The smaller the site, the more competitive it will be against larger websites of similar total link strength and authority.
-
Thanks EGOL for explaining. It makes sense. So Google discovers the URLs via the sitemap and internal link structure, but then forgets about them because nobody reminds Google of them by linking to them.
I do understand that this is a common problem, so what set of tactics has worked so far for you in solving it?
-
If you have a website with six million pages, you will need a lot of inbound links to get all of those pages into the index and keep them there. When Google discovers your pages it will index them, but if spiders do not revisit, Google will forget them.
For a website to hold six million pages in the index, it would need hundreds of powerful inbound links to thousands of average inbound links from other websites to deliver enough spider action. Those links must be permanent to maintain a flow of spiders into the website. If they are only temporary, the flow of spiders will be cut off and Google will forget about those pages.
Also needed would be a good linkage structure that channels the spiders deep into those millions of pages so that they will be forced to chew their way out through other pages.
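As a rough, back-of-the-envelope illustration of why that deep linkage matters at this scale (the links-per-page figures below are assumptions, not measurements):

```python
# Back-of-the-envelope: how many clicks deep must a crawl go before an internal
# link structure can even reach six million pages? Assumes every page carries
# roughly the same number of unique internal links (an idealized best case).
import math

def min_click_depth(total_pages: int, links_per_page: int) -> int:
    """Smallest depth d such that links_per_page ** d >= total_pages."""
    return math.ceil(math.log(total_pages) / math.log(links_per_page))

for links_per_page in (20, 50, 100, 200):
    depth = min_click_depth(6_000_000, links_per_page)
    print(f"{links_per_page:>3} unique links per page -> at least {depth} clicks deep to cover 6M pages")
```

Real link structures are never that efficient, which is why the permanent inbound links described above are needed to keep spiders flowing into the deep pages.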
Weak websites with too many pages are a common problem.
Related Questions
-
Indexed Pages Different when I perform a "site:Google.com" site search - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameter URLs blocked from spiders through the meta robots tag). There are 9,000 "true" URLs submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:website" search in Google, it says Google is indexing 2.2 million pages on the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicate results" message. What is happening? Why does Google say it is indexing 2.2 million URLs, but then won't show me more than 140 results? Thank you so much for your help; I tried looking for the answer and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar
-
How to do country-specific indexing?
We are a business that operates in South East Asian countries, with medical professionals listed in Thailand, the Philippines and Indonesia. When I go to Google Philippines and check, I can see pages from all countries indexed but no Philippines pages. The Philippines is where we launched recently. How can I tell Google Philippines to give more priority to pages from the Philippines rather than from other countries? Can someone help?
Intermediate & Advanced SEO | ozil
-
HTTPS & HTTP URLs in Google's Index
Hi everyone, this question is a two-parter: I am now working for a large website - over 500k monthly organic visits. The site currently has both http and https URLs in Google's index. The website has not formally converted to https; the https version began with an error and has evolved unchecked over time. Both versions of the site (http & https) are registered in Webmaster Tools, so I can clearly track and see that as time passes http indexation is decreasing and https is increasing. The ratio is at about 3:1 in favor of https at this time. Traffic over the last year has slowly dipped; however, over the last two months there has been a steady decline in overall visits registered through analytics. No single page appears to be the culprit; this decline is occurring across most pages of the website, pages which traditionally draw heavy traffic - including the home page. Considering that Google is giving priority to https pages, could it be possible that the split is having a negative impact on traffic as rankings sway? Additionally, mobile activity for the site has steadily increased both from a traffic and a conversion standpoint. However, that traffic has also dipped significantly over the last two months. Looking at Google's mobile usability errors page I see a significant number of errors (over 1k). I know Google has been testing and changing mobile ranking factors; is it safe to posit that this could be having an impact on mobile traffic? The traffic declines are 9-10% MoM. Thank you. ~Geo
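A minimal sketch of one way to spot-check how the http URLs currently answer - the URL list and the use of the `requests` library are assumptions, not details from the question:

```python
# Minimal sketch: check whether http:// URLs answer with a permanent redirect to
# their https:// twins. No redirect (or only a temporary one) leaves both
# versions eligible for indexing. The URLs below are placeholders.
import requests

pages = [
    "http://www.example.com/",
    "http://www.example.com/category/widgets/",
]

for url in pages:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code in (301, 308) and location.startswith("https://"):
        verdict = "permanent redirect to https"
    elif resp.status_code in (302, 303, 307):
        verdict = f"temporary redirect ({resp.status_code}) - consider a 301"
    else:
        verdict = f"no redirect ({resp.status_code}) - http version still live"
    print(f"{url} -> {verdict}")
```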
Intermediate & Advanced SEO | Geosem
-
App content Google indexation?
I read some months back that Google was indexing app content to display it in its SERPs. Does anyone have any update on this recently? I'd be very interested to know more about it 🙂
Intermediate & Advanced SEO | JoomGeek
-
Showing strong for the last 2 years for search terms, NOW GONE! What happened?
Hi All! I have a 9-1-1: my website www.popscrap.com has been showing strong (top 2) for about 2 years now for many of the search terms we are targeting (scrap software; scrap metal software; recycling software; etc.), and I just noticed today that we are nowhere. What do you suggest for troubleshooting this to find the cause and fix it? Thanks!
Intermediate & Advanced SEO | BBuck
-
Remove content that is indexed?
Hi guys, I want to delete an entire folder of content that is already indexed. How can I explain to Google that the content no longer exists?
Intermediate & Advanced SEO | Valarlf
-
Google Local oddity
So I spotted something a little weird... one of my client's Google Local placements in blended results has the domain name - complete with the .com extension - appearing where the business name typically appears: Businessxyz.com www.businessxyz.com of Google reviews. Has anyone seen this? I set up their Google Places account quite some time ago and used the business name - not the URL. I also set up their Google+ and Local page using the name. None of the page titles on the website contain the URL. I simply cannot pinpoint where G is pulling this from, or why for that matter. All competitors are appearing with their business name - only my client has the domain name visible for this particular local search query. Any ideas?
Intermediate & Advanced SEO | SCW
-
Do you bother cleaning duplicate content from Google's index?
Hi, I'm in the process of instructing developers to stop producing duplicate content; however, a lot of duplicate content is already in Google's index and I'm wondering if I should bother getting it removed... I'd appreciate it if you could let me know what you'd do... For example, one 'type' of page is being crawled thousands of times, but it only has 7 instances in the index, which don't rank for anything. For this example I'm thinking of just stopping Google from accessing that page 'type'. Do you think this is right? Do you normally meta noindex,follow the page, wait for the pages to be removed from Google's index, and then stop the duplicate content from being crawled? Or do you just stop the pages from being crawled and let Google sort out its own index in its own time? Thanks FashionLux
Intermediate & Advanced SEO | FashionLux