Google Index Constantly Decreases Week over Week (for over 1 year now)
-
Hi,
I recently started working with two products (one is community-driven content, the other editorial content), and I've noticed a strange pattern in both of them.
The Google index count constantly decreases week over week, and has for at least a year. Yes, the decline accelerated when the new mobile version of Google came out, but it was already declining before that.
Has it ever happened to you? How did you find out what was wrong? How did you solve it?
What I want to do is take the sitemap and look for its URLs in the index, to first determine which pages are missing. The problem, though, is that the sitemap is huge (6 million pages). Have you found a way to deal with index changes at this scale?
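For spot-checking at this scale, one option is to sample URLs from the sitemap index instead of loading all six million, then check the sampled URLs by hand or via Search Console's URL Inspection tool. A minimal sketch in Python, assuming a standard sitemap index; the sitemap URL and sample sizes below are placeholders, not real values:

import gzip
import random
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_xml(url):
    # Download one sitemap file, handling optional .gz compression.
    data = urllib.request.urlopen(url).read()
    if url.endswith(".gz"):
        data = gzip.decompress(data)
    return ET.fromstring(data)

def child_sitemaps(index_url):
    # A sitemap index lists <sitemap><loc> entries pointing at child sitemaps.
    return [e.text for e in fetch_xml(index_url).findall("sm:sitemap/sm:loc", NS)]

def urls_in(sitemap_url):
    # A regular sitemap lists <url><loc> entries.
    return [e.text for e in fetch_xml(sitemap_url).findall("sm:url/sm:loc", NS)]

# Placeholder index URL; sample 20 child sitemaps, then 50 URLs from each.
children = child_sitemaps("https://example.com/sitemap_index.xml")
sample = []
for sm in random.sample(children, min(20, len(children))):
    urls = urls_in(sm)
    sample.extend(random.sample(urls, min(50, len(urls))))
print("\n".join(sample))  # spot-check these against the index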
Cheers,
Andrei
-
Thanks EGOL. Makes sense.
-
My response to this is to build small, compact websites where multiple related items are sold on the same page.
If products are grouped wisely by keywords, you can still have broad keyword reach from a smaller site. Those longer pages have more words, more images, and more buy buttons, and Google respects them more, so they rank better than short, skimpy pages.
Big websites that struggle to hold pages in the index do not compete well.
A smaller site does not spread your power out amongst a large number of pages. The smaller the site, the more competitive it will be against larger websites of similar total link strength and authority.
-
Thanks EGOL for explaining. It makes sense. So Google discovers the URLs via the sitemap / internal link structure, but then forgets about them because nobody reminds Google of them by linking to them.
I do understand that this is a common problem, so what set of tactics has worked so far for you in solving it?
-
If you have a website with six million pages, you will need a lot of inbound links to get all of those pages into the index and hold them there. When Google discovers your pages it will index them, but if spiders do not revisit, Google will forget them.
For a website to hold six million pages in the index, it would need hundreds of powerful inbound links, or thousands of average inbound links, from other websites to deliver enough spider activity. These links must be permanent to maintain a flow of spiders into the website; if they are only temporary, the flow of spiders will be cut off and Google will forget about those pages.
Also needed would be a good linkage structure that channels the spiders deep into those millions of pages, so that they are forced to chew their way out through other pages.
Weak websites with too many pages are a common problem.
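One way to act on the linkage-structure point above (a minimal sketch, not something from this thread): compute each page's click depth from the homepage over the internal link graph, since pages buried many clicks deep, or orphaned entirely, tend to get little spider action. The link graph here is a toy example; in practice you would build it from a crawl of your own site:

from collections import deque

def click_depths(links, home):
    # links: {page: [pages it links to]}. Breadth-first search from the
    # homepage gives each page's shortest click depth.
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Toy graph: /old-item-9999 is only reachable through a pagination chain.
links = {
    "/": ["/category-a", "/category-b"],
    "/category-a": ["/item-1", "/page-2"],
    "/page-2": ["/page-3"],
    "/page-3": ["/old-item-9999"],
}
for page, d in sorted(click_depths(links, "/").items(), key=lambda x: x[1]):
    print(d, page)
# Any sitemap URL that never shows up here at all is an orphan page.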
Related Questions
-
Why does Google index my subcategories much faster than my head category names?
I have categories (whirlpools, saunen, dampfduschn, etc.) and subcategories (Ts-Serie Whirlpool Modelle, T15 Serie Whirlpool Modelle). I have changed the titles of the head categories and also the subcategories. Google changed the titles of my subcategories very quickly, but now, four days later, the head navigation still hasn't changed. What does that mean? Does Google index my head navigation badly? Regards
Intermediate & Advanced SEO | HolgerL
-
Google Indexing Request - Typical Time to Complete?
In Google Search Console, when you request the (re)indexing of a fetched page, what's the average amount of time it takes to re-index? Does it vary much from site to site, or are manual re-index requests put in a queue and served on a first-come, first-served basis regardless of site characteristics like domain/page authority?
Intermediate & Advanced SEO | SEO18050
-
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these docs public to get them indexed and provide a link back to our state pages? In theory it sounds like a great link-building strategy... TIA!
Intermediate & Advanced SEO | LindsayE1
-
Google not taking Meta...
Hello all, So I understand that Google may sometimes take content from the page as a snippet to display on SERPs rather than the meta description, but my problem goes a little beyond that. I have a section on my site which updates every day, so a lot of the content is dynamic (products for a shop; every morning unique stock is added or removed). Despite having a meta description and title, and receiving an 'A' grade in the Moz on-page grader, these pages never show up in Google. After a little research I did a 'site:www.mysite.com/productpage' search in Google, and this indeed listed all my products, but interestingly, for every single one Google had taken the copyright notice at the bottom of the page as the snippet instead of the meta description or any H1, H2 or P text on the page... Does anyone have any idea why Google is doing this? It would explain a lot to me in terms of overall traffic; I'm just out of ideas... Thanks!
Intermediate & Advanced SEO | HB170
-
Same URLs, different CMS and server setup. Page Authority now 1
We have moved a client's website over to a new CMS and onto a new server. The domain and URLs on the main pages of the website are exactly the same, so we did not do any 301 redirects. The overall Domain Authority of the site and the Page Authority of the homepage, while having dropped a bit, seem OK. However, all the other pages now have a PageRank of 1. I'm not exactly sure what the IT guys have done, but there was some rerouting applied at the server level. The move happened around the end of December 2014, and yes, traffic has dropped significantly. Any ideas?
Intermediate & Advanced SEO | daracreative0
-
Indexing isolated webpages
Hi all,
We are running a classifieds website. Due to technical limitations, we will probably not be able to list or search expired ads, but you can still view the ad details page if you land on an expired ad from an external page (or Google search results). Our concern is: if the ad page still exists but is totally isolated from the website (i.e. not found by search on the website and with no site links pointing to it), will Google remove it from the index? Thanks, T
Intermediate & Advanced SEO | Tarek_Lel
-
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/
How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those directory listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML. This page is indexed in Google and we don't want it to be. But so is the actual page where we intended that content to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff. As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that directory listing page and, provided that we have http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff in our sitemap, solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
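One way to sanity-check rules like these before deploying (a sketch, not Google's official behavior): Python's standard urllib.robotparser evaluates Disallow rules roughly the way crawlers do, so you can assert that the listing pages are blocked while the intended page stays fetchable. Note that robots.txt blocks crawling, not indexing, so already-indexed URLs may linger in the index for a while:

import urllib.robotparser

# The single /StoreFront/jsp/ rule already covers the /html/ and /pdf/
# subdirectories, so one Disallow line is enough.
rules = """
User-agent: *
Disallow: /StoreFront/jsp/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# The directory listings should be blocked...
assert not rp.can_fetch("Googlebot", "http://www.ccisolutions.com/StoreFront/jsp/")
assert not rp.can_fetch("Googlebot", "http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML")
# ...while the intended page stays crawlable.
assert rp.can_fetch("Googlebot", "http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff")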
Intermediate & Advanced SEO | danatanseo0
-
One of the sites I work on keeps having its home page de-indexed by Google every few months. I then apply for a review and they put it back up, but I have no idea why this keeps happening, and it's only the home page.
One of the sites I work on (www.eva-alexander.com) keeps having its home page de-indexed by Google every few months. I then apply for a review and they put it back up. I have no idea why this keeps happening, and it's only the home page; I have never experienced this before.
Intermediate & Advanced SEO | GMD10