More Indexed Pages than URLs on site.
-
According to Webmaster Tools, the number of pages Google has indexed on my site tripled yesterday (from 150K to 450K). Usually I'd be jumping for joy, but now I have more indexed pages than actual pages on my site.
I have checked for duplicate URLs pointing to the same product page but can't find any; pagination on category pages doesn't seem to be indexed, nor do parameterised URLs from advanced filtering.
Using the site: operator, we get a different result on google.com (450K) than on google.co.uk (150K).
Anyone got any ideas?
-
Hi David,
It's tough to say without more digging and information; it certainly looks like you have most of the common problem areas covered from what I can see. I'll throw out one idea: I see you have a few 301 redirects in place switching from .html to non-.html versions of URLs. If this was done on a massive scale, Google's index may temporarily contain both versions of those pages. If so, it might not really be a big issue: over the next weeks or months the old .html versions should fall out of the index and your numbers will begin to look normal again. Just a thought.
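For anyone wondering how such a migration is usually wired up: a site-wide .html-to-extensionless 301 often comes down to a single mod_rewrite rule. A minimal sketch, assuming Apache and an .htaccess file (not necessarily David's actual configuration):

```apache
RewriteEngine On

# Permanently redirect any /page.html request to /page, so only the
# extensionless version remains in the index over time.
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
```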
-
Thanks Lynn. The 31,000 was a bit of a legacy issue and something we have solved; the robots.txt file was changed a couple of weeks ago, so fingers crossed Google will deindex them soon. We get the same result when using the inurl: search.
Any idea where the rest have come from?
-
Hi Irving
We checked everything obvious and cannot explain what is going on. I cannot see any major duplicate content issues, we do not have any active subdomains, and the Moz crawler doesn't highlight any major duplication either.
-
Hi David,
Not sure why they started showing up now (some recent changes to the site?), but I suspect your problem is URLs that you are trying to block with robots.txt finding their way into the index anyway.
If you do a search for site:nicontrols.com inurl:/manufacturer/ and then click "show omitted results", you will see a whole bunch (31,000!) of 'content blocked by robots.txt' notices, yet the URLs are still in the index. A couple more similar searches on other likely URL paths will probably turn up more.
If you can get a noindex meta tag onto these pages, I think it will be more effective at keeping them out of the index; note that Google can only see the tag once robots.txt stops blocking it from crawling those URLs. If some recent changes to the site might have introduced internal links to these pages, it would also be worth seeing whether you can remove those links or replace them with the 'proper' link format.
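If editing the page templates to add the meta tag is awkward, the same noindex signal can be sent as an HTTP response header instead. A rough .htaccess sketch for the /manufacturer/ path from the search above, assuming Apache with mod_setenvif and mod_headers enabled:

```apache
# Flag requests under /manufacturer/ and send a noindex header for them.
# Googlebot must be able to crawl these URLs to see the header, just as
# with the meta tag.
SetEnvIf Request_URI "^/manufacturer/" BLOCK_INDEXING
Header set X-Robots-Tag "noindex, follow" env=BLOCK_INDEXING
```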
Hope that helps!
-
Can you browse the indexed pages in the search results and look for duplicates or technical issues causing improper indexing? Do you have other properties, such as subdomains, that Google might be counting as pages?
Related Questions
-
Site Not Indexing After 2 Weeks - PA at 1
Hi Moz Community! I'm working as a digital marketing consultant for an organization that uses us for their online registration; we do not manage their web page. The issue I'm hoping you might have some ideas on is that their SERPs still aren't making much of a recovery since they revamped their site in mid-August. I ran a Moz campaign for them, and despite the fact that they (eventually) got all their 301s in place, submitted an updated sitemap to Google, aren't hitting any crawl errors, and have a working robots.txt, over two-thirds of their site pages don't seem to be indexing. Moz is giving most of them a Page Authority of 1, and when I log in to their GWT, it shows that only 3 pages have been indexed of the 315 URLs submitted. I know Google doesn't make any guarantees on index update timelines, but 2+ weeks seems like a long time 😞 Their website is https://www.northshoreymca.org/. The site has a DA of 43, but most of the pages on the main nav are still at 1. They gave me permission to share in this forum because we're really trying to figure out a recovery strategy. Any thoughts or ideas as to what might be causing this? Is there anything else you think I should check, or that might be causing an issue? Is it possible that Google is just taking this long to index their pages? Note: this site is built with Drupal. THANK YOU!
Intermediate & Advanced SEO | camarin_w -
Client wants to remove mobile URLs from their sitemap to avoid indexing issues. However this will require SEVERAL billing hours. Is having both mobile/desktop URLs in a sitemap really that detrimental to search indexing?
We had an enterprise client ask to remove mobile URLs from their sitemaps. For their website, both desktop and mobile URLs are combined into one sitemap. The site has a mobile template (not a responsive website) and is configured properly via Google's "separate URLs" guidelines. Our client is referencing a statement from John Mueller that having both mobile and desktop URLs in one sitemap can be problematic for indexing. Here is the article: https://www.seroundtable.com/google-mobile-sitemaps-20137.html
We would be happy to remove the mobile URLs from their sitemap. However, this will unfortunately take several billing hours for our development team to implement and QA, which will end up costing our client a great deal of money. Is it worth removing the mobile URLs from their sitemap to adhere to John Mueller's advice? We don't believe these extra mobile URLs are harming their search indexing, but we can't find any sources to explain otherwise. Any advice would be appreciated. Thx.
Intermediate & Advanced SEO | RosemaryB -
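Regarding the "separate URLs" setup in the question above: under Google's guidelines, the desktop/mobile relationship is declared with rel="alternate" annotations, which can live either in the desktop page's HTML head or in the sitemap itself. A sketch of the sitemap form, with example.com standing in for the client's domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/page</loc>
    <!-- Tells Google where the mobile equivalent of this desktop URL lives -->
    <xhtml:link rel="alternate"
        media="only screen and (max-width: 640px)"
        href="http://m.example.com/page"/>
  </url>
</urlset>
```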
How to optimize internal linking by increasing the link count of chosen landing pages and decreasing it for less important pages within the site?
We pulled our internal link counts (internal links only, not outbound links) from Google Webmaster Tools and discovered that our most significant pages have fewer internal links than our less significant pages. Our objective is to reverse this behavior: increase the internal link count of important pages and reduce it for less important pages, so that maximum link juice is passed to the right pages, thereby increasing SEO traffic.
Intermediate & Advanced SEO | vivekrathore -
302 redirected URLs - login, account pages
We have a 302 redirect on some of our pages involving login/account pages. So, some pages are 302 (temporarily) redirected to the login pages, which is common, especially on e-commerce sites (see screenshot). For SEO purposes, what would be the best way to address this (if it is an issue)? a. Block the login/account pages using robots.txt. b. Block the login/account pages using a meta noindex. c. Leave them as is, since it's a non-issue. d. Other recommendations (please specify in the answers). Thanks!
Intermediate & Advanced SEO | jayoliverwright -
How to 301 Redirect /page.php to /page, after a RewriteRule has already made /page.php accessible by /page (Getting errors)
A site has its URLs with .php extensions, like this: example.com/page.php. I used the following rewrite to remove the extension so that the page can now be accessed from example.com/page:

RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]

It works great; I can access the page via the example.com/page URL. However, the page can still be accessed from example.com/page.php. Because I have external links going to that page, I want to 301 redirect example.com/page.php to example.com/page. I've tried this a couple of ways, but I get redirect loops or 500 internal server errors. Is there a way to have both: remove the extension and 301 the .php version to the extensionless one? By the way, if it matters, page.php is an actual file in the root directory (not created through another rewrite or URI routing). I'm hoping I can do this and not just put an example.com/page canonical tag on the page. Thanks!
Intermediate & Advanced SEO | rcseo -
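On the loop in the question above: a plain `RewriteRule \.php$ ... [R=301]` also matches the URI after the internal rule appends .php, so the two rules chase each other. The usual way out is to match the external redirect against THE_REQUEST, the raw request line the client sent, which internal rewrites never change. A sketch, assuming an Apache .htaccess:

```apache
RewriteEngine On

# 1) Externally 301 direct requests for /page.php to /page.
#    THE_REQUEST holds the raw client request line ("GET /page.php HTTP/1.1"),
#    so this condition never fires after the internal rewrite below appends
#    .php; that is what prevents the redirect loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^\ ?]+)\.php[\ ?]
RewriteRule ^ /%1 [R=301,L]

# 2) Internally serve /page from page.php when that file exists
#    (the original rule from the question).
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
```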
Same day indexing... some tips for a non blog site
Hi, there's a question today about paying for the Yahoo directory, and I wondered if those kinds of links would help with getting indexed. We're adding about 10-20 pages of new, unique content each day, but it takes Google about 4 days to list it, and then it only lists a few of those 10-20 pages; our other, older sites and blogs are indexed within the hour. What tips can you give me for getting at least a same-day index, and a thorough one at that, so that if we publish 20 pages on Thursday they're all indexed by the end of play Friday? Would the Yahoo directory help?
Intermediate & Advanced SEO | xoffie -
Rich snippet on main index page
Hello, I am using a 3rd-party company to generate reviews for my website, and I want to optimize the site so that the index page shows a star rating in the SERPs. I am pulling the count of reviews and the average rating from my review partner and rendering them on the page, though they are not visible to a visitor to the site. The page has been marked up correctly, as you can see using the rich snippet testing tool: http://www.google.com/webmasters/tools/richsnippets?url=http%3A%2F%2Fwww.jsshirts.com.au. However, the stars are not showing in SERPs. Does anyone have any ideas why? Many thanks, Jason
Intermediate & Advanced SEO | mullsey -
Why are so many pages indexed?
We recently launched a new website, and it doesn't consist of that many pages. When you do a "site:" search on Google, it shows 1,950 results. Obviously we don't want this to be happening, and I have a feeling it's affecting our rankings. Is this just a straight-up robots.txt problem? We addressed that a while ago and the number of results isn't going down, so it's very possible we still have it implemented incorrectly. What are we doing wrong, and how do we start getting pages "un-indexed"?
Intermediate & Advanced SEO | MichaelWeisbaum