Huge number of indexed pages with no content
-
Hi,
We have accidentally had Google index lots of our pages with no useful content on them at all.
The site in question is a directory site with tags and cities. Some cities have suppliers for almost all the tags, but there are many cities where we have suppliers for only a handful of tags.
The problem occurred when we created a page for each city, listing the tags as links.
Unfortunately, our programmer listed all the tags, not just the ones where we have businesses offering their services, but every single one!
We have 3,142 cities and 542 tags, which works out to over 1.7 million potential pages. I guess you can imagine the problem this caused!
Now I know that Google might simply ignore these empty pages and not crawl them again, but when I check a city with only 40 providers (a "city site:domain" search), I still have 1,050 pages indexed. (Yes, we have some issues between the ~550 and the 1,050 as well, but first things first. :))
These pages might not be crawled again, but they will be clicked, they will cause bounces, and the whole user experience will be terrible.
My idea is that I might use meta noindex on all of these empty pages, and perhaps also 301 redirect all the empty category pages directly to the main page of the given city.
Can this work the way I imagine? Is there a better solution to cut this nightmare short?
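To make the redirect half of the idea concrete, here is a minimal sketch of what I mean, assuming a Flask-style app and hypothetical names (our real stack differs):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical lookup table: suppliers per (city, tag) pair.
SUPPLIERS = {("budapest", "plumber"): 12}

@app.route("/<city>/<tag>")
def city_tag_page(city, tag):
    count = SUPPLIERS.get((city, tag), 0)
    if count == 0:
        # Empty category: send users and crawlers straight to the city page.
        return redirect(f"/{city}", code=301)
    return f"{count} suppliers for '{tag}' in {city}"

@app.route("/<city>")
def city_page(city):
    return f"Main page for {city}"
```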
Thank you in advance.
Andras
-
Thank you again, John. I will fix this based on our discussion.
-
I think NoIndex is slightly superfluous, as the 301 will take care of it, point people to a proper result, and give Google a redirected result. (A URL that returns a 301 serves no page content, so Google would never see a meta tag on it anyway.)
However, SEOMoz's Robots information page suggests: "In most cases, meta robots with parameters 'noindex, follow' should be employed as a way to restrict crawling or indexation." So maybe consider that for any empty pages you don't redirect...
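For reference, the tag itself is just one line of HTML; a hypothetical template helper that emits it only on empty pages might look like this (a sketch, not your actual templates):

```python
def robots_meta(supplier_count: int) -> str:
    """Return a meta robots tag for a city+tag page, or nothing."""
    if supplier_count == 0:
        # No suppliers: keep the page out of the index, but let crawlers
        # follow its links so internal linking still works.
        return '<meta name="robots" content="noindex, follow">'
    # Pages with suppliers need no directive; index,follow is the default.
    return ""
```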
As for robots.txt, you can check out SEOMoz's Robots information page, which covers wildcards. You could use those; I THINK something like http://domain.com/*/tags would work.
Not quite sure on that last bit though...
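If the empty pages did share a distinguishing path segment, the rule could be generated with a few lines like these (purely hypothetical; the /tags segment is an assumption, and it only helps if the URL structure actually separates empty pages from valid ones):

```python
# Sketch: write a robots.txt with a wildcard Disallow rule.
# Google supports '*' wildcards in robots.txt patterns.
ROBOTS_TXT = """User-agent: *
Disallow: /*/tags
"""

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(ROBOTS_TXT)
```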
-
Thank you for your reply, Josh.
I will use the 301 then, but should I also use the noindex tag to get these pages removed from the index?
Does it reinforce my intention, or does it add nothing to the process? Perhaps the two should not be used together at all, as they are basically meant for different tasks.
(Unfortunately, robots.txt is not really a solution because of our URL structure:
since all the cities have at least a couple of valid tags, there is no path pattern I can exclude from crawling without also blocking the valid pages, and I would rather not add 2,000+ cities individually.
As for GWT, URL removal is also not an option at this scale: I have at minimum 100,000+ no-value pages to remove, and the limit is 500 per month.)
-
I would agree. Just set up a 301 redirect so that users don't bounce and actually get directed to something remotely useful, even if that's just a listing of all the tags around the site, or a home page, or something (and even if you do the things below, this ensures users who stumble on these pages are still happy).
You could also use a robots.txt file to block crawling of the ones you don't want indexed, and finally you could use Google's Webmaster Tools to manually remove particular pages!
A combo of all of those will work a treat!