De-indexing pagination
-
I have a custom-made blog with boatloads of undesirable URLs in Google's index, like these:
.com/resources?start=150
.com/resources?start=160
.com/resources?start=170
I've identified these URLs as a source of duplicate title tags, so I had my programmer set a noindex tag to be added automatically to all of them, like this:
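The snippet the poster refers to is missing from the archived post; it was presumably the standard robots meta directive, something like this (a sketch, not the poster's exact markup):

```html
<!-- Placed in the <head> of each ?start= page. "noindex" asks search
     engines to drop the page from their index. -->
<meta name="robots" content="noindex">
```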
However, doing a site: search in Google shows the URLs are still indexed, even though the tag has been in place for a few weeks.
How do I get Google to remove these URLs from the index? I'm aware that Search Console has a relevant help topic (https://support.google.com/webmasters/topic/4598466?authuser=1&authuser=1&rd=1), but it says that blocking with meta tags should work.
Do I just get Google to crawl the URLs again so it sees the tag and then deindexes them? Or is there another way I'm missing?
-
Adding a meta noindex tag can mean it takes a few weeks for a page to fall out of the index. These pages probably aren't doing you much harm, so simply waiting for them to drop out is probably fine (although I would update the tag content to "noindex, follow" to help Google crawl through to the other noindexed pages). If you really want them out of the index faster, you have two options. You can use the "Remove URLs" function under Google Index in Google Search Console, which will temporarily remove them from the index while Google registers the noindex tags. Alternatively, you can use the Fetch + Render tool and then Submit URLs in Google Search Console, which will prompt Google to come back, crawl your pages, and find the noindex tag.
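Before blaming Google's crawl schedule, it's worth confirming the tag is actually being served on the live pages. A minimal sketch of such a check, using only the Python standard library (the function name and sample markup are my own, not from the thread):

```python
# Parse a page's HTML and report the robots meta directives actually
# served, so you can verify the noindex deployment on a fetched page.
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Only <meta name="robots" content="..."> tags are of interest.
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]


def robots_directives(html):
    """Return the list of robots meta directives found in the HTML."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives
```

Fetch each ?start= URL (with urllib or curl) and run the response body through this function; if it returns an empty list, the tag never made it into production.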
-
You could use the URL parameter settings in Google Search Console and Bing Webmaster Tools, provided all ?start= URLs can be treated the same way by Google.
Related Questions
-
Google Pagination Changes
With Google recently coming out and saying they're basically ignoring paginated pages, I'm reconsidering the link structure of our new, soon-to-launch ecommerce site (moving from an old site to a new one with an identical URL structure, less a few 404s). Currently our new site shows 20 products per page, but with this change by Google it means that any products on pages 2, 3, and so on will suffer, because Google treats each as an entirely separate page rather than an extension of the first. The way I see it, I have one option: show every product in each category on page 1. I have lazy loading installed on our new website, so it only loads what the user can see and loads more products as they scroll down, but how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page-load score because it doesn't know lazy loading is in place? Or will it know and account for it? Is there anything I'm missing?
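One way to make a one-page category crawl-safe is to keep every product in the server-rendered HTML and defer only the image bytes, so Googlebot sees all the products without having to execute a lazy-load script. A sketch under that assumption (URLs and class names are hypothetical):

```html
<!-- Every product link and image src is present in the initial HTML;
     loading="lazy" tells the browser to defer fetching image bytes
     until the item nears the viewport. -->
<ul class="product-grid">
  <li><a href="/products/widget-1"><img src="/img/widget-1.jpg" loading="lazy" alt="Widget 1"></a></li>
  <li><a href="/products/widget-2"><img src="/img/widget-2.jpg" loading="lazy" alt="Widget 2"></a></li>
  <!-- ...repeat for the remaining products in the category... -->
</ul>
```

The risk case is the opposite pattern: products injected into the DOM only when a scroll handler fires, which a crawler that doesn't scroll will never see.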
Intermediate & Advanced SEO | | moon-boots0 -
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know the best practice for doing so. For reference, we're looking for methods to speed this up because we have about 500,000 URLs that we want deindexed due to mis-formatted HTML code, and unfortunately Google indexed them much faster than it is deindexing them. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that carry "noindex" as a hack for deindexing. Although that should work in theory, we are looking for white-hat methods faster than "being patient and waiting it out," since that would likely take months if not years at Google's current crawl rate for our site.
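When the affected URLs share a path pattern, one option that avoids touching 500,000 templates is serving the noindex as an X-Robots-Tag HTTP header at the server level. A sketch for Apache, assuming mod_headers is enabled (the path pattern is hypothetical):

```apache
# Serve "noindex, follow" as an HTTP header for every URL under the
# affected section; equivalent to the meta robots tag, but applied
# in one place in the server config.
<LocationMatch "^/broken-section/">
  Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```

This doesn't change how fast Google recrawls, but it does guarantee the directive is on every affected URL, including non-HTML resources a meta tag can't cover.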
Intermediate & Advanced SEO | | teddef0 -
How To Implement Pagination Properly? Important and Urgent!
I have seen many instructions, but I am still uncertain. Here is the situation: we will be implementing rel=prev/rel=next on our paginated pages. The questions are: Do we implement a self-referencing canonical URL on the main page and on each paginated page? Do we implement a noindex,follow meta robots tag on each paginated page? Do we include the canonical URL for each paginated page in the sitemap if we do not add the meta robots tag? We have a view-all page but will not be using it due to page-load constraints; what do we do with the view-all URL? Do we add a meta robots tag to it? For on-site search results pages containing pagination, should we just put a noindex,follow meta robots tag on them? We have separate mobile URLs that also contain pagination. Do we need to treat these as a separate pagination project? We already canonical all the mobile URLs to the main page of the desktop URL. Thanks!
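For reference, the combination commonly recommended while Google supported rel=prev/next was a self-referencing canonical on each paginated page plus prev/next hints, and no noindex. A sketch of the head of page 2 in a series (URLs hypothetical):

```html
<!-- Each paginated page canonicals to itself, not to page 1, so the
     series stays indexable; prev/next describe its position. -->
<link rel="canonical" href="https://www.example.com/category?page=2">
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
```

Combining noindex with rel=prev/next on the same pages is generally discouraged, since noindexed pages drop out of the series.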
Intermediate & Advanced SEO | | seo320 -
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure if this is going to happen. We thought about requesting in GWMT for Google to remove the non-secure pages, but we felt this was pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | | RosemaryB0 -
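The 301 approach the poster describes is usually done with a blanket server-level rule rather than per-page redirects. A sketch for Apache .htaccess, assuming mod_rewrite is available:

```apache
# 301 every http request to its https equivalent, preserving the
# host, path, and query string.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

With this in place, the http URLs consolidate into their https counterparts as Google recrawls them, which is typically faster and cleaner than removal requests.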
What's the best way to redirect categories & paginated pages on a blog?
I'm currently re-doing my blog and have a few categories that I'm getting rid of for housecleaning purposes and crawl efficiency. Each of these categories has many pages (some have hundreds). The new blog will also not have new relevant categories to redirect them to (1 or 2 may work). So what is the best place to properly redirect these pages to? And how do I handle the paginated URLs? The only logical place I can think of would be to redirect them to the homepage of the blog, but since there are so many pages, I don't know if that's the best idea. Does anybody have any thoughts?
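If the retired categories share a URL pattern, one regex rule per category can cover the main archive and all of its paginated pages at once. A sketch for Apache (mod_alias), with hypothetical paths:

```apache
# 301 a removed category archive, with or without a /page/N suffix,
# to the blog home in a single rule.
RedirectMatch 301 ^/blog/category/old-category(/page/[0-9]+)?/?$ /blog/
```

Redirecting to the closest relevant category, where one exists, generally preserves more value than sending everything to the blog home.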
Intermediate & Advanced SEO | | kking41200 -
Should I create a separate sitemap.xml for paginated categories?
For example:
http://www.site.com/category/sub-category
http://www.site.com/category/sub-category/1
http://www.site.com/category/sub-category/2
http://www.site.com/category/sub-category/3
Thanks in advance! 🙂
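Whether these entries live in a separate file or the main sitemap doesn't change how Google treats them; a separate file mainly makes indexation of the paginated set easier to monitor. A sketch of the entries themselves, per the sitemaps.org 0.9 schema:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- One <url> entry per indexable paginated page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.site.com/category/sub-category</loc></url>
  <url><loc>http://www.site.com/category/sub-category/2</loc></url>
  <url><loc>http://www.site.com/category/sub-category/3</loc></url>
</urlset>
```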
Intermediate & Advanced SEO | | esiow20130 -
What do SEO experts say about pagination on PakWheels.com?
Hi SEOmozers... I need your expert feedback regarding the SEO of listing pages with pagination. Crawl the following links and share your advice: Used Cars, Car Reviews Listing, New Honda Cars. These are search listing pages with pagination. Please provide specialized recommendations for on-page enhancements. Looking forward to seeing answers with pagination best practices.
Intermediate & Advanced SEO | | razasaeed0 -
Pagination Question: Google's 'rel=prev & rel=next' vs. JavaScript Refresh
We currently have all content on one URL and use # fragments with a JavaScript refresh to paginate pages, and we are wondering whether we will see an improvement in traffic if we transition to Google's recommended pagination. Has anyone gone through a similar transition? What was the result? Did you see an improvement in traffic?
Intermediate & Advanced SEO | | nicole.healthline0