Best practice for removing pages
-
I've got some crappy pages that I want to delete from a site.
I've removed all the internal links to those pages and resubmitted new sitemaps that no longer include them; however, the pages still appear in search results (as you would expect).
My question is, what's the best practice for removing these pages?
Should I just delete them and be done with it, or make them 301 redirect to a nicer generic page until they are removed from the search results?
-
Thanks for all the responses.
I think I'll opt for a server-side 301 to a suitable page.
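For anyone setting this up, a server-side 301 on Apache can be sketched in `.htaccess` roughly as below. This is a minimal example with placeholder paths, and it assumes Apache with mod_alias (and mod_rewrite for the second form); check your own server's syntax before using it:

```apache
# Permanently redirect a single deleted page to a relevant replacement
# (example paths only — substitute your own URLs)
Redirect 301 /old-crappy-page.html /nicer-generic-page.html

# Or, with mod_rewrite enabled, redirect a whole removed section:
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

On nginx the equivalent would be a `return 301` inside a matching `location` block.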
-
An alternative for removing them from search results, besides a 301, is a meta robots noindex tag. You would do this if you do not want them passing any link value to the pages they would otherwise 301 to, which it doesn't sound like you'd want anyway.
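A minimal sketch of the meta robots tag Dan describes, placed in the `<head>` of each page you want dropped from the index:

```html
<!-- Tells crawlers not to index this page; "follow" still lets
     link value flow through any links on it. Swap in "nofollow"
     if you don't want that either. -->
<meta name="robots" content="noindex, follow">
```

Note that the page must remain crawlable (i.e. not blocked in robots.txt) for crawlers to see this tag at all.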
-Dan
-
Hi Shelly
If the pages are 'crappy' as you describe, then it's a good idea to delete them, as poor-quality pages can negatively affect the site overall.
If you are going to delete the pages, then yes, cisaz is right: ensure they are 301 redirected to a relevant page, both to retain some of the link benefit and to give direct visitors to those pages somewhere to go other than a 404 error page.
Also, request removal of the deleted pages' URLs within Google Webmaster Tools to speed up their de-indexing. (You'll need a noindex robots tag on the pages, and/or a Disallow in the robots.txt file.)
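For the robots.txt route Simon mentions, a minimal example (paths are placeholders for your own URLs):

```text
User-agent: *
Disallow: /old-crappy-page.html
Disallow: /old-section/
```

One caveat worth knowing: a robots.txt Disallow stops crawling but does not by itself remove URLs that are already indexed, and once a page is blocked, crawlers can no longer see a noindex tag on it. That's why pairing it with the removal request (or letting the noindex tag be crawled first) matters.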
Regards
Simon
-
I would go with your second idea: "delete them and be done with it or make them 301 redirect to a nicer generic page until they are removed".
Don't want to lose any backlink juice you may have.
Hope this helps. ; )