Removing a large section of content with traffic: what is the best de-indexing option?
-
If we are removing 100 old URLs (archives of authors who no longer write for us), what is the best option? We could:

- 301 redirect the traffic to the main directory
- de-index with a noindex, follow robots meta tag
- 404 the pages

Thanks!
-
We have a busy blog with a lot of very temporary content. About once a year we delete a couple of thousand posts. Before we do, however, we check analytics to see which pages are pulling traffic from search.
For pages that receive regular traffic, we first try to recycle the post. If that is not possible, we create a page with evergreen content so that a 301 redirect makes sense. All other pages are 301 redirected to the homepage of the blog.
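This triage can be sketched in a few lines of Python, assuming (hypothetically) that the analytics export is a CSV with `url` and `monthly_visits` columns; the threshold and destination URL are placeholders to adjust for your own site:

```python
import csv

TRAFFIC_THRESHOLD = 50  # hypothetical cutoff for "regular traffic"
BLOG_HOME = "https://example.com/blog/"  # placeholder redirect destination

def triage(analytics_csv):
    """Split soon-to-be-deleted posts into recycle candidates and homepage 301s."""
    recycle, to_home = [], []
    with open(analytics_csv, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["monthly_visits"]) >= TRAFFIC_THRESHOLD:
                recycle.append(row["url"])  # rewrite, or replace with evergreen content
            else:
                to_home.append((row["url"], BLOG_HOME))  # 301 to the blog homepage
    return recycle, to_home
```

The cutoff is a judgment call; the point is simply to make the keep-vs-redirect decision from data rather than by hand for thousands of posts.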
-
The best option would be to take the pages on a case-by-case basis.
Each page should ideally have a specific focus. For example, if an article discusses "the grapefruit diet," you can redirect its URL to another page on your site that discusses dieting. If you understand what users want from the page and have similar content, help your users make the connection they seek rather than 404'ing the page or dumping them on your home page.
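As a rough illustration of this approach, the sketch below maps removed URLs to their closest related live pages and renders Apache `Redirect 301` rules (mod_alias syntax); every URL in the map is a made-up example:

```python
# Hypothetical mapping from removed pages to the closest related live pages.
REDIRECT_MAP = {
    "/articles/grapefruit-diet": "/topics/dieting/",  # topically similar page
    "/author/retired-writer/": "/blog/",              # no close match: main directory
}

def apache_rules(mapping):
    """Render one 'Redirect 301 old-path new-path' line per entry."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in mapping.items()]

for rule in apache_rules(REDIRECT_MAP):
    print(rule)
```

Keeping the mapping in one place like this also gives you a record of why each redirect exists when you revisit it later.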
If you do not have any similar content, check the page's backlinks. If there are none, I would let the page 404. This is also a good time to make sure your site's 404 page is helpful; a navigation bar and a search option are good things to show on it.
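If you want to verify that the removed pages really return a 404 (and not a "soft 404" that answers with a 200), here is a small standard-library sketch; the URLs and crawl results used in it are hypothetical:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url):
    """Fetch the HTTP status of a URL; urllib raises 4xx/5xx as HTTPError."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as err:
        return err.code

def find_soft_404s(removed_urls, statuses):
    """Return deleted URLs that still answer with something other than 404/410."""
    return [u for u in removed_urls if statuses.get(u) not in (404, 410)]
```

You would collect `statuses` by calling `status_of` on each removed URL, then review anything `find_soft_404s` flags before assuming the cleanup is done.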
If the page has valuable backlinks and there is no related content on your site, perhaps the page still has value even though the author is gone. If you still wish to remove it, you should 301 redirect the traffic to your home page or main directory, as you suggested.