Removing a large section of content with traffic: what is the best de-indexing option?
If we are removing 100 old URLs (archives of authors who no longer write for us), which of these is the best option?
- 301 redirect the traffic to the main directory
- de-index the pages with a "noindex, follow" robots meta tag
- let the pages return a 404
Thanks!
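For reference, the "noindex, follow" option from the list above is implemented with a robots meta tag in each page's head. A minimal sketch:

```html
<!-- Placed in the <head> of each retired author-archive page. -->
<!-- Tells crawlers not to index the page, but still to follow its links. -->
<meta name="robots" content="noindex, follow">
```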
-
-
We have a busy blog with lots of very temporary content. About once a year we delete a couple thousand posts. However, before we do that we look at analytics to see which pages are pulling traffic from search.
For pages that receive regular traffic, we first try to recycle the post. If that is not possible, we create a page with evergreen content so the old URL can be 301 redirected to it. All other pages are 301 redirected to the blog's homepage.
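A sketch of the 301 approach described above, assuming an Apache server and a hypothetical /author/... URL structure (adjust the patterns to your own paths):

```apache
# .htaccess - send retired author archives to the blog homepage
# with a permanent (301) redirect. The /author/ paths are assumptions.
RewriteEngine On
RewriteRule ^author/(jsmith|mdoe)/ /blog/ [R=301,L]

# A post that was recycled into evergreen content gets its own target:
Redirect 301 /author/arose/grapefruit-diet/ /guides/dieting/
```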
-
The best option would be to take the pages on a case-by-case basis.
Each page should ideally cover a specific focus. For example, if an article discusses "the grapefruit diet," you can redirect the URL to another page on your site that discusses dieting. If you understand what users want from the page and have similar content, help your users make the connection they seek rather than 404'ing the page or dumping them on your home page.
If you do not have any similar content, check the page's backlinks. If there are none, I would let the page 404. This would be a good time to ensure your site's 404 page is helpful; a navigation bar and a search option are good things to show on it.
If the page has valuable backlinks and there is no related content on your site, perhaps the page still has value even though the author is gone. If you still wish to remove the page you should redirect the traffic to your home page or main directory as you suggested.
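At the scale of 100 URLs, one way to manage these case-by-case decisions is a small script that emits redirect rules from a curated mapping. A minimal sketch; all author names and paths here are hypothetical:

```python
# Hypothetical sketch: generate 301 redirect rules for retired
# author archives. Author names and paths are invented for illustration.

retired_authors = ["jsmith", "mdoe", "arose"]

# Archives that still pull search traffic and have a close topical match
# get a curated target; everything else falls back to the blog directory.
evergreen_targets = {"jsmith": "/guides/dieting/"}

def redirect_rule(author: str) -> str:
    """Return an Apache-style Redirect line for one author archive."""
    target = evergreen_targets.get(author, "/blog/")
    return f"Redirect 301 /author/{author}/ {target}"

rules = [redirect_rule(a) for a in retired_authors]
print("\n".join(rules))
```

Reviewing the generated rules by hand before deploying them keeps the "specific focus" principle above intact: any archive that deserves a topical target gets an entry in the mapping rather than the homepage fallback.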
Related Questions
-
How to fix Duplicate Content Warnings on Pagination? Indexed Pagination?
Hi all! We have a WordPress blog that properly has pagination tags of rel="prev" and rel="next" set up, but we're still getting duplicate content crawl errors in Moz on all of our pagination pages. All of those pages are also being indexed, as deep as page 89 of the home page. Is this something I should ignore? Is it potentially hurting my SEO? If so, how can I start tackling a fix? Would "noindex" or "nofollow" be a good idea? Any help would be greatly appreciated!
Intermediate & Advanced SEO | jampaper
-
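For context, the rel="prev"/rel="next" annotations mentioned in this question look like the following on a paginated archive (URLs hypothetical), and a robots meta tag can be combined with them if you decide to keep deep pages out of the index:

```html
<!-- In the <head> of /blog/page/3/ (hypothetical URL) -->
<link rel="prev" href="https://example.com/blog/page/2/">
<link rel="next" href="https://example.com/blog/page/4/">
<!-- Optional, only if you choose to noindex paginated pages: -->
<meta name="robots" content="noindex, follow">
```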
Same content on another domain owned by the same company; canonical is not working
Hi! I am analyzing a website right now. It's a school; let's name it NEWSCHOOL. This school is owned by another school; let's name it BIGSCHOOL. NEWSCHOOL specializes in tourism degrees, and BIGSCHOOL is a bigger and older one with a lot of different degrees. NEWSCHOOL has a course; let's name it TOURISM DEGREE.
Intermediate & Advanced SEO | teconsite
BIGSCHOOL has that course too, with the same content, to help promote it, because this school is older, well known, and has a consolidated brand internationally. BIGSCHOOL has placed a canonical tag telling Google that the content comes from NEWSCHOOL. What is happening is that the NEWSCHOOL result is being omitted by Google. The first result is the BIGSCHOOL content, and then a lot of training portals where the degree content also appears to increase its visibility. So I would like to know how we can tell Google that the content it should show is NEWSCHOOL's and not BIGSCHOOL's. It's pretty clear that Google knows these portals are closely related, because it is omitting the NEWSCHOOL results. I know we can send a link from the content area of one portal to the other. But would that solve the problem? We would have to repeat that for each degree; wouldn't it be a little dangerous? I would like to know your points of view. Thanks!
-
Website not being indexed after relocation
I have a scenario where a 'draft' website was built using Google Sites and published on a Google Sites subdomain. Subsequently, the 'same' website was rebuilt and published on its own domain. So effectively there were two sites, both more or less identical, with identical content. The first website was thoroughly indexed by Google. The second website has not been indexed at all; I am assuming for the obvious reasons, i.e. that Google views it as an obvious rip-off of the first site / duplicate content. I was reluctant to take down the first website until I had found an effective long-term way to resolve this issue, ensuring that in future Google would index the second, 'proper' site. A permanent 301 redirect was put forward as a solution; however, believe it or not, the Google Sites platform has no facility for implementing this. For lack of an alternative solution I have gone ahead and taken down the first site. I understand that this may take some time to drop out of Google's index, and I am merely hoping that eventually the second site will be picked up. I would sincerely appreciate any advice or recommendations on the best course of action, if any, I can take from here. Many thanks! Matt.
Intermediate & Advanced SEO | collectedrunning
-
De-indexing product "quick view" pages
Hi there, The e-commerce website I am working on seems to index all of the "quick view" pages (which normally occur as iframes on the category page) as their own unique pages, creating thousands of duplicate pages / overly dynamic URLs. Each indexed "quick view" page has the following URL structure: www.mydomain.com/catalog/includes/inc_productquickview.jsp?prodId=89514&catgId=cat140142&KeepThis=true&TB_iframe=true&height=475&width=700 where the only thing that changes is the product ID and category number. Would using "Disallow" in robots.txt be the best way to de-index all of these URLs? If so, could someone help me identify how best to structure this disallow statement? Would it be: Disallow: /catalog/includes/inc_productquickview.jsp?prodID=* Thanks for your help.
Intermediate & Advanced SEO | FPD_NYC
-
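A robots.txt sketch for the quick-view URLs in the question above. Note that robots.txt matching is case-sensitive, so a pattern written prodID would not match the prodId parameter in the actual URLs; blocking the path prefix sidesteps the issue and covers every query-string variation:

```text
# robots.txt - block the quick-view include for all products
User-agent: *
Disallow: /catalog/includes/inc_productquickview.jsp
```

One caveat: a robots.txt Disallow stops crawling but does not remove already-indexed URLs; a noindex directive served on the pages themselves is the more reliable de-indexing tool.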
Yoast SEO Plugin: To Index or Not to index Categories?
Taking a poll: in most cases, would you want to index or NOT index your category pages using the Yoast SEO plugin?
Intermediate & Advanced SEO | webestate
-
Best to Post Dynamic Content (Listings) under "Posts" in WordPress?
My commercial real estate web site is being migrated from Drupal to WordPress. Is it advisable to place dynamic content that will use taxonomy under "Posts"? Listings will change every few months, and there could be anywhere from several hundred to several thousand of them on the site. Developers have given me different advice. One has been adamant that listings and neighborhood pages (there will be about 25 neighborhood pages) should not be in the Posts section, which should be strictly reserved for blog entries. The last thing I want is to create a site structure that is unfriendly to SEO! I would very much appreciate the perspective of anyone proficient with WordPress and SEO. Thanks!
Intermediate & Advanced SEO | Kingalan1
Alan Rosinsky
-
Best way to stop pages being indexed and keeping PageRank
If, for example, on a discussion forum, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed, while also not diluting PageRank? If we added them to the Disallow rules in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
Intermediate & Advanced SEO | Peter264
-
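One pattern for the posting-page case above (an assumption about the setup, not the only approach): serve those pages with a "noindex, follow" robots directive rather than blocking them in robots.txt, since a robots.txt block prevents crawlers from fetching the page and seeing either the directive or its links. On an Apache server with mod_headers enabled, this can be done per URL pattern (paths hypothetical):

```apache
# Send "noindex, follow" for the forum's posting URLs (paths are assumptions).
# Unlike a robots.txt Disallow, crawlers can still fetch the page and
# follow its links, so link equity is not stranded on blocked URLs.
<LocationMatch "^/forum/(newtopic|reply)">
    Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```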
Duplicate page content
Hi. I am getting a duplicate content error on my website, and the pages it shows are: www.mysitename.com www.mysitename.com/index.html To the best of my knowledge this is only one page. I know this can be solved with a canonical tag in the header, but I do not know how. Can anyone please tell me about that code, or any other way to get this solved? Thanks
Intermediate & Advanced SEO | onlinetraffic
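A common fix for the index.html duplicate described in the last question, sketched under the assumption of an Apache server with mod_rewrite:

```apache
# .htaccess - 301 redirect requests for /index.html to the root URL,
# so only one version of the homepage can be indexed.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.html [NC]
RewriteRule ^index\.html$ / [R=301,L]
```

Alternatively (or additionally), a canonical tag such as `<link rel="canonical" href="https://www.mysitename.com/">` in the head of both URLs tells search engines which version to index, though the redirect resolves the duplication for visitors as well.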