How best to approach archiving badly optimised content
-
I signed up to SEOmoz about a month ago as I'm currently rebuilding my site from scratch and wanted to learn from my current mistakes.
At present I use the forum software Invision Power Board to manage my site, and one thing I've learnt is that it is terrible for SEO: the crawler lists so many thousands of errors that it's not even worth trying to fix them all.
However, because it has five or six years' worth of content, a lot of which is indexed in Google, I don't want to remove it entirely. I would rather archive it off with a big banner at the top letting anybody who visits know that it's no longer in use and pointing them to the front page.
I should note that it is already in a subfolder, so the locations of the links won't change.
So the few questions I have are:
- The forum index has a lot of link juice and I would like to redirect that to the new forum index; however, for archive purposes the old index still needs to be accessible (a rough sketch of what I mean is included after these questions).
- Some topics are very popular, appear high in Google, and have a lot of backlinks. The important information in these topics will be available elsewhere on the new, rebuilt site. Again I would like to redirect both link juice and users to the new page; however, being forum topics, there are tens or hundreds of pages of old comments that still need to be accessible for reference.
- There are bound to be duplicate meta title and description issues, with similarly named categories appearing on both the new site and the old forum. Is that going to be much of a problem?
So really what I'm asking is: how should I go about archiving this off without destroying content and rankings, while still making sure the new material gets the right exposure to users and search engines alike?
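Here is that rough sketch. The paths are hypothetical (/forum/ for the current board, /new-forum/ for the rebuilt section, /forum-archive/ for a browsable archived copy), so treat it as an illustration rather than a finished rule set:
# Sketch only - assumes a site-root .htaccess with mod_rewrite enabled,
# and that the old board has been copied to /forum-archive/ so it stays
# browsable (with the "no longer in use" banner) at its new address.
RewriteEngine On
# Pass the old forum index and its accumulated link juice to the new index
RewriteRule ^forum/?$ /new-forum/ [R=301,L]
# Deeper forum URLs are left untouched here; the popular topics are a
# separate question.
The idea being that the old index URL hands its equity to the new one, while the archived copy stays reachable for anyone who wants the old threads.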
-
Hmm, that's very interesting. I've had another look at the stats to give more insight.
150,000 visitors hit the forums from search engines every month; of these, only 10 landing pages get over 1,000 visitors and 168 get over 100. That amounts to 76,000 visits, or roughly half.
The other half comes from visits to a whopping 16,222 different landing pages.
So whilst manually redirecting those 168 pages may be a manageable task, it would mean losing half of the visitors the forums pull in.
It also doesn't deal with the problem of letting users still browse the old content whilst transferring the link juice to newer areas, since search engines would not take kindly to being treated differently from an actual user.
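Just to illustrate, manually redirecting those 168 would boil down to a block of one-rule-per-topic 301s along these lines (the topic URLs and destinations are made up, assuming a site-root .htaccess):
# Hypothetical examples only - one 301 per high-traffic topic, mapping
# old Invision Power Board topic URLs to the new pages that replace them.
RewriteEngine On
RewriteRule ^forum/topic/12345-widget-buying-guide/?$ /guides/widget-buying/ [R=301,L]
RewriteRule ^forum/topic/23456-widget-maintenance-tips/?$ /guides/widget-maintenance/ [R=301,L]
# ...and so on for each of the 168 pages, while the other 16,222 landing
# pages carry on resolving to the old forum exactly as they do now.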
-
We have a blog that gets about 2,000 to 3,000 short posts per year. Most of those posts have temporary value and very little archive value.
The posts are filed in folders by year, such as /blog/2010/.
Once a year we run analytics to identify old posts that pull in significant traffic or old posts that have valuable links. Where possible we create a new page of evergreen content on the same subject and 301 redirect those posts.
All posts that receive very little traffic are considered to be "dead weight" on our site. We delete those posts and redirect the entire folder to the homepage of the blog with an .htaccess file at /blog/2010/.htaccess. This also reduces the size of our database.
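To picture it, here is a stripped-down sketch of what a per-folder file like that can contain; the evergreen filename and destination are invented for the example:
# Sketch only - placed at /blog/2010/.htaccess, assumes mod_rewrite is
# enabled and the blog homepage lives at /blog/.
RewriteEngine On
# Old posts that earned an evergreen replacement get individual 301s first
RewriteRule ^widget-trends-roundup\.html$ /blog/evergreen-widget-guide/ [R=301,L]
# Everything else left in the 2010 folder is permanently redirected to the
# blog homepage
RewriteRule .* /blog/ [R=301,L]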
With a forum, you might be able to delete some of the worthless material and feature some of the archival gold.
-
Thanks for the response. You're right, it is not a simple problem, but I'm hoping people on this forum can provide some useful tips or advice from past experience; I'm sure many will have had to tackle this sort of dilemma before.
I've just looked at some stats which demonstrate its importance: 51.76% of visits currently come from Google and land on the forum.
Given that stat it may seem ludicrous to want to change the site so drastically, but as you can imagine, having everything managed on a forum can become very messy compared to a custom-coded site where every page and content type is tailored to its specific needs. And of course keeping the URLs is not going to be an option when they are all tied to topic IDs.
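For what it's worth, if a bulk mapping from old topic IDs to new URLs ever became worthwhile, my understanding is it would look something like the following. This is only a sketch with invented paths, and RewriteMap has to live in the main server config or a virtual host rather than in .htaccess:
# Sketch only - main server config / virtual host context.
# topic-redirects.txt is a plain text lookup of "topicID new-url" pairs, e.g.
#   12345 /guides/widget-buying/
#   23456 /guides/widget-maintenance/
RewriteEngine On
RewriteMap topicmap "txt:/etc/apache2/topic-redirects.txt"
# Only redirect topics that actually appear in the map; anything unmapped
# keeps resolving to the old forum as it does today.
RewriteCond ${topicmap:$1|NOT_FOUND} !=NOT_FOUND
RewriteRule ^/forum/topic/(\d+)- ${topicmap:$1} [R=301,L]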
-
I don't think that this is a simple problem.
A good diagnosis should consider historic traffic information, past and potential linkage structure, keywords, external links, premium content versus worthless content, usability and more.
If this forum is low value it may be best just to redirect everything.
However, if there is a lot of value there I would want to put some careful consideration and planning into the solution. Certainly more than the five-minute look that most visitors to a Q&A forum are able to provide.
Related Questions
-
Duplicate content: Form labels and field content
I have a site that has 500 pages, each with unique content. The only content that could be deemed the same is the 'Make Contact' form, which has the same labels and placeholder text on each page. Is this likely to cause any duplicate content penalties?
On-Page Optimization | deployseo
-
Duplicate content - Opencart
In my last report I have a lot of duplicate content. The duplicate pages are:
http://mysite.com/product/search&filter_tag=Сваров�%
http://mysite.com/product/search&filter_tag=бижу
http://mysite.com/product/search&filter_tag=бижузо�%8
...and a lot more, all starting with http://mysite.com/product/search&filter_tag=
Any ideas? Maybe I should do something in robots.txt, but please tell me the exact code. Best regards, Emil
On-Page Optimization | famozni
-
Duplicate content on domains we own
Hello! We are new to SEO and have a problem we have caused ourselves. We own two domains: GoCentrix.com (the old domain) and CallRingTalk.com (the new domain we want to SEO). The content was updated on both domains at about the same time, and both are identical with a few exceptions. Now that we are getting into SEO, we understand this to be a big issue. Is this a resolvable matter? At this point, what is the best approach to handle this? So far we have considered a couple of options. 1. Change the copy, but on which site? Is one flagged as the original and the other as the duplicate? 2. Robots.txt noindex, nofollow on the old one. Any help is appreciated, thanks in advance!
On-Page Optimization | CallRingTalk
-
Pagination on related content within a subject
A client has come to us with new content and sections for their site. The two main sections are "Widget Services" - the sales pages - and "Widget Guide" - a non-commercial guide to using the widgets etc. Both the Services and Guide sections contain the same pages (red widgets, blue widgets, triangle widgets), and - here's the problem - the same first paragraph, i.e.:
========
Blue widget services
Blue widgets were invented in 1906 by Professor Blue. It was only a coincidence that they were blue. We stock a full range of blue widgets; we were voted best blue widget handler at widgetcon 2013. Buy one now. See our guide to blue widgets here.
Guide to blue widgets
Blue widgets were invented in 1906 by Professor Blue. It was only a coincidence that they were blue. The thing about blue widgets is that they're not at all like red widgets. For starters, they're blue. Find more information about our blue widgets here.
========
On all of these pages, the first paragraph is ~200 words and provides a great introduction to the subject, and the rest of the page is 600-800 words, making these pages unique enough to justify being separate pages. We want to deal with this by declaring each page as a paginated version of a two-page article on each type of widget (using rel=prev/next). Our thinking is that Google probably handles introductions/headers on paginated content in a sensible way. Has anyone experienced this before? Are there any issues with using rel="prev" and rel="next" when the pages are not strictly paginated?
On-Page Optimization | BabelPR
-
Static content VS Dynamic changing content what is best
We have collected a lot of reviews and we want to use them on our category pages. We are going to be updating the top six reviews per category every four days, and there will be another page to see all of the reviews. Is there any advantage to keeping the reviews static for one or two weeks versus having unique new ones pulled from the database every time the page is refreshed? We know there is a long-tail advantage if we keep them on the page forever; however, we have created a new page with all of the reviews that visitors can go to.
On-Page Optimization | DoRM
-
Duplicate content in the title
Good morning. I am developing an application that searches offers in the press. The problem I have is the following one:
When I find an offer that I have already posted, I can't use the same URL because it generates duplicate content, as the URL is generated from the title. If I find offers for the same product (for example a Thomson TV) in two different stores, I am considering two options. The first would be to add a number at the end of the URL:
http://www.offertazo.com/televisor-thomson
http://www.offertazo.com/televisor-thomson1
http://www.offertazo.com/televisor-thomson2
The other option would be to add semantic data to provide value (such as the date). For example:
http://www.offertazo.com/01-12-12/televisor-thomson
I appreciate your help.
On-Page Optimization | ofuente
-
Fresh Content Strategy - What does it look like?
I understand the growing importance of content freshness, but I have some questions about how to incorporate freshness components into an existing SEO strategy. Here are some specific questions I would love some help with: If I have a specific "product or services" page that is properly optimized and getting a decent amount of traffic, would I benefit from updating/modifying the content on a routine basis to improve rankings? In general, should I consider an occasional refresh of content on my site even if I don't necessarily have anything new to say? For my homepage, if I am pulling in headlines from various news and events sections within my own site, and those sections are updated pretty frequently, is my homepage going to be viewed as fresh when the site gets re-crawled? In other words, does updating my homepage via RSS feeds that pull from content areas within my site keep my homepage "fresh"? Thanks!
On-Page Optimization | AmyLB
-
Duplicate Title & Content in WordPress
I'm getting a lot of crawl errors for duplicate content and duplicate titles because of category and tag pages in WordPress. I rebuilt the sitemap and set it to exclude categories and tags; should that clear up the issue? I've also gone through and applied NOINDEX and NOFOLLOW to all categories and posts. Any thoughts on this issue?
On-Page Optimization | seantgreen