How best to approach archiving badly optimised content
-
I signed up to SEOmoz about a month ago, as I'm currently rebuilding my site from scratch and wanted to learn from my current mistakes.
At present I use the forum software Invision Power Board to manage my site, and one thing I've learnt is that it is terrible for SEO; the crawler lists so many thousands of errors that it's not even worth trying to fix them.
However, because it holds 5 or 6 years' worth of content, a lot of which is indexed in Google, I don't want to remove it entirely. Rather, I would prefer to archive it off with a big banner at the top letting anybody who visits know that it's no longer in use, and pointing them to the front page.
I should note that the forum is already in a subfolder, so the location of its links won't change.
So the few questions I have are:
- The forum index has a lot of link juice, and I would like to redirect that to the new forum index; however, for archive purposes the old index still needs to be accessible.
- Some topics are very popular, appear high in Google, and have a lot of backlinks. The important information in these topics will be available elsewhere on the new, rebuilt site. Again, I would like to redirect both link juice and users to the new page; however, being forum topics, they have tens or hundreds of pages of old comments that still need to be accessible for reference.
- There are bound to be duplicate meta title and description issues, with similarly named categories appearing on both the new site and the old forum. Is this going to be much of a problem?
So really what I'm asking is: how should I go about archiving this off without destroying content and rankings, whilst still making sure that the new material gets the right exposure to users and search engines alike?
-
Hmm, that's very interesting. I've had another look at the stats to give more insight.
150,000 visitors hit the forums from search engines every month. Of these, only 10 landing pages get over 1,000 visitors and 168 get over 100; together those pages account for 76,000 visits, or roughly half.
The other half comes from visits to a whopping 16,222 different landing pages.
So whilst manually redirecting those 168 pages may be a manageable task, it would mean losing half of the visitors the forums pull in.
It also doesn't deal with the problem of letting users still browse the old content whilst transferring the link juice to newer areas, since search engines would not take kindly to being treated differently from an actual user.
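For what it's worth, if you did decide to redirect only those top landing pages, the 168 one-to-one 301s needn't be written by hand; they can be generated from an analytics export into a block of mod_alias rules. A minimal sketch, assuming an Apache server that honours .htaccess, with the URLs invented purely for illustration:

```apache
# Hypothetical one-to-one 301s for the top old-forum landing pages.
# Each line maps an old topic URL to its replacement on the rebuilt site.
# This block could be generated from a spreadsheet of top landing pages.
Redirect 301 /forum/topic/1234-example-guide/ /guides/example/
Redirect 301 /forum/topic/5678-example-faq/ /faq/example/

# No catch-all here: everything not listed stays accessible as an archive.
```

Note that `Redirect` does prefix matching, so each rule also catches paginated sub-URLs of that topic; whether that's desirable depends on how the old forum structures its comment pages.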
-
We have a blog that gets about 2,000 to 3,000 short posts per year. Most of those posts have temporary value and very little archive value.
The posts are filed in folders by year such as /blog/2010/ .
Once a year we run analytics to identify old posts that pull in significant traffic or old posts that have valuable links. Where possible we create a new page of evergreen content on the same subject and 301 redirect those posts.
All posts that receive very little traffic are considered to be "dead weight" on our site. We delete those posts and redirect the entire folder to the homepage of the blog with an .htaccess file at /blog/2010/.htaccess. This also reduces the size of our database.
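As a rough sketch of the folder-wide redirect described above (paths are illustrative, and this assumes Apache with mod_alias available in .htaccess):

```apache
# Hypothetical /blog/2010/.htaccess
# Specific 301s first, for the few posts replaced by evergreen pages
# (mod_alias applies directives in order of appearance):
Redirect 301 /blog/2010/popular-old-post /blog/evergreen-guide

# Then a catch-all that sends every other deleted 2010 post
# to the blog homepage:
RedirectMatch 301 ^/blog/2010/ /blog/
```

The same pattern would let a forum owner keep a short list of hand-picked redirects for "archival gold" while sweeping the dead weight to a single destination.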
With a forum, you might be able to delete some of the worthless material and feature some of the archival gold.
-
Thanks for the response. You're right, it is not a simple problem, but I'm hoping people on this forum can provide some useful tips or advice from past experience; I'm sure many will have had to tackle this sort of dilemma before.
I've just looked at some stats which demonstrate its importance: 51.76% of visits currently come from Google and land on the forum.
Given that stat, it may seem ludicrous to want to change the site so drastically. But as you can imagine, managing everything through a forum can become very messy compared to a custom-coded site where every page and content type is tailored to its specific needs, and of course keeping the URLs is not going to be an option when they're all tied to "topic IDs".
-
I don't think that this is a simple problem.
A good diagnosis should consider historic traffic data, past and potential linkage structure, keywords, external links, premium content versus worthless content, usability, and more.
If this forum is low value, it may be best to just redirect everything.
However, if there is a lot of value there I would want to put some careful consideration and planning into the solution. Certainly more than the five-minute-look that most visitors to a Q&A forum are able to provide.