What should be done with old news articles?
-
Hello,
We have a portal website that provides information about the industry we work in. The site includes various articles, tips, reviews, and more about the industry. We also have a news section that was previously indexed in Google News, but has not been for the past few months.

The site was hit by Panda over a year ago, and one of the things we have been considering is removing pages that are irrelevant or do not add value to the site. Some of these pages are old news articles posted 3-4 years ago that have had hardly any traffic. All the news articles on the site sit under an /archive/ folder sorted by month and year, so for example the URL for a news item from April 2010 would be /archive/042010/article-name.

My question is: do you think removing such news articles would benefit the site and help it get out of Panda (many other things have been done on the site as well)? If not, what is the best way to keep these articles on the site so that Google indexes them and treats them well?

Thanks
-
Basically I don't see a reason to remove old news articles from a site, as it makes sense to still have an archive present. The only reasons I could think of to remove them are if they are duplicate versions of texts originally published somewhere else, or if the quality is really crap...
-
If the articles are good, then there just might be value to the user. Depending on the niche/industry, those old articles could be very important.
Google doesn't like those pages because you probably have a lot of impressions but no clicks (so essentially no traffic), or maybe the "score" is bad - bounce rate, and not the Google Analytics bounce rate but Google's own measure, i.e. users bouncing back to the SERPs.
Since you got hit by Panda, in my opinion there are two options:
1. Noindex those old pages. Users can still get to them through navigation, site search, etc., but Google won't index them (see the snippet below this list). Google is fine with a site having content (old, poor, thin, etc.) as long as it's not in the index. I work with a site that has several million pages, 80% of which are noindexed - everything is fine now (they were also hit by Panda).
2. Merge those pages into rich, fresh topic pages (see the New York Times topic pages as an example - search for it; I think there is also an SEOmoz post, a Whiteboard Friday, about it). This is a good approach, and if you manage to merge those old pages with some new content you will be fine. Topic pages are a great anti-Panda tool!
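For option 1, the noindex would typically be applied with a robots meta tag on each old archive page (or an equivalent X-Robots-Tag HTTP header); a minimal sketch of what that looks like:

```html
<!-- In the <head> of each old archive page to be kept out of the index.
     "noindex, follow" keeps the page out of Google's index while still
     letting crawlers follow its internal links.
     The same effect can be achieved with an "X-Robots-Tag: noindex, follow"
     HTTP response header if editing the page templates is not practical. -->
<meta name="robots" content="noindex, follow">
```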
If you merge the pages into topic pages, do it based on a simple flow:
1. Identify a group of pages that cover the same topic.
2. Identify the page with the highest authority of the group.
3. Change that page into the topic page - keep its URL.
4. Merge the others into this page (based on your new topic-page structure and flow).
5. 301 redirect the other URLs to this one.
6. Build a separate XML sitemap with all those topic pages, submit it to Webmaster Tools, and monitor it (a minimal example follows this list).
7. Build some links to some of those landing pages and get at least some social signals to a few of them (depending on how many there are). Build an index-type page listing those topic pages, or some of them (a user-friendly one or ones), and use it as a link-building target to send the 'love' down to the topic pages.
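For step 6, a separate sitemap containing only the topic pages could look something like the file below (the URLs and dates are placeholders, not real pages from the site). Submitting it as its own sitemap in Webmaster Tools lets you monitor indexation of the topic pages separately from the rest of the site.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per topic page; the URLs below are illustrative only -->
  <url>
    <loc>http://www.example.com/topics/example-topic</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/topics/another-topic</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```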
Hope it helps - just some ideas.
-
I do think that any site should remove pages that are not valuable to users.
I would look for the articles that have external links pointed at them and 301 those to something relevant. The rest, you could simply remove and let them return a 404 status. Just make sure all internal links pointing at them are gone. You don't want to lead people to a 404 page.
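If the site runs on Apache, those 301s could be handled with simple redirect rules; the paths below are hypothetical and only illustrate the approach:

```apache
# .htaccess - 301 the old articles that have external links pointing at them
# to the most relevant live page (hypothetical paths).
Redirect 301 /archive/042010/old-article-with-links /some-relevant-page
Redirect 301 /archive/052010/another-linked-article /some-relevant-page

# Old articles that are simply deleted and not redirected will return
# a 404 once the pages are removed - no rule is needed for those.
```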
You could consider disallowing /archive/ in your robots.txt file if you think the pages have some value to users but not to the engines, or putting a noindex tag on each page in that section.
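The robots.txt rule for that would simply be the following (note this blocks crawling of the folder; if the goal is specifically to get the pages dropped from the index, the noindex tag is the more reliable option, since URLs blocked in robots.txt can still remain indexed):

```
# robots.txt - keep crawlers out of the old news archive
User-agent: *
Disallow: /archive/
```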
If you want to keep the articles on the site, available to both Google and users, you have to make sure they meet some of these basic criteria:
- Mostly unique content
- Moderate length
- A good content-to-ad ratio
- Content as the main focus of the page (top/center)