What to do with old, outdated and light content on a blog?
-
So there's a blog I recently took over that has built up great content over the past 2 years. However, of its 800+ published posts, I'd say 250-300 are light on content: nothing more than a short paragraph with no real specificity about the topic, more like general updates.
Now, what would best practice be: optimizing all of those posts, or deleting them and 301'ing the URLs to another post or the root?
-
Nope - minimal to no traffic
-
#1: Users. Is anyone reading them? If not, there's no point in them existing.
-
That's what I was leaning towards. A lot of this content links out to support pages on other sites, so I was a bit wary about losing link juice to those pages. However, I think the content and pages aren't authoritative enough for the dropped links to show any real loss.
-
Both of those ideas sound exactly like what I would do. If the posts are credible and valuable to users, keep them and add to them; maybe share them on social media once you add more relevant content. If they really aren't anything useful, I would delete them and 301 to something similar.
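As a rough illustration of the advice above, the keep-vs-redirect triage could be sketched in code. This is a hypothetical sketch: the thresholds, field names, and example posts are all assumptions, not anything from this thread.

```python
MIN_WORDS = 150          # assumed cutoff below which a post counts as "light"
MIN_MONTHLY_VISITS = 10  # assumed cutoff below which a post counts as unread

def triage(post):
    """Return an action for a post dict with 'words' and 'monthly_visits' keys."""
    if post["monthly_visits"] >= MIN_MONTHLY_VISITS:
        return "keep-and-expand"   # users still read it: improve it instead
    if post["words"] < MIN_WORDS:
        return "delete-and-301"    # thin and unread: redirect the URL
    return "review-manually"       # substantial but unread: judgment call

# Made-up example posts to show the mapping from post stats to action.
posts = [
    {"url": "/update-2013-03", "words": 80, "monthly_visits": 0},
    {"url": "/guide-to-x", "words": 1200, "monthly_visits": 45},
]
actions = {p["url"]: triage(p) for p in posts}
```

The exact numbers would come from analytics; the point is only that the decision rests on readership first and depth second, matching the answers above.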
Related Questions
-
Condensing content for a website redesign
We're working on a redesign and are wondering if we should condense some of the content (as recommended by an agency), and if so, how that will affect our organic efforts. Currently a few topics have individual pages for each section, such as (1) Overview, (2) Symptoms, and (3) Treatment. For reference, the site has a similar structure to http://www.webmd.com/heart-disease/guide/heart-disease-overview-fact. Our agency has sent us mock-ups which show these topics being condensed into one page, using a script/AJAX to display only the content that is clicked on. If we chose this option, we would have to implement redirects, because only one page would exist instead of three. Can anyone provide insight into whether we should keep the topic structure as is, or take the agency's advice and merge all the topic content? Note: the agency is pushing for the merging option because they say it helps with page load time. Thank you in advance for any insight!
Algorithm Updates | ATShock
-
Creating Content for Semantic search?
Need some good examples of semantic-search-friendly content. I have been doing a lot of reading on the subject, but have seen no really good examples of 'this is one way to structure it'. Lots of reading on the topic from an overall satellite perspective, but no clear-cut examples I could find of 'this is the way the pieces should be put together in a piece of content, and these are the most effective ways to accomplish it'.
What I know:
- It needs to answer a question that goes beyond the 'keyword being used'
- It needs to, or should, be connected to authorship for someone in that topic's industry
- It should incorporate various social media sources as references on the topic
- It should link out to authoritative resources on the topic
- It should use some structured data markup
Here is a great resource on the important semantic search pieces: http://www.seoskeptic.com/semantic-seo-making-shift-strings-things/ , but I want to move past the research into creating the content that will make the connections needed to get the content to rank. I know Storify is an excellent medium to accomplish this off-page, but it only gives nofollow attribution to the topic creator and the links therein. I am not a coder but a marketer, and creating the backend markup will really take me out of my wheelhouse. I don't want to spend all of my time flailing with code when I should be creating compelling semantic content. Any helpful examples or resources welcome. Thanks in advance.
Algorithm Updates | photoseo1
-
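One concrete piece of the structured-data markup mentioned in the question above can be sketched with schema.org's JSON-LD format. This is only an illustration: every value below is a placeholder, and which properties (if any) help ranking is not established in this thread.

```python
import json

# Minimal schema.org Article markup as JSON-LD. Placeholder values throughout.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",                       # placeholder
    "author": {"@type": "Person", "name": "Jane Doe"},    # ties content to an author
    "about": "example topic",                             # the entity the piece covers
    "sameAs": ["https://twitter.com/example"],            # social profiles as references
}

# The resulting string would be embedded in the page inside a
# <script type="application/ld+json"> tag.
markup = json.dumps(article, indent=2)
```

This sidesteps hand-writing backend markup: the JSON-LD block is plain text pasted into the page template, separate from the visible content.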
Sitemap Question - Should I exclude or make a separate sitemap for old URLs?
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, and I decided not to drop redirects on some of the irrelevant pages. People still hit the pages, but bounce. I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content. If I dropped these pages into a sitemap and set the priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages won't show up above my new content? Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated. Thx 😄
Algorithm Updates | Southbay_Carnivorous_Plants
-
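The split-sitemap idea in the question above might look like the sketch below. The URLs are made up, and note that per the sitemaps.org protocol, <priority> is only a hint to search engines, so the ranking effect of setting it to zero is uncertain.

```python
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, priority):
    """Build a minimal sitemap XML string for the given URLs at one priority."""
    root = ET.Element("urlset", xmlns=NS)
    for loc in urls:
        url = ET.SubElement(root, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = f"{priority:.1f}"
    return ET.tostring(root, encoding="unicode")

# Hypothetical legacy URL; the real site would list all ~400 old pages here,
# in a separate file from the sitemap carrying the new content.
legacy_xml = build_sitemap(["https://example.com/old-page-1995.html"], 0.0)
```

Keeping the legacy URLs in their own file also makes it easy to watch their indexing stats separately in Webmaster Tools.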
Does adding lots of new content on a site at one time actually hurt you?
When speaking with a client today, he commented that he didn't want all of the new content we'd been working on to be added to the site at once, for fear that he would get penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim. Is there any truth to it?
Algorithm Updates | JordanRussell
-
Effectiveness of a Guest Blog if it is not getting you any traffic
Hi, I have been paying adequate attention to guest blogging for the past 4-5 months. I have posted good-quality articles on some decently ranked websites. Of late I managed to get a guest post on a website with a PageRank of 4, but I didn't get any referral traffic from it. My concern is: will the guest blog also contribute to my organic search? Will Google give me any credit if my links on the guest blog are not generating any traffic? I was under the impression that guest blogs will work in your favour and help improve your search traffic only if you have visitors from that particular page. Any comments on this? Eric
Algorithm Updates | EricMoore
-
Duplicate content advice
I'm looking for a little advice. My website has always done rather well on the search engines, although it has never ranked well for my top keywords on my main site, as they are very competitive. It does rank for lots of obscure keywords that contain my top keywords, or my top keywords + city/area. We have over 1,600 pages on the main site, most with unique content, which is what I attribute the obscure-keyword rankings to. Content also changes daily on several main pages. Recently we made some updates to the usability of the site which our users are liking (page views are up by 100%, time on site is up, bounce rate is down by 50%!).
However, it looks like Google did not like the updates and has started to send us fewer visitors (down by around 25% across several sites; the sites I did not update, kind of like my control, have been unaffected!). We went through the Panda and Penguin updates unaffected (visitors actually went up!). So I have joined SEOmoz (and am loving it, just like McDonald's). I am now going through all my sites and making changes to hopefully improve things above and beyond what we used to do. However, out of the 1,600 pages, 386 are being flagged as duplicate content (within my own site). Most of this comes down to the fact that we are a directory-type site split into all major cities in the UK.
Cities that don't have listings, or cities that carry the same/similar listings (as our users provide services to several cities), are being flagged as duplicate content.
Some of the duplicate content is due to dynamic pages that I can correct (i.e. out.php?***** - I will noindex these pages if that's the best way?). What I would like to know is: are these duplicate content flags going to cause me problems, keeping in mind that the Penguin update did not seem to affect us? If so, what advice would people here offer?
I cannot redirect the pages, as they are for individual cities (and are also dynamic = only one physical page using URL rewriting). I can however remove links to cities with no listings, although Google already has these pages listed, so I doubt removing the links from my pages and sitemap will affect this. I am not sure if I can post my URLs here, as the sites do have adult content, although it is not porn (we are an escort guide/directory, with some partial nudity). I would love to hear opinions.
Algorithm Updates | jonny512379
-
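The noindex idea floated in the question above could be sketched as a simple rule applied when the page template renders. The URL patterns and the listing counts here are assumptions for illustration only, not the asker's actual setup.

```python
from urllib.parse import urlparse

def robots_meta(url, listing_count):
    """Pick a robots meta tag: noindex dynamic out.php URLs and empty city pages,
    keeping 'follow' so any links on them still pass value."""
    path = urlparse(url).path
    if path.endswith("out.php") or listing_count == 0:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

Because the city pages are one physical script behind URL rewriting, a rule like this is easier to apply than per-page edits: the template decides at render time based on the requested city's listing count.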
Does a large number of thin-content pages indexed affect overall site performance?
Hello Community, Question on negative impact of many virtually identical calendar pages indexed. We have a site that is a b2b software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed pages number shown in Webmaster tools to about 54,000. Since then, we "no-followed" the links on the calendar pages that allow you to view future months, and added "no-index" meta tags to all future month pages (beyond 6 months out). Our number of pages indexed value seems to be dropping, and is now down to 26,000. When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear. Very few calendar pages show up in this report. So, the question that has been raised is: Does a large number of pages in a search index with very thin content (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites that these pages would cause the performance of this site to drop as well. Thanks for your feedback. Chris
Algorithm Updates | cogbox
-
What is considered duplicate content in an ecommerce website that offers the same product for retail and wholesale purchasing?
I have an ecommerce website that offers retail and wholesale products which are identical, with the exception of pricing, of course. My concern is duplicate content. If the same product is offered under both the retail and wholesale categories, and described identically with the exception of price, metadata, and a few words, is that considered duplicate content, and would both pages be disregarded by the robots? Is it best to avoid using the same description for that one product under the two separate categories? Thanks for all your help!
Algorithm Updates | flaca
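One common remedy for near-duplicate product pages like those described above (a general technique, not something confirmed in this thread) is a rel=canonical tag pointing the wholesale variant at the retail page, so only one version competes in search. The URL scheme below is an assumption for illustration.

```python
def canonical_tag(product_slug):
    """Emit the canonical link tag for a product. Hypothetical URL scheme:
    /retail/<slug> is treated as the canonical version, and the wholesale
    twin at /wholesale/<slug> would emit this same tag in its <head>."""
    return f'<link rel="canonical" href="https://example.com/retail/{product_slug}">'

tag = canonical_tag("blue-widget")
```

With the canonical in place, both pages can keep their identical descriptions for shoppers while search engines consolidate signals onto the retail URL.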