Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Should I delete 100s of weak posts from my website?
-
I run this website: http://knowledgeweighsnothing.com/
It was initially built to get traffic from Facebook. The vast majority of the 1300+ posts are shorter, curation-style posts: basically I would find excellent sources of information, do a short post highlighting the information, link to the original source, and then post to FB (and hey presto, 1000s of visitors going through my website). Traffic from FB was so amazing at the time that, 'really stupidly', these posts were written with no regard for search engine rankings.
When Facebook reach etc dropped right off, I started writing full original content posts to gain more traffic from search engines. I am starting to get more and more traffic now from Google etc, but there's still lots to improve.
I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality, backlinks, and PA. This will probably run into the 100s of posts. Is it detrimental to delete so many weak posts from a website?
Any and all advice on how to proceed would be gratefully received.
-
This is a very valid question, in my opinion, and one that I have thought about a lot. I even did this once on a site, in a UGC section with about 30k empty questions, many of which were a reputation nightmare for the site. We used these parameters:
- Over a year old
- Has not received an organic visit in the past year
We 410'd all of them, as they had no inbound links and we just wanted them out of the index. I believe they were later 301'd, and that section of the site has since been killed off.
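For what it's worth, that selection step is easy to automate. Here's a minimal sketch, assuming you've exported page data (e.g. via `csv.DictReader`) into rows of dicts; the column names `published`, `last_organic_visit`, and `inbound_links` are made up for illustration, so adapt them to whatever your analytics and link tools actually export:

```python
from datetime import datetime, timedelta

def prune_candidates(rows, today=None):
    """Return URLs matching the criteria above: published over a year
    ago AND no organic visit in the past year.  Pages with inbound
    links are excluded, since those are better 301'd than 410'd.

    `rows` is an iterable of dicts with hypothetical columns:
    url, published, last_organic_visit ("" if never), inbound_links.
    Dates are "YYYY-MM-DD" strings.
    """
    today = today or datetime.now()
    cutoff = today - timedelta(days=365)
    candidates = []
    for row in rows:
        published = datetime.strptime(row["published"], "%Y-%m-%d")
        last_visit = row["last_organic_visit"]
        no_recent_organic = (not last_visit or
                             datetime.strptime(last_visit, "%Y-%m-%d") < cutoff)
        has_links = int(row["inbound_links"]) > 0
        if published < cutoff and no_recent_organic and not has_links:
            candidates.append(row["url"])
    return candidates
```

The point of scripting it is that the rules stay explicit and repeatable, rather than eyeballing 30k URLs one at a time.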
Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That lift held, and over time that section started getting more organic visits as well.
I saw it as a win and went through with it because:
- They were low quality
- They already didn't receive traffic
- By removing them, the pages we actually wanted crawled would get crawled.
I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving forward in the direction you are, but if you have the time or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them. If they are getting traffic, maybe do a test of going back and making them high quality to see if they drive more traffic.
Good luck!
-
Too many people are going to gloss over the "In general" part of what Gary is saying.
Things not addressed in that thread:
- If a URL isn't performing for you but has a few good backlinks, you're probably still better off 301ing the page to better content to lend it additional strength.
- The value of consistency across the site; wildly uneven content can undermine your brand.
- Consolidating information to provide a single authoritative page rather than multiple thin and weak pages.
- The pointlessness of keeping non-performing pages when you don't have the resources to maintain them.
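Those considerations can be sketched as a rough triage, something like the following (the field names are hypothetical and the order of checks is a judgment call, not a rule):

```python
def disposition(page):
    """Rough triage for a weak page, per the considerations above.

    `page` is a dict with hypothetical keys:
    inbound_links, organic_visits_12mo, has_better_equivalent.
    """
    if page["inbound_links"] > 0 and page["has_better_equivalent"]:
        return "301"          # pass the backlinks' strength to better content
    if page["organic_visits_12mo"] > 0:
        return "improve"      # it earns traffic; upgrade it rather than remove it
    if page["has_better_equivalent"]:
        return "consolidate"  # fold thin pages into one authoritative page
    return "410"              # no links, no traffic, nowhere to merge: remove
```

However you weight the checks, the key is to decide per page rather than deleting wholesale.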
-
Haha I read this question earlier, saw the post come across feedly and knew what I needed to do with it. Just a matter of minutes.
You're right though - I would've probably said remove earlier as well. It's a toss up but usually when they clarify, I try to follow. (Sometimes they talk nonsense of course, but you just have to filter that out.)
-
Just pipped me to it
-
Hi Xpers.
I was reading a very timely article today (on this very issue, in fact) from Barry Schwartz over at Search Engine Roundtable. He has been following comments from Gary Illyes at Google, who apparently does not recommend removing content from a site to help you recover from a Panda issue, but rather recommends increasing the number of higher-quality pages.
If you are continuing to get more traffic by adding your new larger higher quality articles, I would simply continue in the same vein. There is no reason why you cannot still continue to share your content on social platforms too.
In the past I may have suggested removing some thin/outdated content and repointing it to a newer, more relevant piece, but in light of this article I may now start to think a tad differently. Hopefully some of the other Mozzers will have thoughts on Barry's post too.
Here is the article fresh off the press today - https://www.seroundtable.com/google-panda-fix-content-21006.html
-
Google's Gary Illyes basically just answered this on Twitter: https://www.seroundtable.com/google-panda-fix-content-21006.html
"We don't recommend removing content in general for Panda, rather add more highQ stuff"
So rather than spend a lot of time on old work, move forward and improve. If there's terrible stuff, I'd of course remove it. But if it's just not super-high quality, I would do as Gary says in this instance and work on new things.
Truthfully, getting Google to recrawl content that's a year, or two, or five old can be tough. Until they recrawl it, you don't see any benefit from the changes (if there were a benefit at all). Moving forward seems to make more sense to me.