Should I delete older posts on my site that are lower quality?
-
Hey guys! Thanks in advance for thinking through this with me. You're appreciated!
I have 350 pieces of Cornerstone Content that have been a large focus of mine over the last couple of years. They're incredibly important to my business. That said, a less experienced me did what I thought was best by hiring a freelance writer to create extra content to interlink them and add relevancy to the overall site.
Looking back through everything, I'm starting to realize that this extra content, which now makes up a third of my site, is at about 65%-70% quality AND only gets a total of about 250 visitors per month combined -- for all 384 articles. Rather than spending the next 9 months and investing in a higher-quality content creator to revamp them, I'm seeing removing them as the next best option.
From a pro's perspective, do you guys think my best option is removing these 384 lower-quality articles and focusing my efforts on better UX, a faster site, and continually upgrading the 350 pieces of Cornerstone Content?
I'm honestly at a point where I am ready to cut my losses, admit my mistakes, and swear to publish nothing but gold moving forward. I'd love to hear how you would approach this situation!
Thanks
-
Hi Chris, thanks so much for the answer and thoughts on what you would do!
I totally hear what you're saying about the keyword stuffing. As I look back over it, it seems like it would make a great drinking game. Every time you read "Wyoming" you have to take a drink! (Would be a VERY short game haha)
Awesome. Based on your feedback, I'm going to go back through and make sure each article is:
- Not keyword stuffed.
- Interlinked effectively and organically.
- Free of any crazy, confusing wording.
Thanks again Chris. I sincerely appreciate you taking the time to look this over and give your honest opinion. You rock!
-
Wow, so sorry about the slow reply here - things have been crazy the last couple of weeks!
Looking at a few of your blog posts, I see what you mean. They're not too bad, but they're probably a bit too keyword-stuffed to keep as they are.
Having the keyword amongst the content isn't a problem (obviously!) but when it starts to feel unnatural, that's when you start turning users away. As an example, I had a look at this post and found the word Wyoming used 17 times in a fairly short post.
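If you want to sanity-check those counts across the rest of your posts, a rough sketch like this would do it - assuming a Python environment with requests and beautifulsoup4 installed; the URL and keyword are just placeholders to swap for your own:

```python
import re
import requests
from bs4 import BeautifulSoup

def keyword_count(url: str, keyword: str) -> int:
    # Fetch the page and strip script/style so we only count words a reader actually sees.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    text = soup.get_text(separator=" ")
    # Whole-word, case-insensitive matches of the keyword in the visible text.
    return len(re.findall(rf"\b{re.escape(keyword)}\b", text, flags=re.IGNORECASE))

if __name__ == "__main__":
    # Placeholder URL and keyword - point this at one of your own posts.
    print(keyword_count("https://example.com/starting-a-business/", "Wyoming"))
```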
Paragraphs like this one really highlight the awkwardness:
"From the moment you validate a business idea, to processing your business licensing requirements, incorporating in Wyoming, to finding the right financing, it takes up time, money, and effort."
I also noticed in that post that the first link points to the page you're already on!
Internal linking is important and for the most part appears to have been implemented quite well. If it were my website I'd leave the posts up but systematically work my way back through them to remove some of the keyword stuffing and fix up any weird linking so they read better.
As much as cutting them all and starting again would be technically correct, in the real world we need to make compromises like this to maintain existing rankings and income.
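For working back through them systematically, here's a minimal sketch that flags links pointing back to the page they sit on - same assumptions as above (Python with requests and beautifulsoup4), and the post URLs are placeholders to replace with your own list:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urldefrag

# Placeholder list - in practice, point this at the article URLs you want to audit.
POST_URLS = [
    "https://example.com/post-one/",
    "https://example.com/post-two/",
]

def self_referencing_links(url: str) -> list:
    """Return the anchor text of any link on the page that points back to the page itself."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    flagged = []
    for a in soup.select("a[href]"):
        # Resolve relative hrefs and ignore #fragment-only differences.
        target, _ = urldefrag(urljoin(url, a["href"]))
        if target.rstrip("/") == url.rstrip("/"):
            flagged.append(a.get_text(strip=True) or a["href"])
    return flagged

for post in POST_URLS:
    links = self_referencing_links(post)
    if links:
        print(post, "->", links)
```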
-
Thanks for the input Chris, I appreciate you taking the time to respond!
You hit the nail on the head with them being 'just ok'. No spammy keywords or crazy redirects. I would say the readability isn't great, and you can actually see the entire list here.
Engagement is horrible. The pages are indexed by Google but get almost no traffic, and when they do, time on site is less than 30 seconds.
As a note: if you check out the internal linking inside the articles on that list, it's actually that which holds me back from removing the pages. I feel like the internal linking strategy is pretty decent and it may be cool to keep them. I'm just not sure it's worth keeping them solely for that reason.
-
This is a tough one, and a bit of a gamble either way, I suppose. If the content was absolute rubbish (horrible spelling and grammar, say, or keyword-spammed) then the suggestion would be obvious - delete them and move on.
Given that it sounds like they're "ok" but just not up to your current standards, the decision isn't quite so simple. Having them on your site isn't going to make it any slower unless they're adding redirects or something else on top, so the real issue is whether their low quality is hurting you, and that's tough to say without seeing them.
Very generally speaking, if they're free of errors, don't spam keywords, and don't touch dodgy subjects like online casinos or pharmaceuticals, then you're probably better off leaving them up, since they'll be passing some relevance signals and they are bringing you some traffic.
The one other thing I'd suggest checking is user engagement on those pages. Google is looking at this too, and an average session duration of 4 seconds on a 2,000-word post is a pretty clear red flag that whatever that page is about isn't worthy of being in their search results.
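If you want to surface those pages in bulk, one rough approach - assuming you can export landing-page data from your analytics tool to a CSV; the filename and column names below are made up purely for illustration - is a quick pandas filter:

```python
import pandas as pd

# Placeholder filename and column names - adjust to match whatever your analytics export uses.
df = pd.read_csv("landing_pages.csv")

# Pages that get at least a little traffic but hold visitors for only a few seconds.
thin = df[(df["sessions"] >= 10) & (df["avg_session_duration"] < 10)]

print(thin.sort_values("sessions", ascending=False).to_string(index=False))
```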