Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Should I delete 100s of weak posts from my website?
-
I run this website: http://knowledgeweighsnothing.com/
It was initially built to get traffic from Facebook. The vast majority of the 1300+ posts are shorter, curation-style posts. Basically I would find excellent sources of information, write a short post highlighting them, and then link to the original source (and then post to FB and, hey presto, 1000s of visitors coming through my website). Traffic from FB was so amazing at the time that, really stupidly, these posts were written with no regard for search engine rankings.
When Facebook reach etc dropped right off, I started writing full original content posts to gain more traffic from search engines. I am starting to get more and more traffic now from Google etc, but there's still lots to improve.
I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality, backlinks and PA. This will probably run into the 100s of posts. Is it detrimental to delete so many weak posts from a website?
Any and all advice on how to proceed would be greatly received.
-
This is a very valid question, in my opinion, and one that I have thought about a lot. I have even done it before, on a UGC section of a site where there were about 30k empty questions, many of which were a reputation nightmare for the site. We used the parameters of:
- Over a year old
- Has not received an organic visit in the past year
We 410d all of them as they did not have any inbound links and we just wanted them out of the index. I believe they were later 301d, and that section of the site has now been killed off.
Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That maintained, and over time that section of the site started getting more visits from organic as well.
I saw it as a win and went through with it because:
- They were low quality
- They already didn't receive traffic
- By removing them, we'd get more pages that we wanted crawled, crawled.
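For what it's worth, the two filters we used are easy to automate once you've exported your URLs with publish dates and analytics data. A minimal sketch, assuming you've already joined the data into one list (the field names and the one-year cutoff here are illustrative, not from our actual setup):

```python
from datetime import date, timedelta

def prune_candidates(pages, today=None):
    """Return URLs matching both criteria: over a year old
    AND no organic visit in the past year."""
    today = today or date.today()
    cutoff = today - timedelta(days=365)
    return [
        p["url"]
        for p in pages
        if p["published"] < cutoff
        and (p["last_organic_visit"] is None or p["last_organic_visit"] < cutoff)
    ]

pages = [
    {"url": "/old-empty-question", "published": date(2012, 3, 1), "last_organic_visit": None},
    {"url": "/old-but-visited", "published": date(2012, 3, 1), "last_organic_visit": date(2015, 6, 1)},
    {"url": "/recent-post", "published": date(2015, 5, 1), "last_organic_visit": None},
]

print(prune_candidates(pages, today=date(2015, 7, 1)))  # ['/old-empty-question']
```

The output of a script like this is just your 410 candidate list; you'd still want a human pass over it (and a backlink check) before pulling the trigger.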
I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving forward in the direction you are, but if you have the time or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them. If they are getting traffic, maybe do a test of going back and making them high quality to see if they drive more traffic.
Good luck!
-
Too many people are going to gloss over the "In general" part of what Gary is saying.
Things not addressed in that thread:
- If a URL isn't performing for you but has a few good backlinks, you're probably still better off 301ing the page to better content, to lend it additional strength.
- The value of consistency across the site; wildly uneven content can undermine your brand.
- Consolidating information to provide a single authoritative page rather than multiple thin and weak pages.
- The pointlessness of keeping non-performing pages when you don't have the resources to maintain them.
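Putting those points together, the per-URL triage boils down to a simple decision: is it earning traffic, does it have links worth keeping, or neither? A rough sketch (the thresholds and labels are my own illustration, not an official rule):

```python
def triage(url, has_good_backlinks, gets_organic_traffic):
    """Illustrative triage for a thin page: keep it, 301 it, or 410 it."""
    if gets_organic_traffic:
        # It's earning visits; test improving it before touching the URL.
        return "keep and improve"
    if has_good_backlinks:
        # Preserve the link equity by pointing it at stronger, related content.
        return "301 to better content"
    # No links, no traffic: nothing worth saving, get it out of the index.
    return "410"

print(triage("/thin-post", has_good_backlinks=True, gets_organic_traffic=False))
# 301 to better content
```

The key design point is the order of the checks: traffic trumps links (a performing page shouldn't be redirected away), and a 410 is only for pages with neither.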
-
Haha, I read this question earlier, saw the post come across Feedly and knew what I needed to do with it. Just a matter of minutes. You're right though - I probably would've said remove earlier as well. It's a toss-up, but usually when they clarify, I try to follow. (Sometimes they talk nonsense of course, but you just have to filter that out.)
-
Just pipped me to it
-
Hi Xpers.
I was reading a very timely article today, on this very issue, from Barry Schwartz over at Search Engine Roundtable. He has been following a conversation with Gary Illyes at Google, who apparently does not recommend removing content from a site to help you recover from a Panda issue, but rather recommends increasing the number of higher-quality pages.
If you are continuing to get more traffic by adding your new larger higher quality articles, I would simply continue in the same vein. There is no reason why you cannot still continue to share your content on social platforms too.
In the past I may have suggested removing some thin/outdated content and repointing it to a newer, more relevant piece, but in light of this article I may now start to think a tad differently. Hopefully some of the other Mozzers might have more thoughts on Barry's post too.
Here is the article fresh off the press today - https://www.seroundtable.com/google-panda-fix-content-21006.html
-
Google's Gary Illyes basically just answered this on Twitter: https://www.seroundtable.com/google-panda-fix-content-21006.html
"We don't recommend removing content in general for Panda, rather add more highQ stuff"
So rather than spend a lot of time on old work, move forward and improve. If there's terrible stuff, I'd of course remove it. But if it's just not super-high quality, I would do as Gary says in this instance and work on new things.
Truthfully, getting Google to recrawl content that's a year (or two, or five) old can be tough. If they don't recrawl it, you don't get any benefit until they do, if there is a benefit at all. Moving forward seems to make more sense to me.