Best way to remove worthless/thin content?
-
I have a WordPress site with about 3,000 pages; roughly 1,000 of them are no-value/duplicate content and drive no traffic. They are blog posts, each with a single image, with permalinks like example.com/post1, example.com/post2, etc.
I've started by deleting pages and 301 redirecting them to relevant pages that actually have content.
-
Is deleting and 301 redirecting the best route?
-
Is 1,000 too many 301 redirects?
-
Should I just delete the pages that aren't really relevant to anything else?
-
Anything else I should know about deleting all of these pages?
Any help would be great!
-
-
As Alan said, if the pages don't have any inbound links then you're not going to lose any link equity by removing them. If the pages in question are not getting direct traffic or search traffic and don't have any links then I'd just delete them.
Otherwise a 301 redirect to the most relevant content is the best way to go.
Do a check for any internal links to these pages too. Your own links should be easy to fix. Remove them or point them at the better relevant content.
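Finding those internal links can be scripted. A hedged sketch of the idea, using only the Python standard library and made-up URLs: collect every anchor href on a page and flag the ones that point at pages you've deleted.

```python
# Hypothetical sketch: scan a page's HTML for internal links that point at
# URLs you have deleted, so you can fix them at the source instead of
# leaning on 301 redirects. The URLs and deleted set are invented.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_stale_links(html, deleted_urls):
    """Return the links in `html` that point at deleted pages."""
    parser = LinkExtractor()
    parser.feed(html)
    return [link for link in parser.links if link in deleted_urls]


deleted = {"https://example.com/post1", "https://example.com/post2"}
page = (
    '<p><a href="https://example.com/post1">old post</a> '
    '<a href="https://example.com/about">about</a></p>'
)
print(find_stale_links(page, deleted))  # ['https://example.com/post1']
```

In practice you'd feed this the HTML of each page on the site (or just export your posts and grep), but the core check is the same membership test against the deleted-URL set.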
It might also be worth reviewing your 404 page. This can often be forgotten, but it can do a lot of the hard work guiding visitors to the content they're looking for.
-
It depends how you do the 301 redirects.
I have 25,000 301s, and they are not in .htaccess; with that many rules there, every request would have to be checked against the whole list, which would be slow.
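The reason volume matters is how the lookup is done. Rewrite rules in .htaccess are checked one by one on every request, so cost grows with the number of rules; a keyed lookup (a database table, Apache's RewriteMap, or an application-level map as sketched below) finds the target in one step regardless of list size. A minimal Python sketch of that idea, with invented paths:

```python
# Hypothetical sketch: resolve redirects via a single hash lookup instead
# of scanning thousands of sequential rewrite rules. Paths are made up.
REDIRECTS = {
    "/post1": "/guides/relevant-article",
    "/post2": "/guides/another-article",
}


def resolve(path):
    """Return (status, location) for a requested path."""
    target = REDIRECTS.get(path)  # one dict lookup, not a rule-by-rule scan
    if target is not None:
        return 301, target
    return 404, None


print(resolve("/post1"))  # (301, '/guides/relevant-article')
print(resolve("/gone"))   # (404, None)
```

A WordPress redirect plugin that stores its map in the database works on the same principle, which is why it scales better than a long .htaccess file.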
If nobody is linking to the ones you want to delete, and search engines send no traffic to them, just delete them and don't bother to redirect.
-
You may find that the Yoast SEO plugin can help you. There are options to noindex tag pages, attachment pages, etc.
-
I know that the number of 301 redirects can sometimes raise concerns about site speed. However, 1,000 is really not that many. Large e-commerce sites can have 10,000 301s, or even more. Depending on how the redirects are implemented, even numbers that large can be handled with very little impact on site speed or page load times.
I think the number in your case is nothing to worry about.
I do think that deleting and 301 redirecting is the best way to go for anything that's remotely relevant. For the pages that aren't, create a custom 404 page and let them 404. Eventually they will drop out of the index. If they don't, you could still file a URL removal request in Google Webmaster Tools.
Hope this helps and good luck!