Impact of simplifying a website by removing 80% of its content
-
We're thinking of simplifying our website, which has grown very large, by removing all the content that hardly ever gets visited.
The plan is to remove this content and make changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay as long as we redirect the old pages and confirm that the pages we remove aren't getting any traffic. From my research online, it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability.
Could I get people's thoughts on this please? Are there any risks we should look out for, or any alternatives to this approach? At the moment I'm struggling to balance the needs of SEO with making the website more effective.
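For the "pages that aren't getting any traffic" check, here is a minimal sketch in Python. It assumes a CSV export from your analytics tool with `url` and `pageviews` columns; the column names and the threshold are illustrative assumptions, not a standard format.

```python
import csv
import io

def removal_candidates(analytics_csv, max_views=10):
    """Flag URLs whose pageviews over the export period fall at or
    below a threshold; these are candidates for pruning or merging."""
    reader = csv.DictReader(io.StringIO(analytics_csv))
    return sorted(row["url"] for row in reader
                  if int(row["pageviews"]) <= max_views)

# Tiny made-up export for illustration:
data = """url,pageviews
/popular-guide,5400
/old-press-release,2
/team-photos-2009,0
"""
print(removal_candidates(data))
# ['/old-press-release', '/team-photos-2009']
```

Running this over a 12-month export (rather than a short window) helps avoid flagging seasonal pages that only get traffic part of the year.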
-
I have to agree with you on making this move. Content that doesn't contribute to the quality of your site and receives minimal traffic should be removed. Besides making sure the redirects are set up properly, evaluate whether any of this old content would make good material for future writing. It would be a waste to delete it without a second thought: some snippets of the old content can still prove useful and be spun into new articles once you elaborate on them.
-
Great answers guys - thanks. It's good to know that my gut feeling was close to the mark!
-
Quality over quantity is definitely the order of the day, but before you drop some content completely, take a look at it and see if it contains useful information that could be consolidated into the content you are actually retaining. Overall, though, a good content audit can be a good thing even if it means dropping some pages. Here's a useful article on content audits which is well worth a look.
-
Sounds like a good idea to me. Make sure you have all the redirects in place so that visitors to the old content are sent to the new content, and keep monitoring the rest of your site's SEO traffic to make sure you don't fall into a hidden trap.
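One such hidden trap is redirect chains: old URLs that 301 to pages which are themselves redirected. A minimal sanity check on a redirect map, assuming the map is kept as a simple dict of old paths to new paths (the paths shown are hypothetical):

```python
def check_redirects(redirect_map):
    """Return a list of problems in an old-URL -> new-URL map:
    self-redirects, and targets that are themselves redirected
    (chains). Every old URL should resolve in a single hop."""
    problems = []
    for old, new in redirect_map.items():
        if old == new:
            problems.append(f"self-redirect: {old}")
        elif new in redirect_map:
            problems.append(f"chain: {old} -> {new} -> {redirect_map[new]}")
    return problems

redirects = {
    "/old-services": "/services",
    "/legacy-about": "/old-services",  # chain: should point at /services
}
print(check_redirects(redirects))
# ['chain: /legacy-about -> /old-services -> /services']
```

Flattening chains so each removed page redirects directly to its final destination keeps link equity flowing in one hop.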
-
I think this pruning process makes sense. Although it will potentially reduce the number of keywords you rank for, it will streamline navigation toward the content that is actually getting traffic, giving a better flow and potentially a lower bounce rate. Staging these cuts and monitoring the changes seems like a good way to manage your risk.
Related Questions
-
Beta Site Removal best practices
Hi everyone.
Intermediate & Advanced SEO | | bgvsiteadmin
We are doing a CMS migration and site redesign with some structural changes. Our temporary beta site (one of the staging environments, and the only one not behind a firewall) started appearing in search. The site got indexed before we added robots.txt due to a dev error (at that time all pages were index,follow, since the beta is a final stage that mirrors the live site). As a remedy, we implemented robots.txt for the beta version as:
User-Agent: *
Disallow: /
We also removed the beta from search for 90 days and changed all pages to noindex/nofollow. Those blockers will be changed once the beta code gets pushed into production. However, we already have all links redirected (301) from the old site to the new one; this will take effect once the migration starts (we will go live with the completely redesigned site that is now in beta in a few days). After that, the beta will be deleted completely and become 404 or 410. So the question is: should we delete the beta site and simply serve 404/410 without any redirects (the site only existed for a few days)? What is the best thing to do? We don't want to hurt our SEO equity. Please let me know if you need more clarification. Thank you!
-
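As a quick sanity check on blanket disallow rules like the ones quoted in the question above, Python's standard-library `urllib.robotparser` can confirm that no path on the host is fetchable by any crawler (the beta hostname here is a placeholder, not the real staging URL):

```python
from urllib.robotparser import RobotFileParser

# The blanket disallow quoted in the question:
rules = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# No path on the beta host should be fetchable by any crawler:
print(rp.can_fetch("Googlebot", "https://beta.example.com/any-page"))  # False
print(rp.can_fetch("*", "https://beta.example.com/"))                  # False
```

Note that robots.txt only blocks crawling, not indexing of already-known URLs, which is why the noindex step described in the question matters as well.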
Penalty for duplicate content on the same website?
Is it possible to get a penalty for duplicate content on the same website? I have an old custom-built site with a large number of filters that are pre-generated for speed. Basically the only difference between the pages is the meta title and H1 tag, with a few text differences here and there. Obviously I could nofollow all the filter links, but it would take an enormous amount of work. The site is performing well in search. I'm trying to decide whether there is a risk of a penalty; if not, I'm loath to do anything in case it causes other issues.
Intermediate & Advanced SEO | | seoman10
-
Print pages returning 404s
Print pages on one of our sister sites are returning 404s in our crawl but are visible when clicked. Here is one example: https://www.theelementsofliving.com/recipe/citrus-energy-boosting-smoothie/print Any ideas as to why these are returning errors? Thank you!
Intermediate & Advanced SEO | | FirstService0 -
Any issue? Redirecting 100s of domains into one website's internal pages
Hi all. Imagine I was the owner of many domains, say 100 demographically rich keyword domains, and my plan was to redirect these into one website, each into a different relevant subfolder, e.g.:
www.dewsburytilers.com > www.brandname.com/dewsbury/tilers.html
www.hammersmith-tilers.com > www.brandname.com/hammersmith/tilers.html
www.tilers-horsforth.com > www.brandname.com/horsforth/tilers.html
...and another hundred or so 301 redirects. The backlinks to these domains are slim but relevant (the majority of the domains have no backlinks at all). Can anyone see a problem with this practice? If so, what would your recommendations be?
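The domain-to-subfolder mapping described above could be kept as an explicit lookup table for generating or auditing the 301 rules. A hedged sketch: `redirect_target` and the exact target paths are illustrative assumptions, not an existing tool.

```python
from urllib.parse import urlsplit

# Explicit host -> target-path map; with ~100 domains a simple dict
# (mirroring the server-level 301 rules) is easy to audit.
DOMAIN_MAP = {
    "www.dewsburytilers.com": "/dewsbury/tilers.html",
    "www.hammersmith-tilers.com": "/hammersmith/tilers.html",
    "www.tilers-horsforth.com": "/horsforth/tilers.html",
}

def redirect_target(url, brand="https://www.brandname.com"):
    """Return the 301 target on the brand site, or None if unmapped."""
    host = urlsplit(url).netloc.lower()
    path = DOMAIN_MAP.get(host)
    return brand + path if path else None

print(redirect_target("http://www.dewsburytilers.com/anything"))
# https://www.brandname.com/dewsbury/tilers.html
```

Keeping the map in one place makes it trivial to spot domains that were registered but never mapped before the redirects go live.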
Intermediate & Advanced SEO | | Fergclaw0 -
Scraped Content on Foreign Language Site. Big deal or not?
Hi All, I've been lurking and learning from this awesome Q&A forum, and I finally have a question. I am working on SEO for an entertainment site that gets scraped from time to time. Often the scraped content is then translated into a foreign language and posted along with whatever pictures were in the article. Sometimes a backlink to our site is given, sometimes not. Is scraped content that has been translated into a foreign language still considered duplicate content? Should I just let it go, provided a backlink is given? Thanks!
Intermediate & Advanced SEO | | MKGraphiques
Jamie
-
Hit by Penguin: can I move the content from the old site to a new domain and start again with the same content, which is high quality?
I need some advice please. My website got the unnatural links detected message and was hit hard by Penguin. Can I move the content from the current domain to a new domain and start again, or does the content need to be redone as well? I will obviously turn off the old domain once the content is moved. The other option is to try to identify the bad links and change my anchor profile, which is a hit-and-miss task in my opinion. Would it not be easier just to identify the good links pointing to the old domain and get those changed to point to the new domain with better anchors? Thanks, Warren
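For the "identify the bad links" option, a common starting point is the anchor-text distribution of the backlink profile. A sketch assuming a backlink export as (anchor text, source URL) pairs; the helper and the sample data are hypothetical, not output from any real tool:

```python
from collections import Counter

def anchor_profile(backlinks):
    """Given (anchor_text, source_url) pairs from a backlink export,
    return each anchor's share of the total, largest first. A single
    exact-match commercial anchor dominating the profile is a common
    sign of the unnatural-link patterns Penguin targets."""
    counts = Counter(anchor.strip().lower() for anchor, _ in backlinks)
    total = sum(counts.values())
    return [(a, round(n / total, 2)) for a, n in counts.most_common()]

links = [
    ("cheap blue widgets", "site-a.example"),
    ("cheap blue widgets", "site-b.example"),
    ("cheap blue widgets", "site-c.example"),
    ("Acme Widgets", "news.example"),
]
print(anchor_profile(links))
# [('cheap blue widgets', 0.75), ('acme widgets', 0.25)]
```

A profile this skewed toward one exact-match phrase would be the kind of pattern worth cleaning up or disavowing before, or instead of, a domain move.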
Intermediate & Advanced SEO | | warren007
-
Oops! My own website links the most to me, and I can't figure out why
Today I checked Google Webmaster Tools to answer the following question: who links the most to my website? I assumed Google Webmaster Tools would give me a list of external websites where my text links have been placed, but I can't understand why my own website links the most to me (4,652 links??). I checked my other websites that are integrated in Google Webmaster Tools. They were developed on the same platform with the same internal linking structure, but I am not able to find a similar issue there. That's why I am quite confused about Vista Store. How can I solve it? Does it really matter? "Open Site Explorer is my favorite and I always use it to get this done, but Google Webmaster Tools is also active and free, so why shouldn't I jump in... 🙂"
Intermediate & Advanced SEO | | CommercePundit
-
Need to duplicate the index for Google in a way that's correct
Usually duplicate content is quick to fix, but I find myself in a bit of a predicament. I run a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all ads, working as a portal. The smaller niched sites carry some of the same info as the "master" sites, since it is relevant for those sites, and the "master" sites have naturally gained the index for the majority of these ads. So the main issue is how to maintain the ads on the master sites and still get the niched sites' content indexed in a way that doesn't break Google's guidelines. I can of course fix this in various ways, ranging from iframes (no indexing, though) to bullet lists and small adjustments to the headers and titles of the content on the niched sites, but it feels like I'm cheating if I go down that path. So the question is: has someone else stumbled upon a similar problem? If so, how did you fix it?
Intermediate & Advanced SEO | | Gustav-Northclick