Impact of simplifying website and removing 80% of site's content
-
We're thinking of simplifying our website, which has grown very large, by removing all the content that hardly ever gets visited.
The plan is to remove this content and make the changes over time, in small chunks, so that we can monitor the impact on SEO. My gut feeling is that this is okay as long as we redirect the old pages and confirm that the pages we remove aren't getting any traffic. From my research online, it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability.
Could I get people's thoughts on this please? Are there any risks we should look out for, or any alternatives to this approach? At the moment I'm struggling to balance the needs of SEO with making the website more effective.
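The "confirm the pages aren't getting any traffic" step can be scripted against an analytics export rather than eyeballed. Here's a minimal sketch in Python, assuming a CSV export with `url` and `pageviews` columns (the column names, file format, and threshold are all assumptions; adjust them to match what your analytics tool actually exports):

```python
import csv
import io

def removal_candidates(analytics_csv, max_pageviews=10):
    """Return URLs whose pageviews fall at or below the threshold.

    analytics_csv is the text of a CSV export with 'url' and
    'pageviews' columns (a hypothetical format -- match it to your
    analytics tool's real export).
    """
    reader = csv.DictReader(io.StringIO(analytics_csv))
    return [row["url"] for row in reader
            if int(row["pageviews"]) <= max_pageviews]

# Made-up export: only the near-zero-traffic page gets flagged.
export = "url,pageviews\n/popular-guide,5400\n/old-press-release,3\n"
print(removal_candidates(export))  # ['/old-press-release']
```

Anything the script flags is a candidate for review, not an automatic delete; a page with few pageviews can still hold backlinks or serve a niche purpose worth keeping.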
-
I have to agree with you on making this move. Content that doesn't contribute to the quality of your site and receives minimal traffic should be removed. Besides ensuring the redirects are set up properly, you can evaluate whether any of this old content would make good material for future writing. It would be a waste to just delete it without a second thought. Some snippets of the old content can still prove useful and be spun into new articles once you elaborate on them.
-
Great answers guys - thanks. It's good to know that my gut feeling was close to the mark!
-
Quality over quantity is definitely the order of the day, but before you drop some content completely, take a look at it and see if there is some useful info contained in it which could be consolidated into some of the content that you are actually retaining. Overall though a good content audit can be a good thing even if it means dropping some pages. Here's a useful article regarding content audits which is well worth taking a look at.
-
Sounds like a good idea to me. Make sure you have all the redirects in place so that when people try to visit the old content, they're sent on to the new content. Also monitor the rest of your site's SEO traffic to make sure you don't fall into a hidden trap.
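Beyond checking that each old URL 301s somewhere, it's worth sanity-checking the redirect map itself for chains and loops before deploying: chained 301s dilute the signal and waste crawl budget, and a loop breaks the page outright. A rough sketch (the plain old-URL-to-new-URL dict format is an assumption; adapt it to however your redirects are stored):

```python
def audit_redirects(redirect_map):
    """Flag redirect chains (A -> B -> C) and loops (A -> ... -> A)
    in an old-URL -> new-URL map."""
    issues = {}
    for old, new in redirect_map.items():
        seen = {old}
        target = new
        hops = 1
        while target in redirect_map:  # the target itself redirects
            if target in seen:
                issues[old] = "loop"
                break
            seen.add(target)
            target = redirect_map[target]
            hops += 1
        else:
            if hops > 1:
                issues[old] = f"chain of {hops} hops"
    return issues

rmap = {"/old-a": "/old-b", "/old-b": "/new", "/old-c": "/old-c"}
print(audit_redirects(rmap))  # {'/old-a': 'chain of 2 hops', '/old-c': 'loop'}
```

Collapsing every chain so each old URL points directly at its final destination keeps the 301 signal as clean as possible.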
-
I think this pruning process makes sense. Although it will potentially reduce the keywords you rank for, it will streamline navigation toward the content that is actually getting traffic. That should provide a better flow and potentially a lower bounce rate. Staging these cuts and monitoring the changes seems like a good way to manage your risk.
Related Questions
-
What's the best way of crawling my entire site to get a list of NoFollow links?
Hi all, hope somebody can help. I want to crawl my site to export an audit showing: all nofollow links (which links, from which pages), and all external links broken down by follow/nofollow. I had thought Moz would do it, but that's not in the crawl info. So I thought Screaming Frog would do it, but unless I'm not looking in the right place, it only seems to provide this information if you manually click into each link and view the "Inlinks" details. Surely this must be easy?! Hope someone can nudge me in the right direction... Thanks.
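If the crawler exports don't give you what you need, pulling rel="nofollow" links out of a page's HTML is straightforward with the Python standard library. A minimal sketch of the per-page extraction (you'd still need your own crawl loop to fetch each page and record which page each link came from):

```python
from html.parser import HTMLParser

class NofollowCollector(HTMLParser):
    """Collect the href of every <a> tag whose rel attribute
    contains the 'nofollow' token."""
    def __init__(self):
        super().__init__()
        self.nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        rel = (a.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollow.append(a.get("href"))

page = '<a href="/ad" rel="nofollow sponsored">ad</a><a href="/about">about</a>'
parser = NofollowCollector()
parser.feed(page)
print(parser.nofollow)  # ['/ad']
```

Feed every fetched page through a fresh collector and log (page URL, link URL) pairs to build the full audit.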
-
Google Indexed Site A's Content On Site B, Site C etc
Hi All, I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com), and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't have a default SSL domain set on the server: you could request Site B's domain with any of Site A's page paths and the page would load. I have since fixed that SSL issue and am now 301 redirecting anything over https on Sites B, C, and D to Site A, since those sites are not using an SSL cert. My question is: how can I get Google to re-index all of the sites and remove the wrong listings from the index? I have a screenshot attached so you can see the issue more clearly. I have resubmitted my sitemap but I'm not seeing much change in the index for my site. Any help on what I could do would be great. Thanks
Eric
-
Can I use the old website content on the new website, after deleting it from the server?
My website nowwhatstudio.com was hit by Google's pure spam penalty, and Google applied a manual spam action to the site. I created a new website (nowwhatmoments.com) with the same content as the old penalized website (nowwhatstudio.com). Since Google has removed my old website's content from its search index, can I use the same content for the new website? If I delete the old website from the server, can I then reuse its content on the new website? Or could I edit the old content to make it 80% original for the new website?
-
Is this ok for content on our site?
We run a printing company, and as an example, the grey box (at the bottom of the page) is what we have on each page: http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html We used to use this but tried to get most of the content onto the page itself; now we want to add a bit more in-depth information to each page. The question I have is: would a 1,200-word document be okay in there without looking bad to Google?
-
Why did this website disappear from Google's SERPs?
For the first several months this website, WEBSITE, ranked well in Google for several local search terms like "Columbia MO spinal decompression" and "Columbia, MO car accident therapy." Recently the website has completely disappeared from Google's SERPs. It does not even show up when I copy and paste full paragraphs into Google's search bar. The website still ranks fine in Bing and Yahoo, but something happened that caused it to be removed from Google. Besides optimizing the meta data, adding headers, alt tags, and all of the typical on-page SEO work, we did create a guest post for a relevant local blog. Here is the post: Guest Post. The post's content is 100% unique. I realize the post has way too many internal/external links, which we definitely did not recommend, but can anyone find a reason why this website was removed from Google's SERPs? And possibly how we should go about getting it back in? Thanks in advance for any help.
-
Refocusing a site's content
Here's a question I was asked recently, and I can really see it going either way, but I want to double-check my preference. The site has been around for years and over that time expanded its content into a variety of areas that are not really core to its mission, income, or themed content. These jettisonable areas have a fair amount of built-up authority but don't really contribute anything to the site's bottom line. The site is considering what to do with these off-theme pages, and the two options seem to be: Leave them in place, but make them hard to find for users, thus preserving their authority as internal links to other core pages. Or... just move on and 301 the pages to whatever is halfway relevant. The 301 camp seems to believe that focusing the site's remaining content on three or four narrower areas will have benefits for what Google sees the site as being about. So, instead of being about 12 different things that aren't too related to each other, the site will be about 3 or 4 things that are somewhat related to each other. Personally, I'm not eager to let go of the old pages because they do produce some traffic and have some authority value that helps the core pages via in-context and navigation links. On the other hand, maybe focusing more would have search benefits. What do you think? Best... Darcy
-
Wordpress.com content feeding into site's subdomain, who gets SEO credit?
I have a client who created a Wordpress.com (not Wordpress.org) blog and feeds the blog posts into a subdomain, blog.client-site.com. My understanding was that in terms of SEO, Wordpress.com would still get the credit for these posts, not the client, but I'm seeing conflicting information. All of the posts have permalinks on the client's site, such as blog.client-site.com/name-of-post, and when I run a Google site: search query, all of those individual posts appear in the Google search listings for the client's domain. I've also run a marketing.grader.com report and see the same results. Looking at the source code on the page, however, I see markup which leads me to believe the content is being credited to, and fed in from, Wordpress.com ('client-name' altered for privacy): an image link with href="http://client-name.files.wordpress.com/2012/08/could_you_survive_a_computer_disaster.jpeg", class="alignleft size-thumbnail wp-image-2050", title="Could_you_survive_a_computer_disaster", and src="http://client-name.files.wordpress.com/2012/08/could_you_survive_a_computer_disaster.jpeg?w=150&h=143". I'm looking to provide a recommendation to the client on whether they are okay to continue with this current setup, or whether we should port the blog posts over to a subfolder on their primary domain, www.client-site.com/blog, and use Wordpress.org functionality for proper SEO. Any advice?? Thank you!
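One quick diagnostic (not something raised in the question itself) is to check which URL each post's rel="canonical" tag points at, since that is the strongest on-page hint about which domain is meant to get credit. A small standard-library sketch:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grab the href of the <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

html_head = ('<head><link rel="canonical" '
             'href="http://blog.client-site.com/name-of-post"></head>')
finder = CanonicalFinder()
finder.feed(html_head)
print(finder.canonical)  # http://blog.client-site.com/name-of-post
```

If the canonical points at the blog.client-site.com subdomain rather than a wordpress.com URL, that supports what the site: query is already showing about where credit lands.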
-
Bi-Lingual Site: Lack of Translated Content & Duplicate Content
One of our clients has a blog with an English and a Spanish version of every blog post. It's in WordPress and we're using the Q-Translate plugin. The problem is that my company is publishing blog posts in English only. The client is then responsible for having each piece translated, at which point we can add the translation to the blog. So the process works like this: We add the post in English. We literally copy the exact same English content to the Spanish version, to serve as a placeholder until it's translated by the client. (*Question on this below) We give the Spanish page a placeholder title tag, so at least the title tags are not duplicated in the meantime. We publish. Two pages go live with the exact same content and different title tags. A week or more later, we get the translated version of the post and add that as the Spanish version, updating the content, links, and meta data. Our posts typically get indexed very quickly, so I'm worried that this is creating a duplicate content issue. What do you think? What we're noticing is that growth in search traffic is much flatter than usual after the first month of a new client blog. I'm looking for any suggestions and advice to make this process more successful for the client. *Would it be better to leave the Spanish page blank? Or to add a sentence like "This post is only available in English" with a link to the English version? Additionally, if you know of a relatively inexpensive but high-quality translation service that can turn these translations around quicker than my client can, I would love to hear about it. Thanks! David
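One possible mitigation for the placeholder window (an alternative to leaving the Spanish page blank, and not something the question proposes) is to emit a noindex robots meta on the Spanish page for as long as its body is still just a copy of the English one, then drop the tag once the real translation lands. A sketch of that rule:

```python
def placeholder_robots_meta(english_body, spanish_body):
    """Return a noindex robots meta tag while the 'Spanish' page is
    still a copy of the English body, and nothing once a genuine
    translation is in place."""
    if spanish_body.strip() == english_body.strip():
        return '<meta name="robots" content="noindex,follow">'
    return ""

print(placeholder_robots_meta("Hello world", "Hello world"))
print(placeholder_robots_meta("Hello world", "Hola mundo"))  # prints an empty line
```

The noindex,follow combination keeps the duplicate out of the index while still letting crawlers follow its links; removing the tag after translation lets the Spanish page be indexed normally.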