Significantly reducing the number of pages (and overall content) on a new site - is it a bad idea?
-
Hi Mozzers - I am looking at a new site (not launched yet). It contains significantly fewer pages than the previous site - 35 pages rather than the 107 before. Content on the remaining pages is plentiful, but I am worried about the sudden loss of such a significant "chunk" of the website. Surely cutting the size of a website this drastically increases the risk of post-migration performance problems?
Further info: the site has been under an SEO contract with a large SEO firm for several years, but they don't appear to have done anything beyond tinkering with homepage content - the header and description tags are identical across the current website. 90% of site traffic currently lands on the homepage. Content quality/volume isn't bad across most of the current site.
Thanks in advance for your input!
-
Hi Luke
I wouldn't say keyword density is totally irrelevant - you would expect any page to contain the keywords related to its subject. But adding keywords to a page purely to increase density, in the hope of making it more indexable, is not what you should be doing.
The focus of a page for semantic search needs to be the subject as a whole so content should be written for the whole in much the same way as you would write offline and include related content where relevant.
I'm not sure there really is a "safe" percentage as such for keyword density; suffice to say that the higher the percentage, the more likely a page is to be seen as spammy. In most cases, though, I would have thought anything under 3% should be fine.
Peter
-
Hi Peter - sorry, yes, that wasn't very clear! I was asking about keyword density, I suppose. I know many SEOs suggest it's irrelevant, yet I spend much of my time removing penalties from sites, and keyword stuffing is often the cause.
If I see a penalty that I think is stuffing-related, I check densities and cut them to a 3% maximum - that appears to have reversed the penalty a couple of times.
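For what it's worth, the density check described above can be approximated with a short script. This is a minimal sketch only - the 3% threshold comes from the rule of thumb in this thread, and the simple word tokenisation is an assumption (a real audit would strip markup and handle multi-word phrases):

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    return 100.0 * counts[keyword.lower()] / len(words)

def flag_stuffing(text: str, keyword: str, threshold: float = 3.0) -> bool:
    """Flag a page whose density exceeds the ~3% rule of thumb."""
    return keyword_density(text, keyword) > threshold

# Hypothetical over-optimised snippet: "cheap" is 3 of 8 words (37.5%).
page = "cheap widgets cheap widgets buy cheap widgets today"
print(round(keyword_density(page, "cheap"), 1))  # → 37.5
```

Running this over each page's extracted body text gives a quick shortlist of candidates to de-stuff before worrying about anything else.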
-
Hi Luke
No problem. You asked: How do you manage onsite keywords in content these days?
I am not clear what you are asking. Please can you clarify?
Peter
-
Thanks, Peter, for your useful input, as ever. How do you manage onsite keywords in content these days?
It's incredible how often 301 redirects are overlooked by developers managing migrations - oh, the number of times I've been called in after a developer has 301'd everything to the homepage (or not bothered doing any redirects at all).
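A quick post-migration audit catches exactly that failure. The sketch below is a hedged example: it classifies each old URL from its response status and `Location` header, which you would obtain by fetching the URL with redirects disabled (e.g. `requests.head(url, allow_redirects=False)`). The category names are my own, not a standard:

```python
def classify(status: int, location: str, homepage: str) -> str:
    """Classify how an old URL is handled after migration,
    given its response status code and Location header."""
    if status in (301, 308):
        if location.rstrip("/") == homepage.rstrip("/"):
            return "redirected-to-homepage"  # the anti-pattern described above
        return "redirected"                  # permanent redirect to a real page
    if status in (302, 307):
        return "temporary-redirect"          # should usually be a 301
    if status == 404:
        return "missing-no-redirect"         # link equity lost entirely
    return f"status-{status}"
```

Run it over every URL from the old sitemap: a long column of "redirected-to-homepage" or "missing-no-redirect" results is the early warning you want before rankings slide.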
-
Hi Luke
For sure, carving away two-thirds of your previous site is a big chunk, but I don't think it should overly concern you.
If you had said you were thinking of doing this a couple of years ago, I would have encouraged you to think again, on the basis that the more pages your site had, the more weight it carried, the more pages could be optimised, and the more entry points there were from search.
With the changes to Google search in recent months - in particular the move towards semantic search and away from Boolean search - having a keyword-rich site with many "correctly" keyword-dense pages shouldn't be the focus any more.
I'm not suggesting that having 35 pages compared to 107 pages is better. What I am saying is that it is better to have 35 sharply focused, high quality pages than 107 pages that don't have the same definition and focus. The measure should most definitely be quality over quantity, both on a page count basis and even on a word count basis.
What I would focus on with your 35 pages is making sure they are well structured (many on-page SEO rules still apply - so fix the issues you mentioned, such as the identical header and description tags) and that the navigation is clear.
I am sure you know this, but make sure your pages are customer-focused: they should answer the kinds of questions your customers are asking, in your customers' own language, and where related questions arise, make sure there are good internal links between related content pages.
Finally, when you do the switch, make sure you think about your 301 redirects: where an old page no longer exists on the new site, redirect it to the closest related page.
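One way to keep that old-to-closest-new mapping auditable is to maintain it as a simple table and generate the server rules from it. A minimal sketch, assuming an Apache host (the `Redirect 301` lines use the real mod_alias syntax, but every path below is a hypothetical example):

```python
# Hypothetical mapping of retired URLs to their closest surviving pages.
REDIRECT_MAP = {
    "/services/widget-repair": "/services",
    "/services/widget-polishing": "/services",
    "/about/old-team-page": "/about",
}

def htaccess_rules(mapping: dict[str, str]) -> list[str]:
    """Render the map as Apache mod_alias 'Redirect 301' lines."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in htaccess_rules(REDIRECT_MAP):
    print(rule)
```

Keeping the mapping in one place makes it easy to review page by page before launch, which is exactly where the "everything to the homepage" mistake tends to creep in.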
I hope that helps,
Peter