Significantly reducing number of pages (and overall content) on new site - is it a bad idea?
-
Hi Mozzers - I am looking at a new site (not launched yet). It contains significantly fewer pages than the previous site - 35 pages rather than the 107 before. Content on the remaining pages is plentiful, but I am worried about the sudden loss of such a significant "chunk" of the website. Surely cutting the size of a website this drastically must increase the risk of post-migration performance problems?
Further info - the site has been under an SEO contract with a large SEO firm for several years. They don't appear to have done anything beyond tinkering with the homepage content - the header and description tags are identical across the current website. 90% of site traffic currently arrives on the homepage. Content quality/volume isn't bad across most of the current site.
Thanks in advance for your input!
-
Hi Luke
I wouldn't say keyword density is totally irrelevant. What I mean is that you would expect any page to contain the keywords related to its subject. But deliberately adding keywords to a page to increase density and make it more indexable is not what you should be doing.
For semantic search, the focus of a page needs to be its subject as a whole, so content should be written for that whole subject in much the same way as you would write offline, including related content where relevant.
I'm not sure there really is a "safe" percentage as such for keyword density, but suffice to say that the higher the percentage, the more likely a page is to be seen as spammy. In most cases, though, I would have thought under 3% should be fine.
Peter
-
Hi Peter - sorry, yes, that wasn't clear! I was asking about keyword density, I suppose. I know many SEOs suggest it's irrelevant, yet I spend much of my time removing penalties from sites, and keyword stuffing is causing issues.
If I see a penalty that I think is stuffing-related, I check densities and drop them to a 3% maximum - that appears to have reversed the penalty a couple of times.
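Luke's density check could be sketched roughly like this. This is a minimal illustration of the calculation, not the tool he actually uses; the function name, the sample text, and the single-word-keyword simplification are all mine:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage.

    Counts exact whole-word matches of a single-word keyword; a
    multi-word phrase would need a sliding-window count instead.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = Counter(words)[keyword.lower()]
    return 100.0 * hits / len(words)

# Hypothetical over-optimised page copy: "widgets" is 5 of 17 words.
page = ("Widgets for sale. Our widgets are the best widgets. "
        "Buy widgets today and enjoy your new widgets.")
print(f"{keyword_density(page, 'widgets'):.1f}%")  # 29.4% - well past any 3% ceiling
```

A real audit would run something like this per target keyword per page and flag anything over the chosen threshold for rewriting.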
-
Hi Luke
No problem. You asked: How do you manage onsite keywords in content these days?
I am not clear what you are asking. Please can you clarify?
Peter
-
Thanks Peter for your useful input, as ever. How do you manage onsite keywords in content these days?
It's incredible how often the 301 redirect step is overlooked by developers managing migrations - oh, the number of times I've been called in after the developer has 301'd everything to the homepage (or not bothered doing any redirects at all).
-
Hi Luke
For sure, carving away two-thirds of your previous site is a big chunk, but I don't think it should overly concern you.
If you had told me a couple of years ago that you were thinking of doing this, I would have encouraged you to think again, on the basis that the more pages your site had, the more weight it carried, the more pages could be optimised, and the more entry points there were from search.
With the changes to Google search in recent months - in particular the move towards semantic search and away from Boolean-style keyword matching - having a keyword-rich site with many well-optimised, "correct keyword density" pages shouldn't be the focus any more.
I'm not suggesting that having 35 pages compared to 107 pages is better. What I am saying is that it is better to have 35 sharply focused, high quality pages than 107 pages that don't have the same definition and focus. The measure should most definitely be quality over quantity, both on a page count basis and even on a word count basis.
What I would focus on with your 35 pages is making sure they are well structured (many on-page SEO rules still apply - so make sure the faulty parts you mentioned, like the identical header and description tags, are fixed) and that the navigation is clear.
I am sure you know this, but make sure your pages are customer-focused: they should answer the kinds of questions your customers are asking, in the language of your customers. Where related questions arise, make sure there are good internal links between related content pages.
Finally, when you do the switch, make sure you think through your 301 redirects. Where an old page no longer exists on the new site, redirect it to the closest related page.
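A sketch of what that redirect mapping looks like in practice, with hypothetical URLs - the real mapping would come from crawling the old 107-page site and pairing each retired URL with its closest surviving equivalent:

```python
# Old URL -> closest related page on the new 35-page site (all paths invented
# for illustration). Pages that survive the migration need no entry here.
REDIRECTS = {
    "/services/widget-repair": "/services",  # page merged into its section page
    "/blog/2012/old-news":     "/blog",      # retired post, send to blog index
    "/about/team/old-member":  "/about",     # removed page, closest parent
}

def resolve(path: str) -> tuple[int, str]:
    """Return (status, location) for a request into the old URL space."""
    if path in REDIRECTS:
        return (301, REDIRECTS[path])  # permanent redirect to closest match
    return (200, path)                 # page still exists on the new site

print(resolve("/services/widget-repair"))  # (301, '/services')
print(resolve("/services"))                # (200, '/services')
```

In deployment this table would typically live in the server config (e.g. one rewrite/redirect rule per old URL) rather than application code, but the principle is the same: every old URL gets a specific 301 target, not a blanket redirect to the homepage.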
I hope that helps,
Peter