Significantly reducing the number of pages (and overall content) on a new site - is it a bad idea?
-
Hi Mozzers - I am looking at a new site (not launched yet) that contains significantly fewer pages than the previous site - 35 pages rather than the 107 before. Content on the remaining pages is plentiful, but I am worried about the sudden loss of such a significant "chunk" of the website. Surely cutting the size of a website this drastically increases the risk of post-migration performance problems?
Further info - the site has been under an SEO contract with a large SEO firm for several years. They don't appear to have done anything beyond tinkering with the homepage content - the header and description tags are identical across the current website. 90% of site traffic currently arrives on the homepage. Content quality/volume isn't bad across most of the current site.
Thanks in advance for your input!
-
Hi Luke
I wouldn't say keyword density is totally irrelevant - by that I mean you would expect any page to contain the keywords related to its subject. But deliberately adding keywords to a page to increase its density, in the hope of making it more indexable, is not what you should be doing.
For semantic search, the focus of a page needs to be the subject as a whole, so content should be written for that whole subject in much the same way as you would write offline, including related content where relevant.
I'm not sure there really is a "safe" percentage for keyword density as such, but suffice it to say that the higher the percentage, the more likely a page is to be seen as spammy. In most cases, though, I would expect anything below 3% to be fine.
Peter
-
Hi Peter - sorry, yes, that wasn't clear! I was asking about keyword density, I suppose - I know many SEOs suggest it's irrelevant, yet I spend much of my time removing penalties from sites, and keyword stuffing is often the cause of the problems.
If I see a penalty that I think is stuffing-related, I check the densities and cut them back to a 3% maximum - that appears to have reversed the penalty a couple of times.
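For what it's worth, here is a rough sketch of the kind of check I mean. It assumes Python, a plain-text copy of the page body, and simple tokenisation, so treat it as illustrative rather than a definitive density formula (the file name and phrase are hypothetical examples):
```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Words belonging to the keyword phrase, as a percentage of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count every position where the full phrase appears in the word stream.
    hits = sum(
        1
        for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return 100.0 * hits * len(phrase_words) / len(words)

# "page.txt" and "blue widgets" are stand-ins for your own page and keyword.
page_text = open("page.txt", encoding="utf-8").read()
density = keyword_density(page_text, "blue widgets")
if density > 3.0:  # the 3% rule of thumb discussed in this thread
    print(f"Possible stuffing: {density:.1f}% for 'blue widgets'")
```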
-
Hi Luke
No problem. You asked: How do you manage onsite keywords in content these days?
I am not clear what you are asking. Please can you clarify?
Peter
-
Thanks, Peter, for your useful input, as ever. How do you manage onsite keywords in content these days?
It's incredible how often the 301 redirect step is overlooked by developers managing migrations - oh, the number of times I've been called in after the developer has 301'd everything to the homepage (or not bothered with any redirects at all).
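A quick way to catch that blanket-redirect-to-homepage mistake after launch is to spot-check the old URLs. A minimal sketch, assuming Python with the requests library and example.com as a placeholder domain:
```python
import requests

# Hypothetical sample of old URLs and the new pages they should point to.
redirect_map = {
    "https://example.com/old-services-page": "https://example.com/services",
    "https://example.com/old-about-team": "https://example.com/about",
}

for old_url, expected in redirect_map.items():
    r = requests.head(old_url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    if r.status_code != 301:
        print(f"{old_url}: expected a 301, got {r.status_code}")
    elif location.rstrip("/") == "https://example.com":
        print(f"{old_url}: blanket redirect to the homepage")
    elif location != expected:
        print(f"{old_url}: redirects to {location}, expected {expected}")
```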
-
Hi Luke
For sure, carving away two-thirds of your previous site is a big chunk, but I don't think that should overly concern you.
If you had said you were thinking of doing this a couple of years ago, I would have encouraged you to think again, on the basis that the more pages your site had, the more weight it had, the more pages could be optimised, and the more entry points there were from search.
With the changes to Google search in recent months - in particular the move towards semantic search and away from Boolean search - having a keyword-rich site with many well-optimised, "correct keyword density" pages shouldn't be the focus any more.
I'm not suggesting that having 35 pages rather than 107 pages is better in itself. What I am saying is that it is better to have 35 sharply focused, high-quality pages than 107 pages that lack that definition and focus. The measure should most definitely be quality over quantity, both on a page-count basis and even on a word-count basis.
What I would focus on with your 35 pages is making sure they are well structured (many on-page SEO rules still apply - so make sure the duplicated header and description tags you mentioned are fixed) and that the navigation is clear.
I am sure you know this, but make sure your pages are customer-focused, so that they answer the type of questions your customers are asking, in the language of your customers; and where related questions could occur, make sure there are good internal links between the related content pages.
Finally, when you do the switch, make sure you think through your 301 redirects. Where an old page no longer exists on the new site, redirect it to the closest related page.
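One practical tip: keep the old-to-new mapping in a single file and generate the server rules from it, so every retired page gets an explicit destination. A sketch under those assumptions - a hypothetical redirects.csv and Apache-style output (adjust for your own server):
```python
import csv

# redirects.csv is a hypothetical two-column file: old_path,new_path
# e.g. /our-old-services,/services
with open("redirects.csv", newline="") as f:
    rows = [row for row in csv.reader(f) if row]

with open("redirects.conf", "w") as out:
    for old_path, new_path in rows:
        # One explicit 301 per retired page - never a blanket rule to "/".
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```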
I hope that helps,
Peter