How much is too much content for a home page?
-
Hey guys,
I'm looking to implement a strategy where I put a 20,000-word article on my home page. It won't be one super-long page; the content will be divided into nested tabs.
The content will also appear on individual pages on the site (one per tab), but each of those pages will have a canonical tag pointing to the home page.
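To illustrate (example.com is just a placeholder for my domain), each of those individual pages would carry a tag like this in its head:

    <!-- on example.com/some-tab-topic, the standalone version of one tab -->
    <link rel="canonical" href="https://example.com/" />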
Will I get penalized for this kind of structure?
Cheers,
JC
-
Bing has just blogged that misused canonicals will be ignored. I would not do what you suggest: even if the algorithm does not pick it up, a manual review of your site would, I think, see it as spam.
-
Hi JC
I was always taught to aim for 200-400 words on a page. But in my quest over the years to test pages, I also found that an article page containing 1,000 words is OK. There was a test done some years ago, I can't remember by whom, where the author tested pages against the major search engines; of the big three, Bing was quite happy handling a 1,000-word page, but his stats showed different results for Google and Yahoo. (If anyone can remember that test, please comment, as I am sure it was a Moz person.)
Another test I conducted used tabs on a HOME page. I put in place about 7 tabs, with the content broken down into each tab. Prior to that, the page had only 1 tab and around 300-400 words.
After adding the other tabs, the page disappeared off page 1 of the SERPs for terms it had previously ranked for. At the time I didn't know what the issue was, so I had to go through a process to find out. The page had a reasonable number of inbound links pointing at it, so I did not increase those, and after about 8 to 10 weeks of tweaking and fine-tuning I finally decided to revert the page to what it had been, i.e. 300-400 words of content and 1 tab. Voilà: within 24 hours the page returned to ranking for the terms it had previously. My next experiment was to break the content into 2 tabs, and the page kept its rankings.
After further tests I concluded that too many tabs were the issue; in my opinion Google was penalising my page because it looked like I was 'hiding' text among those tabs.
I still believe Google likes big websites; Wikipedia is testimony to that, and look at how all the subject matter is broken down on that site. So my suggestion would be to review your content, work out how many 'themes' and 'topics' are within it, and break it down into specific pages of around 300-400 words with a few relevant links between the pages. I believe this will work much better for you.
-
Related Questions
-
Duplicate content errors on a new website: how do you know which page to put the rel=canonical tag on?
I am having problems with duplicate content. This is a new website and all the pages have the same page and domain rank; the following is an example of the homepage. How do you know which page to use the canonical tag on? http://medresourcesupply.com/index.php http://medresourcesupply.com/ Would this be the correct way to use this? Here is another example where Moz says these are duplicates. I can't figure out why, because they have different URLs and content. http://medresourcesupply.com/clutching_at_the_throat http://medresourcesupply.com/index.php?src=gendocs&ref=detailed_specfications&category=Main
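To illustrate what I mean by using the canonical tag (just a sketch, assuming the bare domain is the version I want indexed), I would put this in the head of the index.php page:

    <!-- in the head of http://medresourcesupply.com/index.php -->
    <link rel="canonical" href="http://medresourcesupply.com/" />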
Intermediate & Advanced SEO | artscube.biz
-
I have a lot of spammy links coming to my 404 page (the URLs have been removed now). Should I redirect to Home?
I have a lot of spammy links pointing at my website, according to Moz. Thankfully all of them point to URLs we've long since removed, so they hit my 404 page. Should I replace the 404 with a 301 and redirect that juice to my home page or some other page, or will that hurt my ranking?
Intermediate & Advanced SEO | jagdecat
-
Possible to Improve Domain Authority By Improving Content on Low Page Rank Pages?
My site's Domain Authority is only 23. The home page has a Page Authority of 32. My site consists of about 400 pages. The topic of the site is commercial real estate (I am a real estate broker). A number of the sites we compete against have a Domain Authority of 30-40. Would our overall Domain Authority improve if we rewrote the content for the several hundred pages that have the lowest Page Authority (say 12-15)? Is the overall Domain Authority derived from an average of the Page Authority of each page on a domain? Alternatively, could we increase Domain Authority by setting the pages with the lowest Page Authority to "noindex"? By the way, our domain is www.nyc-officespace-leader.com. Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
No-index pages with duplicate content?
Hello, I have an e-commerce website selling about 20,000 different products. For the most popular of those products, I created unique, high-quality content. The content was written by a professional player and describes how and why those products are useful, which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Therefore, our idea was to noindex the products that only have the same copy-paste descriptions every other website has. Do you think it's better to do that, or to just let everything be indexed normally, since we might get search traffic from those pages? Thanks a lot for your help!
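To be clear, the idea would be to add something like this to the head of each product page that only has the copy-paste description (just a sketch; "noindex, follow" should keep the internal links crawlable):

    <!-- on a product page with only the boilerplate description -->
    <meta name="robots" content="noindex, follow" />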
Intermediate & Advanced SEO | EndeR-
-
Should I "NoIndex" Pages with Almost no Unique Content
I have a real estate site with MLS data (real estate listings shared across the Internet by Realtors, which means data exist across the Internet already). Important pages are the "MLS result pages" - the pages showing thumbnail pictures of all properties for sale in a given region or neighborhood. 1 MLS result page may be for a region and another for a neighborhood within the region:
Intermediate & Advanced SEO | | khi5
example.com/region-name and example.com/region-name/neighborhood-name
So all data on the neighborhood page will be 100% data from the region URL. Question: would it make sense to "NoIndex" such neighborhood page, since it would reduce nr of non-unique pages on my site and also reduce amount of data which could be seen as duplicate data? Will my region page have a good chance of ranking better if I "NoIndex" the neighborhood page? OR, is Google so advanced they know Realtors share MLS data and worst case simple give such pages very low value, but will NOT impact ranking of other pages on a website? I am aware I can work on making these MLS result pages more unique etc, but that isn't what my above question is about. thank you.0 -
Using unique content from "rel=canonical"ized page
Hey everyone, I have a question about the following scenario. Page 1: Text A, Text B, Text C. Page 2 (rel=canonical to Page 1): Text A, Text B, Text C, Text D. Much of the content on Page 2 is canonicalized to Page 1 to signal duplicate content. However, Page 2 also contains some unique text not found on Page 1. How safe is it to use the unique content from Page 2 on a new page (Page 3) if the intention is to rank Page 3? Does that make sense? 🙂
Intermediate & Advanced SEO | ipancake
-
How should I exclude content?
I have category pages on an e-commerce site that are showing up as duplicate pages. At the top of each page are Register and Login links, and when selected they come up as category/login and category/register. I have three options to attempt to fix this and was wondering which you think is best: 1. Use robots.txt to exclude them - there are hundreds of categories, so the file could become large. 2. Use canonical tags. 3. Force Login and Register to go to their own pages.
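For option 2, I assume each variant would point back to the plain category URL with something like this (example.com is a placeholder):

    <!-- in the head of example.com/category/login and example.com/category/register -->
    <link rel="canonical" href="https://example.com/category/" />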
Intermediate & Advanced SEO | EcommerceSite
-
Number of pages indexed for classified sites (number of pages for the same query)
I've noticed that classified sites usually have a lot of pages indexed, because for each query/keyword they index the first 100 result pages, normally with 10 results per page. As an example, imagine the site www.classified.com: for the query/keyword "house for rent new york" there is the page www.classified.com/houses/house-for-rent-new-york, and the "index" is set for the first 100 SERP pages, so www.classified.com/houses/house-for-rent-new-york, www.classified.com/houses/house-for-rent-new-york-1, www.classified.com/houses/house-for-rent-new-york-2 ...and so on. Wouldn't it be better to index only the first result page? I mean, in the first 100 pages lots of ads are very similar, so why should Google be happy indexing lots of similar pages? Could Google penalize this behaviour? What are your suggestions? Many thanks in advance for your help.
Intermediate & Advanced SEO | nuroa-246712