How much content is too much? Best Pages For Content?
-
To my understanding, content has a lot to do with organic rankings if written correctly. My question is: how much content is too much, and which pages are best for placing content?
Our company sells very costly products. Our customers call to purchase; we do not have an eCommerce site. Right now we have, on average, 350 words per page across 200+ pages. Each page is written for its general category, and each product has its own unique content. It seems to me that the pages with less content tend to rank a bit better.
As we are in the process of redoing our website, are there any recommendations on writing content or adjusting the amount of text? I am thinking a lot of our text is informative only to a certain extent. Would it be better to write longer content just for the main category page, and then have only about 250 words of description on the actual product page?
Are there any other fairly new recommendations for SEO, besides the title, description, heading tags, image alts, URLs, etc.?
-
Just an example.... tell people how to use your products for maximum enjoyment...
For example if you sell mountain bikes articles about.... how to select mountain bike tires.... how to change a mountain bike tire in under 60 seconds... recommended tools for off-road biking... how to true a bike wheel (rear)... how to true a bike wheel (front).... how to clean the mountain bike chain.... geeezz... I could go on forever...
So... give everything that you need to make your customers experts at buying... owning... riding... maintaining... racing their mountain bikes... and make your staff look like expert mountain bikers and all-round good guys.
PS... I am not talking about lightweight, trivial, pedestrian articles... these would be substantive, complete, and detailed articles with lots of big juicy photos. And you need a video of me changing the mountain bike tire in 42 seconds.
Related Questions
-
Will changing my website's content and design affect all the backlinks I have built so far?
I have been working on my link profile for a month now. After learning about the 5-step Moz methodology, I have decided I would like to change all of the content on my site and tailor it to what my customers need. Am I going to lose all my Domain Authority if I make these changes? And if it will affect it, how will that play out?
Web Design | calvinkj0
Doing SEO for single page applications / Prerender.io
My dev and I are migrating an existing multi-page application to a single-page application with Prerender.io. Does anybody have any experience doing SEO for single-page applications? Are there any other consequences we should take into account, or anything important to expect? Any insights would be 10/10 appreciated.
Web Design | Edward_Sturm0
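For context on how prerendering services typically work: the server inspects each request's user agent and serves a prerendered HTML snapshot to crawlers while real visitors get the normal JavaScript app. A minimal sketch of that decision, with an illustrative (not exhaustive) bot list:

```python
# Sketch of the core decision a prerender middleware makes: serve a
# prerendered HTML snapshot to crawlers, and the normal JS app to everyone
# else. The crawler substrings here are illustrative, not a complete list.
CRAWLER_SUBSTRINGS = ("googlebot", "bingbot", "yandex", "baiduspider",
                      "facebookexternalhit", "twitterbot")

def should_prerender(user_agent: str, path: str) -> bool:
    """Return True if this request should get the prerendered snapshot."""
    ua = user_agent.lower()
    # Never prerender static assets; only navigable pages need snapshots.
    if path.endswith((".js", ".css", ".png", ".jpg", ".xml", ".txt")):
        return False
    return any(bot in ua for bot in CRAWLER_SUBSTRINGS)
```

The main SEO consequence to verify is that the snapshot a crawler receives matches what users see after the JS renders, since a large mismatch can look like cloaking.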
Is there a best practice for using a general ISO code for the EAME region and APAC region, or should you break it out by country?
I am creating a strategy for multiple regions, and the US comes to market differently than EAME (Europe, Africa, Middle East) and China. We were planning on using language and ISO codes in subfolders, but the corporation only wants its content in German, US English, and Queen's English. Our current decision is to use /en-US/, /en/, /de/, /en-CN/, and /zh-CN/; /en/ and /de/ will be what we use for EAME. This doesn't seem like the best idea, as I think /en/ will get indexed as the US version and not the EAME version. Any suggestions (or requests for clarification) are greatly appreciated.
Web Design | GodfreyB2B0
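One common way to keep /en/ from being treated as the US version is to annotate every page with explicit hreflang alternates, including an x-default. A sketch of generating those tags for the subfolders mentioned above; "example.com" and the locale-to-folder mapping are assumptions to adapt to the real site:

```python
# Illustrative hreflang generator for the subfolders in the question.
# The domain and the locale->folder mapping are assumptions; adjust them
# to the real domain and region strategy.
LOCALES = {
    "en-US": "/en-US/",  # US English
    "en": "/en/",        # English for EAME
    "de": "/de/",        # German for EAME
    "en-CN": "/en-CN/",  # English for China
    "zh-CN": "/zh-CN/",  # Chinese for China
}

def hreflang_tags(base: str, page: str) -> list[str]:
    """Build <link rel="alternate"> tags for every locale, plus x-default."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{base}{folder}{page}" />'
        for code, folder in LOCALES.items()
    ]
    # x-default names the fallback version for visitors matching no locale.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{base}/en/{page}" />'
    )
    return tags
```

Each page's set of alternates must be reciprocal (every listed version links back to all the others) for the annotations to be honored.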
Question #1: Does Google index https:// pages? I thought they didn't because....
Generally, the difference between https:// and http:// is that the "s" (which stands for secure, I think) is usually reserved for payment pages and other similar pages that search engines aren't supposed to index (like any page where private data is stored). The site all of my questions revolve around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one. The site was hardcoded so that all MENU internal links (which were 90% of our internal links) lead to **https://**www.example.com/example-page/ instead of **http://**www.example.com/example-page/. To double-check that this was causing a loss in link juice, I jumped over to OSE. Sure enough, the internal links were not being indexed; only the links that were manually created and set NOT to include the httpS:// were being indexed. So if OSE wasn't counting the links, and based on the general ideology behind secure HTTP access, that would imply that no link juice is being passed... Right?? Thanks for your time. Screens are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being indexed. The problem is: is this a Volusion problem? Should I switch to WordPress? Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress): http://www.uncommonthread.com/
Web Design | TylerAbernethy0
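A note on the premise: Google does crawl and index https:// URLs. The practical problem in the question is that the same page is reachable under two schemes, which splits link signals between two URL variants. The usual fix is to pick one scheme as canonical, rewrite internal links to it, and 301-redirect the other. A minimal sketch of normalizing a crawled link list to one scheme:

```python
from urllib.parse import urlsplit, urlunsplit

# Force every internal link to a single canonical scheme so there is only
# one indexable variant per page. "https" is assumed as the canonical
# choice here; the same logic works in the other direction.
def canonicalize(url: str, scheme: str = "https") -> str:
    """Rewrite a URL's scheme, leaving host, path, query, and fragment intact."""
    parts = urlsplit(url)
    return urlunsplit((scheme, parts.netloc, parts.path, parts.query, parts.fragment))

def normalize_links(links: list[str]) -> list[str]:
    """Deduplicate a crawled link list after forcing one scheme."""
    return sorted({canonicalize(u) for u in links})
```

Once the links are consistent and the redirect is in place, link metrics consolidate onto the single surviving variant.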
Getting a lot more duplicate content warnings than I expected.
I run WordPress on many of my sites, and a site crawl has found MANY duplicate content pages on the latest domain I started a campaign for. I expected to see quite a lot on the tag pages that only had one post, but even tag pages with multiple posts, and author and category pages with many posts, are showing as duplicate content. Is it normal for a WordPress site to have so many duplicate content warnings from the taxonomy pages? I have the option to bulk noindex, follow the category and tag pages, but should I do it? I get some traffic directly to the tag pages, so removing them from search results would dent the site's traffic a little (it's generally high-bounce-rate, low-engagement traffic anyway), but could removing the apparent duplicate content actually improve the article pages themselves? Does anyone have any WordPress-specific advice for making these pages not duplicate content? I've toyed with the idea of just displaying excerpts, but that means creating manual excerpts for four years' worth of posts, some of which cover subject matter I have no personal knowledge of, so other suggestions are welcome.
Web Design | williampatton0
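On the excerpt idea: the excerpts don't have to be written by hand. WordPress auto-trims content when no manual excerpt exists (its `wp_trim_words()` defaults to 55 words), so archive pages can show truncated text without a pass over four years of posts. A Python illustration of that same truncation logic, not actual WordPress code:

```python
# Illustration of word-based excerpt truncation, mirroring what WordPress's
# wp_trim_words() does by default (55 words plus a trailing marker). This
# is a Python sketch for clarity, not code that runs inside WordPress.
def trim_words(text: str, num_words: int = 55, more: str = "…") -> str:
    """Return the first num_words words of text, appending a marker if trimmed."""
    words = text.split()
    if len(words) <= num_words:
        return text
    return " ".join(words[:num_words]) + more
```

Because archives then show only a fragment of each post, the overlap with the full article page drops, which is usually enough to clear crawler duplicate-content warnings on taxonomy pages.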
Using content from other sites without duplicate content penalties?
Hi there, I am setting up a website where I believe it would substantially benefit the user experience if I set up a database of information on artists. I am torn, because to do this feasibly and correctly, I would have content built from multiple sources with no real unique content. It would have parts from Wikipedia, parts from other websites, etc. All would be sourced, of course. My concern is that by doing this I risk devaluing my website. Is there a way I can handle this without taking a hit?
Web Design | BorisD0
What are some of the best website hosting platforms for WordPress?
I'm looking for a new hosting provider and have been told to find one that specializes in WordPress hosting because of higher page speed, faster load times, etc. Can anyone recommend a couple of WordPress hosting providers?
Web Design | webestate0
Are slimmed down mobile versions of a canonical page considered cloaking?
We are developing our mobile site right now, and we are using a user-agent sniffer to figure out what kind of device the visitor is using. Once the server knows whether it is a desktop or mobile browser, it will deliver the appropriate template. We decided to use the same URL for both versions of the page, rather than m.websiteurl.com or www.websiteurl.mobi, so that traffic to either version registers as a visit to the same page. Will search engines consider this cloaking, or is mobile "versioning" an acceptable practice? The pages are in essence the same; the mobile version just leaves out extraneous scripts and unnecessary resources to display better on a mobile device.
Web Design | TahoeMountain400
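Serving different HTML on one URL by user agent is what Google calls "dynamic serving", and it is not treated as cloaking as long as the content is equivalent across variants. The detail that matters is sending a `Vary: User-Agent` response header, which tells caches and crawlers the response differs by device. A sketch of the server-side decision, with an illustrative (far from complete) device hint list:

```python
# Dynamic serving sketch: same URL, different template by device class.
# The Vary: User-Agent header signals that the response depends on the
# requesting device. MOBILE_HINTS is illustrative, not a device database.
MOBILE_HINTS = ("iphone", "android", "ipad", "mobile", "blackberry")

def pick_template(user_agent: str) -> tuple[str, dict]:
    """Choose a template and the headers to send with it."""
    is_mobile = any(hint in user_agent.lower() for hint in MOBILE_HINTS)
    template = "page_mobile.html" if is_mobile else "page_desktop.html"
    headers = {"Vary": "User-Agent"}  # sent with either variant
    return template, headers
```

The template names are hypothetical; the point is that both variants share one URL and one set of ranking signals, with the header making the device-dependent response explicit.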