Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content – many posts will remain viewable – we have locked both new posts and new replies.
Creating New Pages Versus Improving Existing Pages
-
What are some things to consider or things to evaluate when deciding whether you should focus resources on creating new pages (to cover more related topics) versus improving existing pages (adding more useful information, etc.)?
-
Should I create new pages to cover related topics or improve an existing page?
My first consideration would be the specificity of your topic. How many keywords or phrases are you trying to target? How much content do you have to offer?
Let's use cough syrup as an example (as I reach for my bottle). If your company name is Nature's Relief and you offer Nature's Cough Syrup as your product, then one well-presented page would probably be best.
If you are Robitussin, have 5 different cough syrups, and brand yourself on "a different syrup for different coughs", then I would definitely recommend a separate page for each product. The first page might target keywords such as "hacking cough", while the next page might work along the lines of "cough and nasal decongestant".
A final thought: if you provide Nature's Cough Syrup and are trying to compete with a competitor like Robitussin, I would try to be creative and offer separate pages focusing on my competitor's keywords. You can offer testimonials or examples where your product relieved a hacking cough, targeting the same keyword.
In summary, step back and determine your goals for the page. First and foremost, how can you present the page to provide the best user experience? The next question should be: why are you making this change?
-
I'd like to offer a hybrid perspective. Quality doesn't actually always win in the end. If you've got a great, quality-filled page that brings no traffic because it can't compete in its specific niche, it's sometimes because competitors have much more quality content – they're established leaders in a given topic, for example. And while more inbound links can sometimes help, or lately social media, sometimes it just requires more content.
Whether that means on-page improvements or additional pages will require evaluation, and Magento's suggestion is a good start. But also look at whether the competition is drowning you out for a given page's topic. If they are doing it with just one page, you could go head to head in a one-page battle, though you'd most likely be able to leap-frog ahead with a multiple-page approach where the sum total is more than a competitor's single page. You'd essentially be creating a new "section" devoted to the topic.
Of course, that doesn't mean you can scrap the quality issue, because Chris's take does have a foundation in truth.
-
Run your pages through the On-Page Report Card (http://pro.seomoz.org/tools/on-page-keyword-optimization/new), which will grade each one. Only do this for the pages already ranking in the top 50 (or whatever threshold you choose), then decide which ones to improve. It sounds like some of your pages may have potential with just a little tweaking.
-
Quality over quantity always wins in the end. Make what you have the best you can, then add more quality content on related topics.
Related Questions
-
Google ranking content for phrases that don't exist on-page
I am experiencing an issue with negative keywords, but the "negative" keyword in question isn't truly negative and is required within the content – the problem is that Google is ranking pages for inaccurate phrases that don't exist on the page. To explain, this product page (as one of many examples) - https://www.scamblermusic.com/albums/royalty-free-rock-music/ - is optimised for "Royalty free rock music" and it gets a Moz grade of 100. "Royalty free" is the most accurate description of the music (I optimised for "royalty free" instead of "royalty-free" (including a hyphen) because of improved search volume), and there is just one reference to the term "copyrighted" towards the foot of the page – this term is relevant because I need to make the point that the music is licensed, not sold, and the licensee pays for the right to use the music but does not own it (as it remains copyrighted). It turns out however that I appear to need to treat "copyrighted" almost as a negative term because Google isn't accurately ranking the content. Despite excellent optimisation for "Royalty free rock music" and only one single reference of "copyrighted" within the copy, I am seeing this page (and other album genres) wrongly rank for the following search terms: "free rock music", "Copyright free rock music", "Uncopyrighted rock music", "Non copyrighted rock music". I understand that pages might rank for "free rock music" because it is part of the "Royalty free rock music" optimisation; what I can't get my head around is why the page (and similar product pages) are ranking for "Copyright free", "Uncopyrighted music" and "Non copyrighted music". "Uncopyrighted" and "Non copyrighted" don't exist anywhere within the copy or source code – why would Google consider it helpful to rank a page for a search term that doesn't exist as a complete phrase within the content? By the same logic the page should also wrongly rank for "Skylark rock music" or "Pretzel rock music", as the words "Skylark" and "Pretzel" also feature just once within the content and therefore should generate completely inaccurate results too. To me this demonstrates just how poor Google is when it comes to understanding relevant content and optimization - it's taking part of an optimized term and combining it with just one other single-use word and then inappropriately ranking the page for that completely made up phrase. It's one thing to misinterpret one reference of the term "copyrighted" and something else entirely to rank a page for completely made up terms such as "Uncopyrighted" and "Non copyrighted". It almost makes me think that I've got a better chance of accurately ranking content if I buy a goat, shove a cigar up its backside, and sacrifice it in the name of the great god Google! Any advice (about wrongly attributed negative keywords, not goat sacrifice) would be most welcome.
On-Page Optimization | JCN-SBWD
-
How to deal with filter pages - Shopify
Hi there,
/collections/living-room-furniture/black
/collections/living-room-furniture/fabric
Is it ok to make all the above filter pages canonicalised with their main category /collections/living-room-furniture? Also, do they need to be noindex, follow as well? Note - already removed the content from the filter pages and updated the meta tags as well. Please advise, thank you
On-Page Optimization | williamhuynh
-
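As an illustration of the canonicalisation approach described above, here is a minimal Python sketch that derives the main-category URL from a filter URL. The path shape (/collections/&lt;category&gt;/&lt;filter&gt;) is an assumption taken from the question, and the domain is a placeholder – this is not Shopify's actual routing logic.

```python
# Sketch: map a Shopify-style filter URL back to its main category URL
# by dropping the trailing filter segment. The /collections/<category>/<filter>
# path shape is an assumption based on the question above.
from urllib.parse import urlsplit, urlunsplit

def canonical_for_filter(url):
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    # /collections/<category>/<filter> -> /collections/<category>
    if len(segments) == 3 and segments[0] == "collections":
        segments = segments[:2]
    path = "/" + "/".join(segments)
    # Drop any query string and fragment from the canonical form
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

print(canonical_for_filter("https://shop.example.com/collections/living-room-furniture/black"))
```

The returned URL would then be emitted as the href of the page's rel="canonical" link.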
Is it better to keep a glossary of terms on one page or break it up into multiple pages?
We have a very large glossary of over 1000 industry terms on our site with links to reference material, embedded video, etc. Is it better for SEO purposes to keep this on one page, or should we break it up into multiple pages – a different page for each letter, for example? Thanks.
On-Page Optimization | KenW
-
Why are http and https pages showing different domain/page authorities?
My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain, because we didn't want to change too much, too quickly by moving to https. Only our shopping cart is using the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled, though, is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version. And the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks
On-Page Optimization | Aquatell
-
New site pages are indexed but not ranking for anything
I just built this site for a client: http://primedraftarchitecture.com. It went live 3 weeks ago and the pages are getting indexed as per Webmaster Tools, but I'm not seeing it rank for anything. We're adding blog articles regularly, used Moz Local for local links, and have been building links in other local directories (probably about 15 so far). Usually I get some rankings, although very low, after just a week or two for new sites. Does anyone see anything glaring that may be causing a problem?
On-Page Optimization | DonaldS
-
Different page for each product colour?
Hi Guys, I've just read an ecommerce article that suggests it's a good idea to have a different page for each colour that the product comes in. However, surely this will mean duplicate content? What are your thoughts? Have you put this tactic into motion, and how did it go? Thanks, Dan
On-Page Optimization | Sparkstone
-
Creating a sitemap
What is the best place to go to create an XML sitemap? I have used xml-sitemap.com but it only does 500 pages. Any suggestions? Does a sitemap have value on the SEO front?
On-Page Optimization | jgmayes
-
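If the 500-page cap mentioned above is the blocker, generating a sitemap yourself is straightforward. A minimal sketch using only Python's standard library; the URLs are placeholders, and real sitemaps can carry optional fields (lastmod, changefreq) that are omitted here:

```python
# Minimal XML sitemap generator using only the Python standard library.
# The page URLs below are placeholders -- substitute your own list.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # The urlset namespace is required by the sitemaps.org protocol
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

pages = [
    "https://www.example.com/",
    "https://www.example.com/about/",
]
print(build_sitemap(pages))
```

Write the result to sitemap.xml at the site root and reference it from robots.txt or submit it in Webmaster Tools.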
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting the canonical tag
Adding these URL variables to Google Webmasters to tell Google to ignore them
Changing the title tag in the head dynamically based on what URL variables are present
However I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | smaavie
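For what it's worth, the canonical-tag option discussed in the question can be made workable by normalizing every search URL to a single canonical form – dropping the pagination variable and sorting the remaining parameters, so that different orderings of the same filters map to one URL. A minimal Python sketch, assuming the parameter names from the question (the domain is a placeholder):

```python
# Sketch: normalize recipe-search URLs to one canonical form by dropping
# the pagination parameter ("start") and sorting the remaining query
# parameters. Parameter names are taken from the question above.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, ignore=("start",)):
    parts = urlsplit(url)
    # Keep only the meaningful filter parameters
    params = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignore]
    # Sorting makes parameter order irrelevant to the canonical form
    query = urlencode(sorted(params))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

print(canonical_url("https://example.com/find-a-recipe.php?course=salad&start=30"))
```

Each search-results page would then emit this normalized URL in its rel="canonical" link, collapsing the paginated and reordered variants into one indexable version per filter combination.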