Organizing Website Content
-
Hey everyone!
I am looking for some opinions on organizing website content. Here are two thoughts. I am open to alternative suggestions as well. Thanks for any consideration in this matter!
**Aviation Supply Store, Thought A** (categorized by aircraft type):
- **Helicopters:** Helicopter Engines, Helicopter Fuels, Helicopter Accessories
- **Airplanes:** Airplane Engines, Airplane Fuels, Airplane Accessories
- **Rockets:** Rocket Engines, Rocket Fuels, Rocket Accessories

**Aviation Supply Store, Thought B** (categorized by product type):
- **Engines:** Helicopter Engines, Airplane Engines, Rocket Engines
- **Fuels:** Helicopter Fuels, Airplane Fuels, Rocket Fuels
- **Accessories:** Helicopter Accessories, Airplane Accessories, Rocket Accessories
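In URL terms, the two structures would nest in opposite directions. Here is a small sketch of the hypothetical paths each one might produce (the URL layout is purely illustrative, nothing is decided yet):

```python
# Hypothetical URL paths for the two category structures.
# Thought A nests product type under aircraft type;
# Thought B nests aircraft type under product type.
craft = ["helicopter", "airplane", "rocket"]
products = ["engines", "fuels", "accessories"]

thought_a = [f"/{c}s/{c}-{p}/" for c in craft for p in products]
thought_b = [f"/{p}/{c}-{p}/" for p in products for c in craft]

print(thought_a[0])  # /helicopters/helicopter-engines/
print(thought_b[0])  # /engines/helicopter-engines/
```

Either way the site ends up with the same nine category pages; the question is just which level of the hierarchy comes first.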
I simply chose aviation as an example. I'm just having difficulty deciding on how best to categorize.
Thank you!
-
Boomajoom, I like that idea. Space is a little bit of an issue right now, but I will keep that thought around for a while. Thank you!
-
I am wondering about the possibility of duplicate content as well. In terms of keyword research, they are all pretty strong. Hmm, I am going to give it more thought for now. Thanks again.
-
Hi Mark, thanks for the reply. I was thinking the same thing, although I was also a bit worried about duplicate content issues, as Margarita S mentioned. Titles and headings could be fairly similar apart from the aircraft type. Could this be a duplicate content issue?
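For example, two category titles that differ only by aircraft type come out very similar under a quick string-similarity check (the titles here are made up just to illustrate the worry):

```python
# Rough similarity check between two hypothetical category page titles
# that differ only by the aircraft type.
from difflib import SequenceMatcher

t1 = "Helicopter Fuels | Aviation Supply Store"
t2 = "Airplane Fuels | Aviation Supply Store"

similarity = SequenceMatcher(None, t1, t2).ratio()
print(f"similarity: {similarity:.0%}")
```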
-
Agree with the suggestion about doing keyword research. Assuming that checks out for you, why not categorize according to both ideas? Have the largest volume categories at the top, say thought A, and do the thought B categories in a sidebar?
-
Hi Quilbur,
Honestly, the best suggestion I have for you is to do some keyword research and get a sense of how your target audience is searching for the content you have. I think that will be the most useful way of doing it. Also, assess if there's some overlap on the content you have. Would you be duplicating content if you talk about "helicopter fuels" and "airplane fuels"?
Hope this helps!
MS
-
Quilbur, you always want to categorize by the most specific heading; this will always help you in both paid and organic search results.
Airplane, helicopter, and rocket are much more specific search headings, so your subheadings of engines, fuels, and accessories make more sense as secondary keywords.
Hope this helps.
Mark