Is placing content in subdirectories better for SERPs?
-
Hi,
For small websites with fewer than 6 pages, is there a benefit to structuring URL paths using keyword-rich subdirectories compared to keeping pages in the root of the site?
For example:
domainname.co.uk/keywordpagename.html
or
www.domainname.co.uk/keyword/keywordpagename.html
Which tends to get better rankings?
Thanks
-
I would also base it on the site content. Does the site content (product listing) warrant creating an extra subdirectory? For example, if you are selling cars, you could have a directory for Honda and then a subdirectory for each specific model, with the keyword "Honda" in the path.
In your case, since the site only has 6 pages, it would not make sense to create subdirectories.
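As a rough sketch of that kind of nesting, using the example domain from the question (the category and model names are just placeholders):
www.domainname.co.uk/cars/honda/
www.domainname.co.uk/cars/honda/honda-civic.html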
-
Given the size of the proposed site, in my view it's very unlikely that placing an extra keyword in the sub-directory would be of any benefit to ranking.
If the site is going to grow, then it would make sense to nest related content within a number of relevant 'parent' sub-directories; but if it's going to stay at that size, a flat structure is probably going to be equally effective.
Related Questions
-
Duplicate Content
Hi there, hoping someone can help me before I damage my desk banging my head. I'm getting notifications from Ahrefs and Moz for duplicate content. I have no idea where these weird URLs have come from, but they do take us to the correct page (it just seems to be a duplicate of that page). Correct URL: http://www.acsilver.co.uk/shop/pc/Antique-Vintage-Rings-c152.htm Incorrect URL: http://www.acsilver.co.uk/shop/pc/vintage-Vintage-Rings-c152.htm This is showing for most of our store categories 😞 Desperate for help as to what could be causing these issues. I had a technical member of the ecommerce software team go through the large sitemap files and they assured me it wasn't linked to the sitemap files. Gemma
Technical SEO | acsilver0 -
Duplicate content on job sites
Hi, I have a question regarding job boards. Many job advertisers will upload the same job description to multiple websites, e.g. Monster, Gumtree, etc. This would therefore be viewed as duplicate content. What is the best way to handle this if we want to ensure our particular site ranks well? Thanks in advance for the help. H
Technical SEO | HiteshP0 -
Forum on a Sub-domain - Thin Content?
I have a WordPress blog installed on my domain and now I intend to start a forum. I understand that the content on the forum would be thin content, which may attract Google penalties. So, would it be wise to start the forum on a sub-domain to avoid any penalty? My questions are: 1. If the content on the sub-domain is thin, can it impact my main domain as well? 2. Should I install the forum on a sub-domain or an entirely different domain so as to avoid any penalty? My preference is a sub-domain, provided Google does not levy any penalty. I also intend to display RSS feeds of the forum on the home page of the website.
Technical SEO | cakaranbatra0 -
Duplicate Content?
My site has been archiving our newsletters since 2001. It's been helpful because our site visitors can search a database for ideas from those newsletters. (There are hundreds of pages with similar titles: archive1-Jan2000, archive2-feb2000, archive3-mar2000, etc.) But I see they are being marked as "similar content," even though the actual page content is not the same. Could this adversely affect SEO? And if so, how can I correct it? Would a separate folder of archived pages with a "nofollow robot" solve this issue? And would my site visitors still be able to search within the site with a nofollow robot?
Technical SEO | sakeith0 -
Avoiding Cannibalism and Duplication with content
Hi, for the example I will use a computer e-commerce store... I'm working on creating guides for the store:
How to choose a laptop
How to choose a desktop
I believe that each guide will be great on its own and that it answers a specific question (meaning that someone looking for a laptop will search specifically for laptop info, and the same goes for desktop). This is why I didn't create a "How to choose a computer" guide. I also want each guide to contain all the information and not to start sending the user to secondary pages in order to fill in missing info. However, even though there are several details that differ between laptops and desktops, like the importance of weight, screen size, etc., a lot of things on the checklist (like deciding how much memory is needed, graphics card, core, etc.) are the same. Please advise on how to pursue it. Should I just write two guides and make sure that the same duplicated content ideas are simply written in a different way?
Technical SEO | BeytzNet0 -
External video content in iframe
Hi, on our site we have a lot of video content. The player is hosted by a third party, so we are using an iframe to include the content on our site. The problem is that the content itself (on the third-party domain) is shown in the Google results. My question is: can we ask the third party to disallow the content from indexing in their robots.txt, or will that also affect our own use of the video content? For example, we use video sitemaps to include the videos in Google video search (the sitemap links to the videos on our own domain, but we are still using iframes on the pages to pull the content from the third-party domain that would then be blocked by robots.txt). I hope you understand what I mean... Any suggestions? Thanks a lot!
Technical SEO | Googleankan0 -
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com - but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel canonical, but Google does not always seem to honour this. Would using meta noindex tags where appropriate be the way to go, or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin
Technical SEO | mreeves0