All page files in root? Or use directories?
-
We have thousands of pages on our website (news articles, forum topics, download pages, etc.), and at present they all reside in the root of the domain /.
For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would be the following:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we have any benefit in using directories for SEO purposes? Would our current system perhaps mean too many files in the root /, flagging as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, what would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/?
Just looking for some clarity on our problem!
Thank you for your help, guys!
-
To my knowledge there hasn't been a definitive conclusion on this one.
The general advice as I know it seems to be: they are equally good; pick one, and make sure the other version (with a slash if you choose to go without, or vice versa) 301-redirects to the chosen one, to avoid duplicate content.
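For what it's worth, here is a minimal sketch of that kind of canonicalisation, assuming a Python/Flask stack (the thread doesn't say what the site actually runs on; on Apache or nginx the same thing is usually done with a rewrite rule). It 301-redirects any extensionless path that is missing its trailing slash:

from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def enforce_trailing_slash():
    # Leave real files (e.g. /sitemap.xml) and already-canonical paths alone.
    path = request.path
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        query = request.query_string.decode()
        return redirect(path + "/" + ("?" + query if query else ""), code=301)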
-
I would personally place the keywords at the end for clarity. It does seem unnatural to have the ID as the final part of the URL. Even if that costs you a tiny bit of 'keyword power', I would gladly sacrifice that in exchange for a more user-friendly URL.
Limiting the number of words in the URL does indeed make it look slightly less spammy, but slightly less user-friendly as well. I guess this is just one of those 'weigh the pros and cons and decide for yourself' situations. Just make sure the URLs don't get ridiculously long.
-
OK, so I have taken it upon myself to now have our URLs as follows:
/news/853/free-flight-simulator/
Anything else gets 301'd to the correct URL. /news/853/free-flight-simulator would be 301'd to /news/853/free-flight-simulator/ along with /news/853/free-flight-sifsfsdfdsfmulator/ ... etc.
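For illustration, a minimal sketch of that rule, assuming a Python/Flask stack and a hypothetical slug lookup (neither is stated in the thread): the numeric ID is treated as the source of truth, and any request with a wrong or missing slug gets a 301 to the canonical URL for that ID.

from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical lookup; in practice this would come from the articles database.
CANONICAL_SLUGS = {853: "free-flight-simulator"}

@app.route("/news/<int:article_id>/", defaults={"slug": ""})
@app.route("/news/<int:article_id>/<slug>/")
def news_article(article_id, slug):
    canonical = CANONICAL_SLUGS.get(article_id)
    if canonical is None:
        abort(404)
    if slug != canonical:
        # Wrong or missing slug: 301 to the one canonical URL for this ID.
        return redirect(f"/news/{article_id}/{canonical}/", code=301)
    return f"Rendering article {article_id} ({canonical})"

A request for /news/853/free-flight-sifsfsdfdsfmulator/ then lands on /news/853/free-flight-simulator/ with a single 301.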
-
Also, trailing slash? Or no trailing slash?
Without
/downloads/878/fsx-concorde
With
/downloads/878/fsx-concorde/
-
Dear Theo,
Thank you for your response - I found your article very interesting.
So, just to clarify - in our case, the best URL method would be:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
This would remove the suffixes and also have the ID numbers at the end, placing the target keywords closer to the root of the URL, which makes a very slight difference...
EDIT: Upon thinking about it, I feel that the final keyword-targeted page would be more natural if it appeared at the end of the URL. For example: /images/6816/aosta-valley/ (like you have done on your blog).
Also, should I limit the number of hyphenated words in the URL? For example, on your blog you have /does-adding-a-suffix-to-my-urls-affect-my-seo/ - perhaps it would be more concentrated and less spammy as /adding-suffix-urls-affect-seo/?
Let me know your thoughts.
Thank you for your help!
-
Matt Cutts states that the number of subfolders 'is not a major factor': http://www.youtube.com/watch?v=l_A1iRY6XTM
Furthermore, here is a blog post I wrote about removing suffixes: http://www.finishjoomla.com/blog/5/does-adding-a-suffix-to-my-urls-affect-my-seo/
And another Matt Cutts video regarding your separate question about keyword order: http://www.youtube.com/watch?v=gRzMhlFZz9I
Having some structure (in the form of a single subfolder) would greatly add to the usability of your website, in my opinion. If you can manage to use the correct redirects (301) from your old pages to your new ones, I can't see a clear SEO-related reason not to switch.
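To make the redirect point concrete, here is a hedged sketch in Python of how the old root-level URLs could be mapped to 301 targets in the new scheme. It assumes the letter before the ID in the old file names (i, d, t) identifies the section, and that the new URLs use the /section/id/keywords/ form discussed above; both are assumptions based only on the example URLs in this thread.

import re

# Assumed mapping from old ID prefixes to new sections, inferred from the examples above.
SECTION_BY_PREFIX = {"i": "images", "d": "downloads", "t": "forums"}

OLD_URL = re.compile(r"^/(?P<slug>[a-z0-9-]+)-(?P<prefix>[idt])(?P<id>\d+)\.html$")

def new_location(old_path):
    """Return the 301 target for an old root-level URL, or None if it doesn't match."""
    match = OLD_URL.match(old_path)
    if not match:
        return None
    section = SECTION_BY_PREFIX[match.group("prefix")]
    return f"/{section}/{match.group('id')}/{match.group('slug')}/"

assert new_location("/aosta-valley-i6816.html") == "/images/6816/aosta-valley/"
assert new_location("/flight-sim-concorde-d1101.html") == "/downloads/1101/flight-sim-concorde/"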
Related Questions
-
Merging Pages and SEO
Hi, we are redesigning our website the following way. Before: Page A with Content A, Page B with Content B, Page C with Content C, etc. (e.g. one page for each of Customer Returns, Overstocks, Master Case, etc.). Now: Page D with content A + B + C, etc. (e.g. one long page containing all Product Conditions, one after the other). So we are merging multiple pages into one. What is the best way to do this so we don't lose traffic (or lose the minimum possible)? E.g. should we 301 redirect A/B/C to D? Is it likely that we lose significant traffic with this change? Thank you
Intermediate & Advanced SEO | viatrading1
-
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily, and reindex each page as resources are able to fill in content. My question is whether an individual page will be able to accrue any page authority for that target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is, if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
Intermediate & Advanced SEO | THandorf
-
Using subdomains for related landing pages?
Seeking subdomain usage and related SEO advice... I'd like to use multiple subdomains for multiple landing pages, all with content related to the main root domain. Why? Cost: so I only have to register one domain. One root domain for better 'branding'. Multiple subdomains that each focus on one specific reason and set of specific keywords people would search when looking for a solution to their reason to hire us (or our competition).
Intermediate & Advanced SEO | nodiffrei
-
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for a sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked as "noindex, follow". At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current natural SEs profile? Or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
Intermediate & Advanced SEO | fablau
-
Duplicate Content From Indexing of Non-File-Extension Page
Google somehow has indexed a page of mine without the .html extension. So they indexed www.samplepage.com/page, and I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google or Bing or whoever to only index and see the page including the .html extension? I know people are saying not to use the file extension on pages, but I want to, so please anybody... HELP!!!
Intermediate & Advanced SEO | WebbyNabler
-
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple spots in the site navigation. Site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you think this is a workable/acceptable solution?
Intermediate & Advanced SEO | gregelwell
-
Using 2 wildcards in the robots.txt file
I have a URL string which I don't want to be indexed. It includes the characters _Q1 in the middle of the string. So in the robots.txt, can I use 2 wildcards in the string to take out all of the URLs with that in it? So something like /_Q1. Will that pick up and block every URL with those characters in the string? Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. So do I have to format the robots.txt as //_Q1* as it will be in the second folder, or will just using /_Q1 pick up everything no matter what folder it is in? Thanks.
Intermediate & Advanced SEO | seo123456
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select from filters. We want to dynamically insert elements into the URL as the filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but this is only an example of achieving this for static content. We would prefer to go down a route that didn't involve hashbangs if possible. Does anyone have any experience using hashbangs and how it affected their site? Any advice on the above would be gratefully received.
Intermediate & Advanced SEO | Sayers