Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html and so on.
My question is… is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use HTACCESS to point that URL to a single script that parses the URL structure and makes the page appropriately.
Do search engine spiders know the difference between these two models and prefer one over the other? From our standpoint, managing a single HTACCESS file and a handful of page building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the HTACCESS model wouldn't be considered some kind of black hat scheme, would it?
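The rewrite model described above can be sketched in a few lines of mod_rewrite, assuming Apache. The router script name (`index.php`) and the `path` parameter are hypothetical placeholders, not from the original post:

```apache
# .htaccess — route every pretty URL to a single front-controller script.
# Requires mod_rewrite. "index.php" and "path" are illustrative names only.
RewriteEngine On

# Serve real files and directories (images, CSS, etc.) directly if they exist.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# Everything else goes to the router, which parses the path segments
# (topic-name/category-name/subcategory-name/title.html) and builds the page.
RewriteRule ^(.*)$ index.php?path=$1 [L,QSA]
```

Because this is an internal rewrite (not a redirect), the URL a visitor or crawler sees never changes; only the server-side handling differs.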
Thank you again for the help and looking forward to your thoughts!
-
At a fundamental level, you are storing the data somewhere and rendering it correctly. In a CMS that data lives in a database, completely outside the search engine's view, so it does not matter whether it sits in a database or in a physical directory. There is no extra benefit to mirroring the URL structure physically on disk.
Having said that, in my own experience (we manage a website with millions of pages), managing this with a hand-rolled HTACCESS script is NOT a good idea. You will be limited in what you can do, and maintenance will be quite challenging.
I strongly suggest considering a move to a CMS (like Drupal): store all your content in a database and let the CMS take care of the HTACCESS rules, plus it gives you other goodies. There are several tools available to get your content from disk into a database.
-
Search engines can't tell the difference, so you're all good.
-
I believe that the preferred method is the HTAccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page value. I didn't see any negative effects using this method of restructuring. I had about 6,000 pages that each had to change URL; it was a nightmare. We migrated to a completely new platform and file server, so we had to change URLs.
I hope that is helpful. I don't see one method benefiting you in the engines more than the other. I would suggest doing whatever will be the least amount of work, will be the cleanest to implement, and will keep your URLs clean and free of erroneous information in the long run.
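The "dynamic redirect" used in a migration like the one described above can also be sketched in mod_rewrite. The old and new paths here are hypothetical examples, not the poster's actual URLs:

```apache
# .htaccess — permanently redirect legacy URLs to the new structure so that
# page value (link equity) follows the move and visitors never hit a 404.
RewriteEngine On

# A single legacy page mapped to its new home (paths are illustrative).
RewriteRule ^old-section/some-page\.html$ /topic-name/category-name/some-page.html [R=301,L]

# A whole legacy directory folded into a new category with one pattern.
RewriteRule ^old-section/(.*)$ /topic-name/category-name/$1 [R=301,L]
```

The `R=301` flag issues a permanent redirect, which is what tells search engines to transfer the old URL's value to the new one.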