Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html.
My question is: is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use .htaccess to point that URL to a single script that parses the URL structure and builds the appropriate page.
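For illustration, the .htaccess side of that plan would typically be a standard front-controller rewrite of roughly this shape (a minimal sketch; the script name front.php and the query-string parameter are hypothetical, not taken from the original post):

```apache
RewriteEngine On
# Only rewrite requests that don't match a real file or directory
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Hand the full requested path to one script, keeping any query string
RewriteRule ^(.*)$ front.php?path=$1 [L,QSA]
```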
Do search engine spiders know the difference between these two models, and do they prefer one over the other? From our standpoint, managing a single .htaccess file and a handful of page-building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the .htaccess model wouldn't be considered some kind of black-hat scheme, would it?
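The "single script" side of the plan is easy to sketch. Below is a minimal, hypothetical example in Python (the original post doesn't say what language the page-building scripts use, and the function and key names are illustrative only) of parsing the rewritten path into its components:

```python
from urllib.parse import urlparse

def parse_page_path(url):
    """Split a URL like /topic/category/subcategory/title.html into its parts.

    The four-level scheme mirrors the structure described above; the key
    names are illustrative only. Returns None if the path doesn't match.
    """
    parts = [p for p in urlparse(url).path.split("/") if p]
    if len(parts) != 4 or not parts[3].endswith(".html"):
        return None
    topic, category, subcategory, title = parts
    return {
        "topic": topic,
        "category": category,
        "subcategory": subcategory,
        "title": title[: -len(".html")],
    }
```

A front controller would call something like this, look the pieces up in its content store, and render the page, serving a 404 when the function returns None.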
Thank you again for the help and looking forward to your thoughts!
-
At a fundamental level, you are storing the data somewhere and it is being rendered correctly on request. In a CMS, that data lives in a database, completely outside the search engine's view. So it makes no difference whether the content sits in a database or in a physical directory, and there is no benefit to mirroring the URL structure with real directories on disk.
Having said that, in my own experience (we manage a website with millions of pages), managing this with a hand-rolled .htaccess script is NOT a good idea. You will be limited in what you can do, and maintenance will become quite challenging.
I strongly suggest considering a move to a CMS (such as Drupal), storing all your content in a database, and letting the CMS handle the .htaccess rewriting plus give you other goodies. There are several tools available to import your content from disk into a database.
-
Search engines can't tell the difference, so it's all good.
-
I believe that the preferred method is the .htaccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page values. I didn't see any negative effects using this method of restructuring. I had about 6,000 pages that each had to change URL; it was a nightmare. We migrated to a completely new platform and file server, so we had to change every URL.
I hope that is helpful. I don't see one method benefiting your rankings more than the other. I would suggest doing whatever involves the least amount of work, is the cleanest way to do it, and will, in the long run, keep your URLs clean and free of erroneous information.
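As a hypothetical illustration of the kind of dynamic redirect described here (the old and new paths are invented for the example, reusing the URL scheme from the question):

```apache
RewriteEngine On
# 301-redirect an old flat URL to its new hierarchical location,
# preserving link equity and avoiding 404s
RewriteRule ^old-overview\.html$ /topic-name/category-name/subcategory-name/title.html [R=301,L]
```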