Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html, etc.
My question is… is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use HTACCESS to point that URL to a single script that parses the URL structure and builds the appropriate page.
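For reference, a minimal sketch of what we had in mind, assuming Apache with mod_rewrite enabled (the script name render.php and the parameter names are placeholders, just for illustration):

RewriteEngine On
# Only rewrite URLs that don't correspond to a real file or directory
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Hand the whole pretty URL to one front-controller script as query parameters
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/([a-z0-9-]+)/([a-z0-9-]+)\.html$ render.php?topic=$1&category=$2&subcategory=$3&title=$4 [L,QSA]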
Do search engine spiders know the difference between these two models and prefer one over the other? From our standpoint, managing a single HTACCESS file and a handful of page building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the HTACCESS model wouldn't be considered some kind of black hat scheme, would it?
Thank you again for the help and looking forward to your thoughts!
-
At a fundamental level, you are keeping the data somewhere and it is being rendered correctly. In a CMS, that data is stored in a database, completely outside the search engine's view, so it does not matter whether it lives in a database or in a physical directory. There is no benefit to mirroring the URL structure on disk.
Having said that, in my own experience (we manage a website with millions of pages), managing this with a hand-written HTACCESS script is NOT a good idea. You will be limited in what you can do, and maintenance will be quite challenging.
I strongly suggest considering a move to a CMS (like Drupal) and storing all your content in a database; the CMS takes care of the HTACCESS rewrites and gives you other goodies besides. There are several tools available for getting your content from disk into a database.
-
Search engines can't tell the difference, so you're all good.
-
I believe that the preferred method is the HTAccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page values. I didn't see any negative effects from using this method of restructuring. We migrated to a completely new platform and file server, so we had to change URLs; I had about 6,000 pages that each had to change URL, and it was a nightmare.
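As a rough illustration of the kind of rule involved (the paths here are invented, not from our actual migration), a one-off 301 in HTAccess that maps an old flat URL into a new hierarchy looks like this:

RewriteEngine On
# Permanently redirect an old flat URL to its new hierarchical home,
# passing the old page's value along instead of serving a 404
RewriteRule ^old-page\.html$ /topic-name/category-name/subcategory-name/title.html [R=301,L]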
I hope that is helpful. I don't see one method benefiting you in the engines more than the other. I would suggest doing whatever involves the least amount of work, is the cleanest to implement, and will, in the long run, keep your URLs clean and free of erroneous information.