Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html and so on.
My question is: is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use .htaccess to point that URL to a single script that parses the URL structure and builds the page appropriately.
Do search engine spiders know the difference between these two models, and do they prefer one over the other? From our standpoint, managing a single .htaccess file and a handful of page-building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the .htaccess model wouldn't be considered some kind of black-hat scheme, would it?
Thank you again for the help and looking forward to your thoughts!
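For context, the rewrite side of the plan is small. A minimal sketch of the .htaccess half (the script name `router.php` is an assumption for illustration, not something from this thread):

```apache
# Enable the rewrite engine for this directory
RewriteEngine On

# Leave requests for real files and directories (images, CSS, etc.) alone
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# Send every other .html request to one front-controller script,
# passing the original path along as a query parameter
RewriteRule ^(.+\.html)$ router.php?path=$1 [L,QSA]
```

With a rule like this, visitors and crawlers see only the clean URL; the single script receives the path and decides what to render.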
-
At a fundamental level, you are storing the data somewhere and it is being rendered correctly. In a CMS, that data lives in a database, completely outside the search engine's view, so it does not matter whether it sits in a database or in a physical directory. There is no benefit to mirroring the URL structure on disk.
Having said that, in my own experience (we manage a website with millions of pages), managing this with .htaccess rules alone is NOT a good idea. You will be limited in what you can do, and maintenance will be quite challenging.
I strongly suggest considering a move to a CMS (like Drupal): store all your content in a database and let the CMS take care of the rewrites, plus give you other goodies. There are several tools available to get your content from disk into a database.
-
Search engines can't tell the difference, so all good.
-
I believe that the preferred method is the .htaccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page values. I didn't see any negative effects using this method of restructuring. I had about 6,000 pages that each had to change URL, and it was a nightmare. We migrated to a completely new platform and file server, so we had to change URLs.
I hope that is helpful. I don't see one method benefiting you in the engines more than the other. I would suggest doing whatever will be the least amount of work, will be the cleanest way to do it, and will in the long run keep your URLs clean and free of erroneous information.
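To illustrate the "single script that parses the URL" half of the original plan, here is a minimal sketch in Python. The function name and the fixed four-level path are assumptions based on the question, not anything prescribed in this thread:

```python
from urllib.parse import urlparse

def parse_content_path(url):
    """Split a rewritten URL like
    /topic-name/category-name/subcategory-name/title.html
    into its four components, or return None if it doesn't match."""
    path = urlparse(url).path.strip("/")
    parts = path.split("/")
    if len(parts) != 4 or not parts[3].endswith(".html"):
        return None  # not a content page URL; let the server 404 it
    topic, category, subcategory, page = parts
    return {
        "topic": topic,
        "category": category,
        "subcategory": subcategory,
        "title": page[: -len(".html")],
    }

# The real router would look these values up in a database (or on
# disk) and render the matching page, or return a 404 on None.
print(parse_content_path(
    "http://oursite.com/cooking/desserts/cakes/chocolate.html"))
```

Either way the URL the crawler sees is identical, which is why the answers above agree that search engines cannot distinguish the two models.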