Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html.
My question is… is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use .htaccess to point that URL at a single script that parses the URL structure and builds the page appropriately.
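For what it's worth, the rule the asker describes is a common front-controller pattern. A minimal sketch might look like this (the script name `page.php` and the query-parameter name are assumptions, not anything from the original post):

```apache
# Route any four-level .html URL to a single page-building script,
# passing the requested path along as a query parameter.
RewriteEngine On
# Only rewrite when the request doesn't match a real file on disk.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/([^/]+)\.html$ /page.php?path=$1/$2/$3/$4 [L,QSA]
```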
Do search engine spiders know the difference between these two models, and do they prefer one over the other? From our standpoint, managing a single .htaccess file and a handful of page-building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the .htaccess model wouldn't be considered some kind of black-hat scheme, would it?
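For illustration, the parsing step inside that single page-building script might look something like this (a hypothetical Python sketch; the function name and URL shape are assumptions based on the structure described above):

```python
# Hypothetical front controller: split a rewritten request path into the
# pieces a page-building script would need. Illustrative only.

def parse_path(path):
    """Split '/topic/category/subcategory/title.html' into its parts."""
    parts = [p for p in path.strip("/").split("/") if p]
    # Any URL that doesn't match the expected four-level .html shape
    # should fall through to the application's 404 handling.
    if len(parts) != 4 or not parts[-1].endswith(".html"):
        return None
    topic, category, subcategory, page = parts
    return {
        "topic": topic,
        "category": category,
        "subcategory": subcategory,
        "title": page[: -len(".html")],
    }

print(parse_path("/widgets/blue/small/best-blue-widget.html"))
```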
Thank you again for the help and looking forward to your thoughts!
-
At a fundamental level, the data lives somewhere and is rendered correctly at request time. In a CMS, that data is stored in a database, completely outside the search engine's view. It does not matter whether a page comes from a database or from a physical directory, so there is no benefit to mirroring the URL structure as real directories.
Having said that, in my own experience (we manage a website with millions of pages), managing this with a hand-rolled .htaccess script is NOT a good idea. You will be limited in what you can do, and maintenance will be quite challenging.
I strongly suggest considering a move to a CMS (like Drupal): store all your content in a database, and the CMS takes care of the .htaccess rewriting plus gives you other goodies. There are several tools available to get your content from disk into a database.
-
Search engines can't tell the difference, so you're all good.
-
I believe that the preferred method is the .htaccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page value. I didn't see any negative effects from restructuring this way. We migrated to a completely new platform and file server, so we had to change URLs; I had about 6,000 pages that each had to change URL, and it was a nightmare.
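A dynamic redirect of the kind described here, mapping an old flat URL pattern onto the new nested structure, might be sketched as follows (every path in this snippet is hypothetical; the original post names no specific URLs):

```apache
# Permanently redirect an old flat URL pattern to the new nested one,
# so existing links and rankings carry over instead of returning 404s.
RewriteEngine On
RewriteRule ^old-pages/([^/]+)\.html$ /topic-name/category-name/subcategory-name/$1.html [R=301,L]
```

Using `R=301` (rather than the default temporary 302) signals to search engines that the move is permanent, which is what preserves the old pages' value.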
I hope that is helpful. I don't see one method benefiting you in the engines more than the other. I would suggest doing whatever is the least amount of work, is the cleanest to implement, and in the long run keeps your URLs clean and free of erroneous information.