What are the advantages and disadvantages of having multiple folders in a URL?
-
Example:
http://www.domain.com.ph/property-for-sale/city/area/
(3 folders)
Would it be better to just use http://www.domain.com.ph/property-for-sale-area-city/ (all pages under one folder)?
Thanks in advance!
-
Thanks Vadim!
-
Similar questions asked before with great responses:
http://moz.com/community/q/url-length
http://moz.com/community/q/maximum-length-of-the-url-for-seo-75-115
Also, putting too many pages in one directory using the http://www.domain.com.ph/property-for-sale-area-city/ method could lead to server issues, depending on your type of server. But that only applies if you are creating thousands of URLs in one directory, so it might not be relevant.
Hope this helps
-
As URL length is a relevant ranking factor (shorter is better), you just have to check which one is shorter... see this link for current ranking factors (a Spearman correlation of 0.16).
This statement does not cover technical issues, of course...
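For reference, here is a quick sketch (using the example URLs from the question) that compares the two structures by total length and folder depth; both come out the same length, so the real difference here is depth:

```python
from urllib.parse import urlparse

# The two candidate structures from the question
urls = [
    "http://www.domain.com.ph/property-for-sale/city/area/",   # 3 folders
    "http://www.domain.com.ph/property-for-sale-area-city/",   # 1 folder
]

for url in urls:
    path = urlparse(url).path
    # Folder depth = number of non-empty path segments
    depth = len([seg for seg in path.strip("/").split("/") if seg])
    print(f"length={len(url)}  depth={depth}  {url}")
```

In this particular example the flat URL is no shorter than the nested one, so any length-based argument is a wash and the choice comes down to site architecture.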