URL structure
-
Hello Guys,
Quick question regarding URL structure.
One of our clients is a hotel chain. They have a group site, www.example.com, and each property is located in a subfolder: www.example.com/example-boston.html, www.example.com/example-ny.html, etc.
My question is: where is it better to place the language extension at the subfolder level?
Should I go for www.example.com/en/example-ny.html, or is it preferable to specify the language after the property name, e.g. www.example.com/example-ny/en/accommodation.html? Thanks and Regards,
Alessio
-
I personally prefer the language subfolder closer to the domain, e.g. www.example.com/en/example-ny.html, but it really depends on which parts of your site you provide in alternate languages. If the only pages you translate are those below the **/example-ny/** sub-directory, I would place the language folder after that directory. But I highly doubt you only offer alternate languages on the pages that provide details about the property.
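Whichever level you pick, it helps to have each language version reference its alternates with hreflang so engines understand the relationship. Below is a minimal sketch, assuming the /en/ structure and the example file name from the question (the en/it language codes are purely illustrative); it expresses the hreflang annotations as HTTP Link headers in an Apache .htaccess with mod_headers, though the same annotations can just as easily sit in each page's HTML head.

```
# A sketch only: hreflang alternates as HTTP Link headers.
# Assumes Apache 2.4 with mod_headers; URLs, file name and language codes are illustrative.
<IfModule mod_headers.c>
  <Files "example-ny.html">
    # Each language version of the property page lists itself and its alternates.
    Header add Link "<http://www.example.com/en/example-ny.html>; rel=alternate; hreflang=en"
    Header add Link "<http://www.example.com/it/example-ny.html>; rel=alternate; hreflang=it"
  </Files>
</IfModule>
```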
Related Questions
-
Same URL, different Drupal content types
Hi all, I am working in Drupal which isn't always SEO-friendly. I want to convert some of our articles that are currently in an old article type to our new shiny longform template without losing SEO value. The process we use right now is to: change the URL of the old article in the CMS from /article-title to /article-title-old and then make the longform template /article-title in the CMS. Then hit publish. That way we can avoid having to mess with redirects. My concerns are that this will be seen as a bait and switch by Google. They are, after all, two separate pages — node-1 and node-2 on the back end — that are being smushed into the same skin aka same URL. I don't know if updating to the new template wipes out some of the info Google may have deemed important. I guess you could argue it's a redesign by CMS but I'm still not sure. Thoughts?
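For comparison, the more conventional route is to let one URL win outright and 301 the leftover copy to it, so the two near-duplicate nodes never compete in the index. A minimal .htaccess sketch, assuming Apache and the illustrative paths from the question (in Drupal this would more typically be handled with the Redirect module or a path alias change):

```
# A sketch only; assumes Apache and the illustrative paths from the question.
# Once the longform node owns /article-title, point the leftover "-old" copy at it.
RewriteEngine On
RewriteRule ^article-title-old$ /article-title [R=301,L]
```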
Technical SEO | webbedfeet -
URL Redirect
Hi All, So we have employees who can own their own domains for business. However, one employee has a domain that links back to our main site, but when it does, the URL and page title of our main site still show his own domain. I.e. www.johndoe.com links to www.mysite.com, except the URL and title still say www.johndoe.com. What are the implications of this? Thank you
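If the goal is simply to pass visitors (and any link equity) through to the main site rather than mask it, a plain 301 on the employee's domain avoids the behaviour described. A minimal .htaccess sketch, assuming Apache and the illustrative domain names from the question:

```
# A sketch only; assumes Apache, and the domain names are the illustrative ones from the question.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?johndoe\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```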
Technical SEO | PeteEllard -
URL not indexed but shows in results?
We are working on a site that has a whole section that is not indexed (well, a few pages are). There is also a problem where two directories have the same content, and it is the incorrect directory whose URLs are indexed. The problem is, if I do a search in Google to find a URL - typically location + term - then I get the URL (from the wrong directory) up there in the top 5. However, if I do a site: search for that URL, it is not indexed! What could be going on here? There is nothing in robots or the source, and GWT fetch works fine.
Technical SEO | MickEdwards -
Redirect URLs with 301 twice
Hello, I had asked my client to ask her web developer to move to a more simplified URL structure. There was a folder called "home" after the root which served no purpose. I asked for the URLs to be 301 redirected to new URLs without this folder. However, the web developer didn't agree and decided to just rename the "home" folder "p"; I don't know why he did this. We argued the case and he then created the URL structure we wanted. Initially he had 301 redirected the old URLs (the ones with "home") to his new version (the ones with "p"). When we asked for the more simplified URLs after arguing, he just redirected all the "p" URLs to the PAGE NOT FOUND page. As a result, all the original URLs are now ultimately being redirected to the PAGE NOT FOUND page. The problems I see, unless he redirects again, are these:
1. The new simplified URLs have to start from scratch to rank.
2. We have duplicate content: two URLs with the same content.
3. Customers clicking products in the SERPs will currently find that they are being redirected to the 404 page.
I understand that redirection has to occur, but my questions are these:
1. Is it OK to redirect twice with 301, so old URL to the "p" version and then to the final simplified version? Will link juice be lost doing this twice?
2. If he redirects from the original URLs to the final version, missing out the "p" version, what should happen to the "p" version? Those URLs are currently indexed.
Any help would be appreciated. Thanks
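In .htaccess terms, collapsing the chain into single hops would look roughly like this; a sketch only, assuming Apache, with the folder names taken from the question and the real page paths left out:

```
# A sketch only; assumes Apache, folder names are the illustrative ones from the question.
RewriteEngine On
# Send both the old "home" URLs and the interim "p" URLs straight to the
# simplified URL in one 301 hop, rather than chaining or returning 404s.
RewriteRule ^home/(.*)$ /$1 [R=301,L]
RewriteRule ^p/(.*)$ /$1 [R=301,L]
```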
Technical SEO | AL123al -
Which URL structure holds the best SEO value?
Hello Community! We are rewriting URLs to rank better and provide better visual usability to our visitors. Would it be better to serve a URL that looks like this: www.domain.com/category-subcategory, or like this: www.domain.com/category/subcategory? Please note the slight difference: the second URL calls out a category that has a subcategory under it. Would it give us more value? Does it make a difference? Thanks in advance!
Technical SEO | JCorp -
Spider Indexed Disallowed URLs
Hi there, In order to reduce the huge amount of duplicate content and titles for a client, we disallowed all spiders for some areas of the site in August via the robots.txt file. This was followed by a huge decrease in errors in our SEOmoz crawl report, which, of course, made us satisfied. In the meantime, we haven't changed anything in the back end, robots.txt file, FTP, website or anything. But our crawl report came in this November and all of a sudden all the errors were back. We've checked the errors and noticed URLs that are definitely disallowed. That these URLs are disallowed is also verified by Google Webmaster Tools and other robots.txt checkers, and when we search for a disallowed URL in Google, it says that it's blocked for spiders. Where did these errors come from? Was it the SEOmoz spider that ignored our disallow rules or something? You can see the drop and the increase in errors in the attached image (LAAFj.jpg). Thanks in advance.
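Worth keeping in mind alongside this: a robots.txt Disallow only blocks crawling; it does not remove URLs that are already indexed or that get linked to. If the aim is to get the duplicate area out of the index entirely, a noindex signal is the usual route, for example an X-Robots-Tag header. A minimal sketch, assuming Apache with mod_headers and a .htaccess placed inside the blocked directory (Google would need to be able to crawl the URLs long enough to see the header):

```
# A sketch only; assumes Apache with mod_headers, placed in the blocked directory's .htaccess.
# Every file served from this directory gets a noindex header.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```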
Technical SEO | ooseoo -
Approved Word Separators in URLs
Hi There, We are in the process of revamping our URL structure and my devs tell me they have a technical problem using a hyphen as a word separator. There's a whole lot of competing recommendations out there and at this point I'm just confused. Does anyone have any idea what character would be next best to the hyphen for separating words in a URL? Any reason to prefer one over another? Some links I've found discussing the topic:
- This page says that "Google has confirmed that the point (.), the comma (,) and the hyphen (-) are valid word separators in URL's": http://www.internetofficer.com/seo/google-word-separator/
- This page suggests the plus (+) symbol would be best: http://labs.phurix.net/posts/word-separators-in-urls
- This guy says he's tested and there's a whole bunch of symbols that will work as word separators: http://www.webproguide.com/articles/Symbols-as-word-separators-a-look-inside-the-search-engine-logic/
I'm leaning towards the tilde (~) or the plus (+) sign. Usage would be like so: http://www.domain.com/shop/sterling~silver OR /shop/sterling+silver etc. Thanks in advance for your help!
Technical SEO | Richline_Digital -
Redirect everything from a certain URL
I have a new domain (www.newdomain.com) and an old domain (www.olddomain.com). Currently both domains are pointing (via DNS nameservers) at the new site. I want to 301 everything that comes from www.oldsite.com to www.newsite.com. I've used this .htaccess code:

```
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www.newsite.com$
RewriteRule (.*) http://www.newsite.com/$1 [R=301,L]
```

This works fine and redirects if someone visits www.olddomain.com, but I want it to cover everything from the old domain, such as www.olddomain.com/archives/article1/ etc. So if any subpages etc. are visited on the old domain, they are redirected to the new domain. Could someone point me in the right direction? Thanks
Technical SEO | EclipseLegal
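For reference, a variation that targets the old domain explicitly (with and without www) rather than matching anything that is not the new site; a sketch only, assuming Apache and the illustrative domain names from the question:

```
# A sketch only; assumes Apache, domain names are the illustrative ones from the question.
RewriteEngine On
# Match the old domain with or without www and carry the requested path across.
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newsite.com/$1 [R=301,L]
```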