Extensions vs. Non-Extensions
-
Hello,
I'm a big fan of clean URLs. However, I'm curious what you all do to remove file extensions in a friendly way that doesn't cause confusion.
Standard URLs
http://www.example.com/example1.html
http://www.example.com/example2.html
http://www.example.com/example3.html
http://www.example.com/example4.php
http://www.example.com/example5.php
What looks better (in my eyes):
http://www.example.com/example1/
http://www.example.com/example2/
http://www.example.com/example3/
http://www.example.com/example4/
http://www.example.com/example5/
Do you keep extensions throughout your website, avoiding any sort of confusion and page duplication;
OR
Put a canonical link on each page pointing to the extension-less version, with the expectation that this version gets indexed by Google and other search engines (a sketch of this follows the list below).
OR
301-redirect each page that has an extension to the extension-less version, and remove all links to ".html" site-wide, which throws errors within software like Dreamweaver but works properly on the live site.
OR
Another way? Please explain.
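To illustrate the canonical option, this is roughly what I mean — a minimal sketch using the hypothetical example URLs above, placed inside the <head> of the ".html" page and pointing at the extension-less address:

<!-- Hypothetical sketch: inside the <head> of example1.html -->
<link rel="canonical" href="http://www.example.com/example1/" />

(My understanding is that a canonical is only a hint to search engines, so the ".html" versions can still be crawled and linked to; it addresses duplication in the index rather than removing the old URLs.)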
I'm sorry if this is a little vague, and I appreciate any angles on this. I quite like clean URLs but am unsure of a hassle-free way to implement them.
Thanks for any advice in advance
-
Thanks for your answer on this,
I think this will be the right road to go down.
Alex
-
Hi,
Yes, 301s are the best way to get search engines to forget about your past URLs.
URLs ending in ".html" are not clean URLs, so if you can, get rid of the extension.
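For example, if your site runs on Apache with mod_rewrite available (I'm assuming that here — check with your host), a minimal .htaccess sketch along these lines would 301 the old ".html" addresses to the trailing-slash versions while still serving your existing files:

# Hypothetical sketch - assumes Apache with mod_rewrite enabled
RewriteEngine On

# 301 old requests like /example1.html to the clean /example1/
RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.html[\s?] [NC]
RewriteRule ^ /%1/ [R=301,L]

# Quietly serve example1.html when /example1/ is requested (no loop,
# because THE_REQUEST above only matches the original browser request)
RewriteCond %{DOCUMENT_ROOT}/$1.html -f
RewriteRule ^([^/]+)/?$ $1.html [L]

Your .php pages would need an equivalent pair of rules, and you should still update the internal links so you are not relying on the redirects for every click.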
However, I cannot help you with the Dreamweaver issues, as I don't use it... hopefully someone else can help you with that.
Best,
-
Thanks Benoit!
I want to keep the clean URLs, so would you say 301s are the best way to handle this, along with deleting all of the ".html" references within my website, even though it would throw up errors within Dreamweaver etc.?
-
Hi,
Good point. As I see it, website visitors do not care about the ".something" extension; it is just a development detail. So the best approach is to avoid it, as people will remember the URL, not the extension.
The best approach is to use 301 redirects and keep the clean URLs; there is no need to keep duplicate pages.
Let me know if that helps
Best,
- Benoit.