URL construction in 2014
-
Hey guys,
I was wondering if you could tell me your thoughts on how a URL is perceived by Google's algorithm in 2014?
For example:
http://www.moneyexpert.com/reviews/credit-cards/amex-platinum/
and, let's say,
http://www.moneyexpert.com/reviews_credit-cards_review_amex-platinum.html
In the eyes of Google, do both URL styles help it understand the page equally well, or will the keyword-rich .html URL have a bigger benefit?
I am looking forward to your advice on this matter. I don't plan on doing a lot of SEO, but rather letting nature take its course, so to speak... so I just wanted to make sure I construct this site with best practice in mind.
-
Don't use underscores. Underscores make your URL look spammy, and so do long URLs.
Go for the first example for sure.
-
Thanks very much for responding.
I look forward to hearing a few more opinions on this if anyone else can help please.
-
Always use hyphens instead of underscores. There have been numerous tests showing that underscores are not a good signal for a word separator, and a few people at Google have mentioned this in videos over the years, from what I remember. However, Google's John Mueller has said recently that the URL doesn't even matter and that they don't use it as a signal. I call BS on that for now and would 100% go with the first one.
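The hyphen advice is easy to bake into slug generation when the site is built. A minimal sketch (the function name and rules here are my own illustration, not from any Moz tool):

```python
import re

def make_slug(title: str) -> str:
    """Convert a page title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace underscores and runs of whitespace with single hyphens,
    # since search engines treat hyphens as word separators.
    slug = re.sub(r"[_\s]+", "-", slug)
    # Drop any remaining characters that are not alphanumeric or hyphens.
    slug = re.sub(r"[^a-z0-9-]", "", slug)
    # Collapse repeated hyphens and trim them from the ends.
    return re.sub(r"-{2,}", "-", slug).strip("-")

print(make_slug("Amex Platinum_Review!"))  # amex-platinum-review
```

Running every generated URL through one function like this keeps the whole site on a consistent, hyphenated scheme from day one.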
Also, imagine a situation where you need to take action on a whole directory for some reason. I would make sure your URLs looked like the first example so that you could use robots.txt or Google Webmaster Tools to remove or change a specific set of pages in one go.
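To make that concrete: with the directory-style URLs from the first example, one robots.txt rule covers the whole section (paths are illustrative, taken from the question above):

```text
User-agent: *
# Block crawling of every credit-card review with a single prefix rule.
Disallow: /reviews/credit-cards/
```

With the flat, underscore-joined filenames there is no shared path prefix to match, so each page would have to be handled individually.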
Related Questions
-
URL structure for new product launch
Hello, I work for a company (let's call it CompanyX) that is about to launch a new product, let's call it ProductY. www.CompanyX.com is an old domain with good domain authority. The market ProductY is launching into is extremely competitive. The marketing department wants to launch ProductY on a new website at www.ProductY.com.
Intermediate & Advanced SEO | Lvet
My opinion is that we should instead create a subfolder with product information at www.CompanyX.com/ProductY. By doing this we could leverage the existing domain authority of CompanyX.com. Additionally, for campaigns, and in order to have a more memorable URL, we could use ProductY.com with a 301 redirect to www.CompanyX.com/ProductY. What do you think is the best strategy from an SEO point of view? Cheers,
Luca
-
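The memorable-domain-plus-301 setup described in that question can be done at the web server. A hedged sketch for Apache, assuming ProductY.com is pointed at the same server (the domain names are the question's placeholders, not real sites):

```apache
# In the virtual host serving ProductY.com: permanently redirect every
# request to the subfolder on the main domain. Redirect preserves the
# requested path, so /pricing becomes /ProductY/pricing.
<VirtualHost *:80>
    ServerName producty.com
    ServerAlias www.producty.com
    Redirect permanent / https://www.companyx.com/ProductY/
</VirtualHost>
```

This way the campaign URL stays memorable while all link equity consolidates on the established domain.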
Does rewriting a URL affect the page authority?
Hi all, I recently optimized an overview page for a car rental website. Because the page didn’t rank very well, I rewrote the URL, putting the exact keyword combination in it. Then I asked Google to re-crawl the URL through Search Console. This afternoon, I checked Open Site Explorer and saw that the Page Authority had decreased to 1, while the subpages still have an authority of about 18-20. Hence my question: is rewriting a URL a bad idea for SEO? Thank you,
Intermediate & Advanced SEO | LiseDE
Lise
-
URL Rewriting Best Practices
Hey Moz! I'm getting ready to implement URL rewrites on my website to improve site structure and URL readability. More specifically, I want to: improve our website structure by removing redundant directories, replace underscores with dashes, and remove file extensions from our URLs. Please see my example below:
Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget
I've read several URL rewriting guides online, all of which seem to provide similar but slightly different methods, and I'm looking for what's considered best practice for implementing these rewrites. From what I understand, the most common method is to implement the rewrites in our .htaccess file using mod_rewrite (which will match the old URLs and rewrite them according to the rules I put in place). One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes, do the web page file names need to be edited to the new format? From what I understand, the file names must remain the same for the rewrites in the .htaccess to work; however, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this? Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old directory structure to the new structure with these rewrites, are my bases covered in terms of having the proper 301 redirects in place so as not to hurt our rankings? Please offer any advice or reliable guides on how to handle this properly. Thanks in advance!
Intermediate & Advanced SEO | TheDude
-
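On the file-name question in that post: with rewrites, the files on disk can stay exactly as they are. A hedged .htaccess sketch for one page (assumes Apache 2.4 with mod_rewrite enabled; the URLs are the question's examples, and with many pages rules like these would normally be generated or handled with a RewriteMap):

```apache
RewriteEngine On

# Externally 301 the old underscore URL to its new hyphenated,
# extensionless address, so visitors and crawlers move to the new URL.
RewriteRule ^widgets/commercial-widgets/small_blue_widget\.htm$ /commercial-widgets/small-blue-widget [R=301,L]

# Internally serve the original file for the new URL. The END flag
# (Apache 2.4) stops the rule set from re-running, which would
# otherwise trigger the 301 above and loop.
RewriteRule ^commercial-widgets/small-blue-widget$ /widgets/commercial-widgets/small_blue_widget.htm [END]
```

Internal links and canonical tags should then point at the new URLs, as the question already suggests.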
HTTPS entire domain Vs. one URL
Long time no Moz! I've been away with some server-related issues, installing an AD at the company I work for, but I'm back. Our SSL cert just expired and I'm trying to determine the pros and cons of making the entire site SSL vs. just one URL. Our previous setup was just a single domain. I know Google has hinted toward an SSL preference, and I know it's a little early to know for certain how much that's going to help, but I just wanted to know what everybody thought. It expired yesterday, so I have to do something, and we lost our previous credentials so I can't just renew the old one. Thanks!
Intermediate & Advanced SEO | HashtagHustler
-
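If the whole site does go HTTPS, the usual companion step is a blanket 301 from HTTP. A minimal Apache sketch (assumes mod_rewrite is enabled and the new certificate covers every hostname the site answers on):

```apache
RewriteEngine On

# Send every plain-HTTP request to the same path on HTTPS with a 301,
# so link equity consolidates on the secure URLs.
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

A per-page SSL setup needs no such rule, but then secure and insecure URLs coexist and have to be kept consistent by hand.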
Construction website
Hi, I have a construction website aimed at tradesmen. There are two goals for the site: 1. To allow potential customers to sign up for a trade account. 2. To allow existing customers to log in to their account and place orders. The site is full of categories and products which should be indexed so we rank for these trade products. The homepage redesign is where I am having an issue. Currently the site is set up like a standard retail site but without prices, which are viewable only when logged in. The homepage is designed such that there are several calls to action about promotions, services, and applying for a trade account, which apply to both existing and potential customers. At the moment, conversion of potential customers into trade-account applications is poor, because there is too much distraction away from this goal and they are free to engage with other areas of the site. The main purpose of the homepage should be to encourage potential customers to sign up; the secondary purpose is for existing customers to access their accounts and products. I believe potential customers should not be exposed to the categories and products, as that distracts from the primary goal. Potential customers, i.e. tradesmen, already have a certain understanding of the types of products we provide, so I don't feel it is necessary to let them browse the rest of the site unless they have an account. What are your thoughts on that? Here is my lack of understanding: on the homepage, if I restrict access to categories and products to existing account holders only, where a login is required to proceed, does that mean Google cannot access these pages to index them? Or is this only controlled by nofollow and robots.txt? Obviously not indexing is undesirable.
I do understand potential customers will need some information about our range of products, but the idea is to persuade them to sign up for an account to see it. The more information provided to a potential customer, the higher the probability they decide against applying for an account. Restricting access creates a motivator to reveal information, and we capture their data so we can converse with them personally. This increases the probability of retaining their interest by providing a customised service based on their needs. All of this makes perfect sense to me; the only obstacle is the indexing of the site. If Google cannot index pages that are restricted by account access, then I would like suggestions to solve, compromise on, or optimise the above. Just to address the desired behaviour of indexed pages: if one of our product pages appears in search, the person clicking the link would either be redirected or shown a login or sign-up screen. Thank you so much for your help. Antonio
Intermediate & Advanced SEO | AVSFencingSupplies
-
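Since crawlers cannot log in, one common compromise for cases like this is "index the page, gate the price": the product page is always served, but the sensitive detail only renders for account holders. A sketch in Python (the function, data shape, and login check are hypothetical, not from any particular framework):

```python
def render_product_page(product: dict, logged_in: bool) -> str:
    """Return minimal HTML for a product page; the price is login-gated."""
    # Crawlers and anonymous visitors see a sign-up prompt instead of
    # the price, so the page stays indexable without revealing trade rates.
    price_html = (
        f"<p class='price'>{product['price']}</p>"
        if logged_in
        else "<p class='price'><a href='/signup'>Sign up for trade prices</a></p>"
    )
    return (
        f"<h1>{product['name']}</h1>"
        f"<p>{product['description']}</p>"
        f"{price_html}"
    )

page = render_product_page(
    {"name": "Fence Post", "description": "Treated timber post.", "price": "£9.50"},
    logged_in=False,
)
print("£9.50" in page)  # False: the page is public, the price is not
```

Fully login-walling the product pages, by contrast, would keep Googlebot out entirely and the pages would drop from the index.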
Best url structure
I am making a new site for a company that services many cities. I was thinking of a URL structure like this: website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5. Would this be the best approach to optimise the site for the keyword plus five different cities, as long as I keep the total URL under the SEOmoz-recommended 115 characters? Or would it be better to build separate pages for each city, rewording the main services to avoid duplicate content?
Intermediate & Advanced SEO | jlane9
-
Changing a url from .html to .com
Hello, I have a client whose site serves pages with a .html extension, and I have read that it's best not to have this. We currently have pages ranking with the .html extension. If we take it out, will we lose rankings? Would we need a 301 or something?
Intermediate & Advanced SEO | SEODinosaur
-
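If the .html suffix does come off, the usual pattern is a generic 301 plus an internal rewrite, so no files need renaming and the old rankings follow the redirects. A hedged .htaccess sketch, assuming Apache 2.4 with mod_rewrite:

```apache
RewriteEngine On

# 301 any request that arrives with a .html suffix to the clean URL.
# THE_REQUEST is the raw request line, so this only fires on external
# requests, never on the internal rewrite below.
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally map the clean URL back onto the .html file on disk,
# but only when that file actually exists.
RewriteCond %{DOCUMENT_ROOT}/$1.html -f
RewriteRule ^(.+)$ $1.html [END]
```

Internal links and canonicals should then be updated to the extensionless form so crawlers stop discovering the old URLs.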
Does Google count links on a page or destination URLs?
Google advises that pages should have no more than around 100 links each. I realise there is some flexibility around this, which is highlighted in this article: http://www.seomoz.org/blog/questions-answers-with-googles-spam-guru. One of Google's justifications for this guideline is that a page with several hundred links is likely to be less useful to a user. However, these days web pages are rarely two-dimensional and usually include CSS drop-down navigation and tabs to different layers, so that even though a user may only see 60 or so links, the source code actually contains hundreds; i.e., the page is actually very useful to a user. I think there is a concern among SEOs that if there are more than 100-ish links on a page, search engines may not follow links beyond those, which may lead to indexing problems. This is a long-winded way of getting round to my question: if there are 200 links on a page but many of them point to the same URL (let's say half the links are simply second occurrences of other links on the page), will Google count 200 links or 100?
Intermediate & Advanced SEO | SureFire
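The total-vs-unique distinction in that question is easy to measure for any page. A small sketch using Python's standard-library HTMLParser (the sample HTML is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count total <a href> links and distinct destination URLs."""

    def __init__(self) -> None:
        super().__init__()
        self.total = 0
        self.unique = set()

    def handle_starttag(self, tag, attrs):
        # Every anchor with an href counts toward the total; the set
        # keeps only distinct destinations.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.total += 1
                self.unique.add(href)

html = """
<nav><a href="/credit-cards/">Cards</a><a href="/loans/">Loans</a></nav>
<footer><a href="/credit-cards/">Cards (footer)</a></footer>
"""
counter = LinkCounter()
counter.feed(html)
print(counter.total, len(counter.unique))  # 3 2
```

Running this against a real page shows how far apart the raw link count and the distinct-URL count actually are before worrying about the 100-link guideline.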