URL Names not so important in future?
-
I read somewhere (hard to say where, with all the information out there about SEO and Google!) that in the future Google will put less importance on the URL name for ranking purposes. Any thoughts?
-
URLs will always be important in lots of ways:
- They are seen by users in many cases (as links, in social media, and even in the SERPs) and can help improve CTR.
- They are extremely important to the overall site architecture - and especially for information architecture.
- They can often become the anchor text in links.
- They help a user when browsing the site: if they represent the structure or "level" depth (almost like breadcrumbs), they can make the navigating experience easier.
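That breadcrumb point is easy to illustrate: a well-structured URL path can be parsed directly into a navigation trail. A minimal sketch (the example URL, labels, and function name are hypothetical, not from any particular CMS):

```python
from urllib.parse import urlparse

def breadcrumbs_from_url(url):
    """Turn a hierarchical URL path into a breadcrumb trail.

    Each path segment becomes a (label, link) pair, so a descriptive
    URL doubles as a "you are here" indicator for the user.
    """
    parsed = urlparse(url)
    segments = [s for s in parsed.path.split("/") if s]
    trail = []
    for i, segment in enumerate(segments):
        label = segment.replace("-", " ").title()
        link = "/" + "/".join(segments[: i + 1])
        trail.append((label, link))
    return trail

# A hypothetical URL whose depth mirrors the site's hierarchy:
print(breadcrumbs_from_url("https://example.com/mens-shoes/running/trail-runners"))
# [('Mens Shoes', '/mens-shoes'), ('Running', '/mens-shoes/running'),
#  ('Trail Runners', '/mens-shoes/running/trail-runners')]
```

The point is only that a URL which encodes the site's hierarchy carries usable structure for free; a flat or opaque URL gives the user (and this kind of parsing) nothing.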
I don't think URLs are something Google looks at in a simplistic way like "oh, your URL says X keyword, so we're going to weigh that keyword more." I think all of these factors feed into the importance of URLs, and I don't see that value diminishing anytime soon.
-Dan
-
I think ever since Firefox came out with the Awesome Bar and Google became the new "http," they have had to rank domain names highly in the SERPs, as there is always a chance that the domain itself is what you are looking for when you type it into your browser's address bar.
However, it is not beyond imagining that Google will rank things differently depending on how you search. Searching on a mobile phone already produces different results from searching on your desktop, so why shouldn't searching from your browser's location bar give different results from searching on the Google homepage, with the homepage giving less weight to domain names as a signal?
I am sure Google is more than capable of being data-led, seeing whether those things correlate, and making the necessary changes.
Other ways in which domain names become less relevant are through things such as apps and social media games. Take Instagram as an example: an online business that made its name before it even had a domain name. It was a solid app, searched for and shared through the mobile marketplaces and then via Facebook.
With Facebook's push to essentially become your desktop and the entire internet (much as AOL tried many years back, where everything was run through their systems and their framework), I hope that domain names do not become obsolete, as the prospect scares me a little.
First time posting here.
Figured I had to start somewhere.
-
I'm sure it is possible they could be devalued down the road, but the title, URL, and body are among the easiest ways to identify whether content is relevant. Google also takes bounce rate into account, so if your title, URL, and body are targeting "shoes" but your website is about garage installation, it will be able to tell quickly that your site is not relevant.
-
I can't speak for the future, but late last year I changed the URL structure of my site and saw a pretty significant improvement in my rankings.
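For anyone considering the same move, the usual precaution is to 301-redirect every old URL to its new counterpart so existing rankings and link equity carry over. A minimal sketch of building such a redirect map; the rewrite rule (lowercase, keep only the last two path segments) and the example paths are hypothetical stand-ins for whatever structure change you actually make:

```python
def new_url(old_path):
    """Map an old deep path to a hypothetical flatter, lowercase form."""
    segments = [s for s in old_path.lower().split("/") if s]
    return "/" + "/".join(segments[-2:])

old_urls = [
    "/Products/Shoes/Blue-Fuzzy-Widgets.htm",
    "/Archive/2012/News/Launch-Post.htm",
]

# Each entry would become a server-side 301 (permanent) redirect,
# e.g. a RewriteRule in Apache or a `return 301` in nginx.
redirect_map = {old: new_url(old) for old in old_urls}
for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
# 301: /Products/Shoes/Blue-Fuzzy-Widgets.htm -> /shoes/blue-fuzzy-widgets.htm
# 301: /Archive/2012/News/Launch-Post.htm -> /news/launch-post.htm
```

Generating the map programmatically like this also makes it easy to verify that no old URL is left without a destination before the switch goes live.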