URL in SERP: Google's stance
-
A few months back, we could see a keyword shown in bold in the SERP URL if it appeared there. Now Google no longer highlights keywords in URLs, even on an exact match with the search query. UI aside, does this mean Google might have devalued or reduced the importance of the URL as a ranking factor? We can see many search results whose URLs match the search keywords partially or completely.
-
Google is more interested in the information you present to users and whether or not you can answer their query. To those not in the know, highlighted keywords may suggest that a website is more likely to answer a query, but in reality that may not be the case.
This falls under the same approach Google takes to attributing actual meaning: for example, "wood floor" could be interpreted as "wooden floor" or "wood flooring" and so on, and those websites might help the user more than the one that contains the exact keyword in the URL.
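As a toy illustration of that kind of matching, a naive suffix-stripper collapses the variants from the example onto the same tokens. This is only a sketch of the general idea; Google's actual semantic matching is far more sophisticated, and the function below is invented for illustration:

```python
def normalize(phrase):
    """Crudely strip common suffixes so related variants compare equal.
    A toy stand-in for real stemming/semantic matching."""
    stripped = []
    for word in phrase.lower().split():
        for suffix in ("ing", "en", "s"):
            # Only strip when a reasonable stem remains.
            if word.endswith(suffix) and len(word) - len(suffix) >= 3:
                word = word[: -len(suffix)]
                break
        stripped.append(word)
    return stripped

# All three variants reduce to the same tokens:
print(normalize("wood floor"))     # ['wood', 'floor']
print(normalize("wooden floor"))   # ['wood', 'floor']
print(normalize("wood flooring"))  # ['wood', 'floor']
```

Under a model like this, a page titled "wooden flooring" matches the query "wood floor" just as well as a page with the exact keyword in its URL, which is the point of the answer above.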
-
Hi
Google no longer highlights keywords even in the title, although the title is a ranking factor; it highlights matching words only in the description, which is not a direct ranking factor. So the fact that a keyword in the URL is no longer highlighted doesn't mean Google has devalued the importance of URLs. That said, Google did say recently (on the 9th of March 2017) that keywords in URLs are overrated for SEO.
URLs are a minor ranking factor search engines use when determining a particular page or resource's relevance to a search query.
I personally use keywords in URL.
Thanks
Related Questions
-
Same meta description being shown on Google?
Not sure why this is happening, but when you type this command into Google: site:mywebsite "key phrase", it brings up the pages from my website that contain the key phrase. I've noticed that Google is using the wrong meta description for all of them, even though these pages each have their own unique meta description. Does anyone know why this would be happening? Thanks
Algorithm Updates | | webguru20140 -
Traffic drop only affecting Google country domains
Hello, I have noticed that our traffic is down by 15% (last 30 days vs. the 30 days before), and when I dug deeper to figure out what's going on, I found I don't understand what is happening. Traffic from Google country domains (for example, google.com.sa) dropped by 90% on the 18th of September; the same applies to the other country-specific domains. Now, my other stats (visits, organic keywords, search queries in WMT) seem normal and have seen some decrease (~5%), but nothing as drastic as the drop from the Google country domains. Is this an HTTPS change, taking effect on that date, that is masking the source of the traffic? Is the traffic now missing from Google country domains being reported under other sources? Can anyone shed some light on what is going on?
Algorithm Updates | | omarfk0 -
De-indexed homepage in Google - very confusing.
A website I provide content for has just had its homepage de-indexed in Google (not in any of the other search engines); all the other pages remained indexed as usual. The client asked me what the problem might be and I just couldn't figure it out: no link building has ever been carried out, so the backlink profile is clean, etc. I resubmitted it and it's back in its usual place, and it has maintained the rankings (and PR) it had before it disappeared a few days ago. I checked WMT and there are no warnings or issues there. Any idea why this might have happened?
Algorithm Updates | | McTaggart0 -
Google is forcing a 301 by truncating our URLs
Just recently we noticed that google has indexed truncated urls for many of our pages that get 301'd to the correct page. For example, we have:
Algorithm Updates | | mmac
http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html as the URL linked everywhere, and that's the only version of that page we use. Google somehow figured out that it would still reach the right place via the 301 if the HTML filename were removed from the end, so it indexed just: http://www.eventective.com/USA/Massachusetts/Bedford/107/
The 301 is not new. The short form used to 404, but (probably 5 years ago) we saw a few links come in with the HTML filename missing on similar URLs, so we decided to 301 them instead, thinking it would be helpful. We've preferred the longer version because it contains the name, so users who pay attention to the URL can feel more confident they're going to the right place.
We've always used the full (longer) URL, and Google used to index them all that way, but just recently we noticed about half of our URLs have been converted to the shorter version in the SERPs. These shortened URLs take the user to the right page via the 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good. If you look at site:www.eventective.com/usa/massachusetts/bedford/ you'll notice that the URLs for businesses at the top of the listings go to the truncated version, while toward the bottom they have the full URL.
Can you explain why Google would index a page that is 301'd to the right page, and has been for years? I have a lot of thoughts on why they would do this, and even more ideas on how we could build our URLs better, but I'd really like to hear from some people who aren't quite as close to it as I am.
One small detail that shouldn't affect this, but I'll mention it anyway: we have a mobile site with the same URL pattern: http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html We did not have the proper 301 in place on the m. site until the end of last week.
I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m sites. I'm also interested in any thoughts on how this may affect rankings since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us. Thank you,
Michael0 -
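For anyone auditing a similar pattern, the shortened variant Google was observed indexing can be derived mechanically from the full URL, which makes it easy to spot-check a list of indexed URLs against your preferred forms. A small sketch (the example URL is from the question; the function name is invented):

```python
from urllib.parse import urlsplit, urlunsplit

def truncated_variant(url):
    """Return the URL with a trailing .html filename removed,
    i.e. the shortened form described in the question."""
    parts = urlsplit(url)
    segments = parts.path.split("/")
    if segments and segments[-1].endswith(".html"):
        segments[-1] = ""  # drop the filename, keep the trailing slash
    return urlunsplit(parts._replace(path="/".join(segments)))

full = ("http://www.eventective.com/USA/Massachusetts/Bedford/107/"
        "Doubletree-Hotel-Boston-Bedford-Glen.html")
print(truncated_variant(full))
# http://www.eventective.com/USA/Massachusetts/Bedford/107/
```

One common mitigation (not confirmed as the fix here) is a rel=canonical on both variants pointing at the full URL, so the preferred version is stated explicitly rather than left for Google to infer from the 301s.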
How Do I Make My Google SERP "SiteLinks" more relevant?
I have a shopping website with thousands of products, and the sitelinks that Google has chosen for me (for a long time) are random product pages, which makes no sense to me. I do not emphasize those products on the home page, and I have a sitemap that clearly lists the directory of all the categories. I also added a "nofollow" attribute to almost every unimportant link on the home page. The products in the sitelinks seem completely random, and there isn't even a sitelink for "About" or any of the footer content! What gives? Also, my sitelinks never updated to the new, better version. Any suggestions?
Algorithm Updates | | cDNAInteractive0 -
Google personalizing search results...
Hi, I can't find the right term for it, but Google seems to personalize my search results according to my previous searches, so the rankings I get for a certain term aren't accurate. Can I turn that off somehow?
Algorithm Updates | | danlae0 -
Do we have a timeline of Google, Bing updates?
I thought it would be handy if we had a timeline with the dates of any updates to the algorithms.
Algorithm Updates | | AlanMosley
Does one exist here at SEOMoz or elsewhere?
Thanks3 -
When Panda's attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems stacked against this site's purpose. I need some advice on what I'm planning and what else could be done. First, the issues:
Content length: The site is a legal reference, including a dictionary and citation lookup. Hundreds (perhaps upwards of 1,000) of pages are, by virtue of the content, thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't much more to say, nor is there much value to the target audience in saying it.
Visit length as a metric: There is chatter claiming Google watches how long a person stays on a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data. They want the definition of a term or citation, then they return to whatever caused the query in the first place.
My strategy so far:
Noindex some pages: Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
Create more click incentives: We already started with related terms, and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.
Expand content (of course): The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages. We still won't be able to cover them all without a heavy cut-and-paste feel.
Site redesign: Looking to lighten up the code and boilerplate content shortly; we were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX on scroll. Ad units will be kept at 3 per page.
What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
Algorithm Updates | | sprynewmedia0
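The noindex step in the strategy above (flagging entries with less than 500 characters of content) is easy to approximate with a script. A minimal sketch using only the standard library; the 500-character threshold comes from the post, while the class and function names are invented for illustration:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect a page's visible text, ignoring script/style contents."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def needs_noindex(html, threshold=500):
    """True if the page's visible text falls below the thin-content threshold."""
    parser = TextExtractor()
    parser.feed(html)
    return len("".join(parser.chunks).strip()) < threshold

# A thin dictionary entry like the C.B.N.S. example would be flagged:
stub = ("<html><body><h1>C.B.N.S.</h1>"
        "<p>Common Bench Reports, New Series.</p></body></html>")
print(needs_noindex(stub))  # True
```

Running something like this over the dictionary and citation pages would produce the candidate list for the noindex tag, rather than judging each page by hand.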