Does a URL forward slash break up an exact match phrase?
-
I've seen some organisations implement keyword phrases per URL with forward slashes between the keywords. Would this still work for broad AND exact match keywords when the search engine references the URL?
Here's an example with the keyword being "scuola di lingue tedesco".
http://www.esl.ch/it/adulti/scuola-di-lingue/tedesco/index.htm
Thanks in advance.
-
Thanks Matthew.
-
Google would probably treat that as an 'exact match', but bear in mind that this isn't really the be-all and end-all of your SEO. Google is weakening the value given to matches like this, so I wouldn't worry too much. Focus on your links.
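A rough way to see why the slash is unlikely to break the phrase: slashes, hyphens and dots all act as word separators when a URL path is tokenized, so the phrase words still end up adjacent. This is only an illustrative sketch of that tokenization, not a claim about Google's actual internals:

```python
import re
from urllib.parse import urlparse

def url_keywords(url):
    """Split a URL path into lowercase word tokens.
    Slashes, hyphens, underscores and dots are all treated as
    separators, which is roughly how URLs get tokenized."""
    path = urlparse(url).path
    return [t for t in re.split(r"[/\-_.]+", path.lower()) if t]

tokens = url_keywords("http://www.esl.ch/it/adulti/scuola-di-lingue/tedesco/index.htm")
print(tokens)
# the phrase words come out consecutive despite the slash:
# ['it', 'adulti', 'scuola', 'di', 'lingue', 'tedesco', 'index', 'htm']
```

Note that "scuola di lingue tedesco" survives as a contiguous run of tokens even though a `/` sits between "lingue" and "tedesco".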
Related Questions
-
Title length, URL length and meta descriptions on a subdomain affecting SEO on main domain?
Hi all, I am currently evaluating areas for optimization on my main domain. When doing this, Moz has identified multiple titles and URLs that should be shortened, and missing meta descriptions, on my subdomain (a help center of sorts). As far as I am aware, we have not set up any "no-index" rules for this subdomain. Are these items affecting SEO on my main domain? Thanks,
On-Page Optimization | annegretwidmer | Kasey
-
How would you improve our URL structure?
Hi Mozzers, I have a question about the URL structure on our website (www.ikwilzitzakken.nl). We have a main category, "zitzakken" (beanbags), plus different brands, types and colours. Right now we have URLs like this: https://www.ikwilzitzakken.nl/zitzakken/vetsak/vetsak-fs600-flokati-zitzak/_381_w_3544_3862_NL_1 which seems long and not clean. Please don't look at the query at the end; we can't do anything about that in our CMS. In English this would be: https://www.iwantbeanbags.nl/beanbags/vetsak/vetsak-fs600-flokati-beanbag/_381_w_3544_3862_NL_1 How would you optimise this? We do have good rankings (this one ranks #1, for example), but I think our overall structure could be way better. Would love your thoughts about this.
On-Page Optimization | TheOnlineWarp
-
I have a duplicate URL: example.html and example without .html
I've recently changed my links from example.html to just example; however, Moz shows that the page has been duplicated. Does this affect my ranking? If yes, how can I fix it, please?
On-Page Optimization | aptustelecom
-
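For the .html duplicate question above, the usual fix on an Apache server is a 301 redirect from the old .html URLs to the clean ones, plus an internal rewrite so the clean URL still serves the file. This is only a sketch, assuming Apache with mod_rewrite enabled and that the .html files still exist on disk; a rel="canonical" tag pointing at the extensionless URL is an alternative if you can't touch the server config:

```apache
RewriteEngine On

# 1) Externally 301 any direct request for /page.html to /page
RewriteCond %{THE_REQUEST} \s/(.+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# 2) Internally map the clean URL back to the .html file on disk
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
```

Matching on THE_REQUEST (the raw request line) rather than the rewritten path is what prevents a redirect loop between the two rules.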
Creating Authority and choosing URLs
Creating Domain Authority and choosing URLs: A: What is better if you want higher Domain Authority: keyword.domain.com or www.domain.com/keyword, when other sites link to it? B: And for Page Authority: keyword.domain.com or www.domain.com/keyword? Thanks!
On-Page Optimization | HMK-NL
-
Is it possible to have the crawler exclude URLs with specific arguments?
Is it possible to exclude specific URLs from the crawl that contain certain arguments, like you can do in Google Webmaster Tools?
On-Page Optimization | djangojunkie
-
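For excluding parameterised URLs from Moz's crawl, a robots.txt wildcard rule is the usual route. A sketch, assuming the crawler in question is Moz's rogerbot and the unwanted URLs are the ones carrying query strings (the `*` wildcard is supported by rogerbot and Googlebot, but not by every crawler):

```
User-agent: rogerbot
# block any URL that contains a query string
Disallow: /*?
```

You can narrow this to a specific argument (e.g. `Disallow: /*?sessionid=`) if you still want other parameterised pages crawled.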
New CMS system - 100,000 old URLs - use robots.txt to block?
Hello. My website has recently switched to a new CMS. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing, "older" URLs to their new counterparts. However, according to Google Webmaster Tools' 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently, we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
On-Page Optimization | Blenny
-
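If the retired CMS installs lived in their own directories, path-level robots.txt rules can stop crawlers wasting time on them. A sketch with hypothetical directory names (substitute your actual old CMS paths); note that robots.txt only stops crawling, it does not de-index URLs already in the index, and letting dead URLs return 404/410 is harmless:

```
User-agent: *
# hypothetical paths belonging to the two retired CMS installs
Disallow: /old-cms/
Disallow: /cgi-bin/store/
```

Keep the 301s in place for the URLs that carry link equity; robots.txt would actually prevent crawlers from seeing those redirects, so only block directories with nothing worth passing on.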
Close URLs owned by competitors
The following example is exactly analogous to our situation (site names slightly altered): We own www.business-skills.com. It's our main site. We don't own, and would rather avoid paying for, www.businessskills.com. It's a parked domain and the owners want a very large sum for it. We own www.business-skills.co.uk and point it to our main site. We don't own www.businessskills.co.uk. This is owned by our biggest competitor. We also own www.[ourbrand].com and .co.uk, and point them to the main site. My question is - how much traffic do you think we may be missing due to these nearly-but-not-quite URL matches? Does it matter in terms of lost revenue? What sort of things should I be looking at to get a very rough estimate?
On-Page Optimization | JacobFunnell
-
Can you optimize for 2 keywords per URL?
Or should you just stick to 1 page, 1 keyword all the time? If you do 2, are there any things you should watch out for? Thanks
On-Page Optimization | inhouseninja