Were our URLs set up correctly?
-
The person who built our site set up a lot of the pages like: domain/location/city/title-tag. For example: http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/
I know the length is too long, and it seems entirely unnecessary to me. Many of the pages I have created since I got here are just domain/title-tag (which is almost always city-field of law-attorneys-lawyers). However, when I compare the original pages with the new ones, they both rank similarly. Given what a pain it is to change URLs, I'm not sure whether it would be worth it to shorten them all. However, I would like to know if the way they were set up originally makes sense for some reason I don't understand.
Thanks,
Ruben
-
Thanks everyone! I think I'll go ahead and try to make the change for at least some of the URLs that aren't performing well to see if shortening them helps. I'll report back what I find.
Thanks again!
Ruben
-
Hi Ruben
Keyword stuffing is a big no-no, well documented by Google. It would be a wise decision to first shorten the link and to make sure the link is relevant to your page (just be natural). If you are registered on Google Places, your registered address will get picked up in the Tampa area.
Go from this: http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/
to this
http://www.kempruge.com/personal-injury-legal-attorneys/
If you have had this long link appearing for a while with no love from Google, I would strongly advise you to change it.
Hope this helps?
Gary
-
Just popping in to add my voice to the choir. I agree with the consensus here that the URLs are too long. You are smart to be concerned about overoptimization, and I think it would be worth the effort to create new URLs and redirect the old ones.
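For what it's worth, the shorten-and-redirect idea could be sketched like this. This is a hypothetical mapping based on the example URLs in this thread; the actual 301 redirects would be issued by your server or CMS, not by this function.

```javascript
// Hypothetical sketch: map the long /location/<city>/<slug>/ paths to the
// short /<slug>/ form discussed above. The server would 301 old -> new.
function shortenUrl(pathname) {
  // /location/tampa/tampa-personal-injury-legal-attorneys/
  //   -> /tampa-personal-injury-legal-attorneys/
  const match = pathname.match(/^\/location\/[^/]+(\/.+)$/);
  return match ? match[1] : pathname; // leave non-matching paths alone
}
```

Pages that already use the short form pass through unchanged, so the same rule can run on every request.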
-
You are right! The URLs are too long and unnecessary!
The shorter URLs would make more sense to users as well as search engines. The current version looks like you are trying to stuff keywords into the URL, whereas shortening them removes that signal and looks more natural to search engines as well as end users.
My advice would be to shorten them and keep them natural, as stuffing is something Google doesn't like, so you might get hurt in the long run!
Hope this helps!
-
Ruben,
I'd agree with your assessment that those URL formats are too long and unnecessary. This URL structure looks a lot like keyword stuffing, and it suggests EMD (exact-match domain) and PMD (partial-match domain) tactics were valued by the people who made the website. In their defense, depending on how old the website is, those extra keywords may have actually helped the pages rank better for relevant queries years ago.
I wouldn't worry too much about redirecting those URLs or changing them today, however. I suppose you could, but today's search engines are far more sophisticated. I don't think it'd be a great investment of your time.
Related Questions
-
Is there a way to redirect URLs with a hash-bang (#!) format?
Hi Moz, I'm trying to redirect www.site.com/locations/#!city to www.site.com/locations/city. This seems difficult because anything after the hash character in the URL does not make it to the server thus cannot be parsed for rewriting. Is there an SEO friendly way to implement these redirects? Thanks for reading!
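Since nothing after the # ever reaches the server, a server-side rewrite can't see it; one workable pattern is a small client-side redirect. This is a sketch under the assumption that the fragment holds only the city slug; you'd still want the clean /locations/city URLs to be the ones you link and canonicalize to.

```javascript
// The fragment (everything after #) is never sent to the server, so the
// redirect has to happen in the browser. This helper builds the clean URL:
function hashBangTarget(href) {
  const [base, fragment] = href.split('#!');
  if (!fragment) return null; // no #! fragment, nothing to redirect
  return base + fragment;     // /locations/#!city -> /locations/city
}

// In the page itself, something like:
//   const target = hashBangTarget(window.location.href);
//   if (target) window.location.replace(target);
```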
Web Design | DA2013 -
Bizarre PDF URL string
Hey folks, I'm getting literally hundreds of Duplicate Title and Duplicate Content errors for a site, and most of them are a result of the same issue. The site uses JavaScript container pages a lot, but each gets its own URL. Unfortunately, it seems like each page is also loading all the content for all the other pages, or something. For instance, I have a section of the site under /for-institutions/, and then there are 5 container pages under that. Each container page has its own URL, so when you select it, you get the URL /for-institutions/products/ or /for-institutions/services/ etc. However, the institutions container page doesn't change, just the content within. In my SEO results, I'm getting the following: /for-institutions/$%7Bpdf%7D/ /for-institutions/$%7Bpdf%7D/$%7Bpdf%7D/ etc, each as a duplicate title and content page. How can I eliminate this? Is there a regular expression that rewrites URL segments beginning with $ ? For your reference: the page is set up so that any URL that doesn't exist just refers to the subdirectory. /for-institutions/$%7Bpdf%7D/ displays /for-institutions/, but does not rewrite the URL. So too if I were to enter /for-institutions/dog.
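One clue: %7B and %7D are just the percent-encoded braces, so those odd segments look like an un-rendered ${pdf} template placeholder leaking into crawled hrefs rather than real pages. Decoding makes that visible:

```javascript
// %7B and %7D are the percent-encoded "{" and "}", so the strange crawled
// segment is an unevaluated "${pdf}" template expression, not a real page.
const oddSegment = '$%7Bpdf%7D';
const decoded = decodeURIComponent(oddSegment);
console.log(decoded); // ${pdf}
```

If that's the case, the fix is in whatever template emits those hrefs, rather than in a rewrite rule.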
Web Design | SpokeHQ -
Yes or No for Ampersand "&" in SEO URLs
Hi Mozzers, I would like to know how crawlers see the ampersand (& or &amp;) in your URLs and whether Google frowns upon this or not. As far as I know they purely recognise this as "and"; is this correct, and is there any best practice for implementing it? I know a lot of people complained before about & in links and said it is better to use it as &amp;, but this is not on links, this is on URLs. Reason for this is that we are looking to move onto an ASP.Net MVC framework (any suggestions for a different framework are welcome, we are still just planning out future development), and in order to make use of the filter options we have on our site we need a parameter to indicate the difference on a routing level (routing sends to controller, controller sends to model, model sends to controller, and controller sends to view; this is the pattern of a request that comes in on the framework we will be using). I already have -'s and /'s in the URLs (which is for my SEO structuring), so those characters can't be used for identifying the filters the user clicks or uses to define their search, as that would create a complete mess in the system. Now we are looking at & to say: OK, when a user lands on /accommodation and selects De Kelders (which is a destination in our area), the page will be /accommodation/de-kelders; on this page they can define their search further to say they are looking for 5-star accommodation close to the beach, and this is where the routing needs some guidance. We are looking to have it as follows: /accommodation/de-kelders/5-star&close-to-the-beach. Now, does the "&" get identified by search engines on a URL level as "and", and does this cause any issues with crawling or indexation, or would it be best to look at another solution? Thanks, Chris Captivate
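One detail worth noting: a raw & inside a path segment is ambiguous, because & normally separates query-string parameters. Percent-encoding it as %26 keeps the segment unambiguous, and that is exactly what `encodeURIComponent` produces (using the filter segment from the question as the example):

```javascript
// "&" is only safe as a query-string delimiter; inside a path segment it
// should be percent-encoded as %26. encodeURIComponent does that:
const filterSegment = '5-star&close-to-the-beach'; // example from the question
const encoded = encodeURIComponent(filterSegment);
console.log(encoded); // 5-star%26close-to-the-beach
```

Hyphens pass through untouched, which is one reason a plain hyphen-delimited scheme tends to be simpler than mixing in ampersands.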
Web Design | DROIDSTERS -
Google fails to pick out the correct URL of the story
Hi, I have a page with many news stories on it. Google crawls the page, but it picks up a more general URL even though I've embedded the direct URL within anchor tags around the headline. The snippet below got linked by Google to http://www.irishnews.com/ Any idea how I can get Google to pick up http://www.irishnews.com/news.aspx?storyId=1180708 would be very welcome. Peter Quinn: Family made scapegoats of financial crisis News Peter Quinn: Family made scapegoats of financial crisis THE Quinn family have been made scapegoats of the financial crisis surrounding the former Anglo Irish Bank, tycoon Sean Quinn's brother Peter claimed yesterday. Peter Quinn, a former president of the GAA, said hi read more»
Web Design | Liammcmullen -
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs, e.g. http://www.twago.es/expert/Diseño-Web/Diseño-Web However, when I simply copy and paste a URL that contains a special character, it is automatically translated and encoded, e.g. http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (when written out longhand it appears: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone). My first question is: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even for Spanish/German characters these are simply written using the standard English Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most other competitors). Does the anchor text have to match the URL exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query or not. But we seem to think that if we were doing the right thing, then why does everyone else do it differently? My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
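The automatic "translation" described in the question is UTF-8 percent-encoding: ñ travels as %C3%B1 on the wire, which is one reason many sites fold URLs to plain ASCII lowercase up front. A quick sketch of both behaviours:

```javascript
// Non-ASCII characters are percent-encoded (as UTF-8 bytes) when a URL is
// actually requested, which is what you see on copy-paste:
const slug = 'Diseño-Web';
console.log(encodeURIComponent(slug)); // Dise%C3%B1o-Web

// A common alternative is folding to plain ASCII lowercase before the URL
// is built (NFD splits ñ into n + a combining tilde, which is stripped):
const asciiSlug = slug.normalize('NFD').replace(/[\u0300-\u036f]/g, '').toLowerCase();
console.log(asciiSlug); // diseno-web
```

The lowercase fold also answers the capital-letters half of the question: normalizing case up front avoids /Diseño-Web and /diseño-web being treated as two different URLs.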
Web Design | wdziedzic -
Has Anyone Had Issues With ASP.NET 4.0 URL Routing?
I'm seeing some odd results in my SEOmoz results with a new site I just released that is using the ASP.NET 4.0 URL routing. I am seeing thousands(!) of duplicate results, for instance, because the crawl has uncovered something like this:
http://www.mysite.com/
http://www.mysite.com/default.aspx (so far, so good, though I wish it wouldn't show both)
http://www.mysite.com/default.aspx/about/ (what the heck -?)
http://www.mysite.com/default.aspx/about/about/ (WTF!?)
http://www.mysite.com/default.aspx/about/about/products/ (and on and on ad infinitum)
I'm also seeing problems pop up in my sitemap because extensionless URLs have an odd "eurl.axd/abunchofnumbersgohere" appended to the end of every address, which is breaking links. sigh Buyer beware. I've found articles that discuss the "eurl.axd" issue here and there (this one seems very good), but nothing about the weird crawl issue I outlined above. Any advice?
Web Design | TroyCarlson -
Page Titles or Search-Friendly URLs?
We are currently auditing our website as part of our SEO strategy. One item which has come up is the importance of search-friendly URLs versus search-friendly page titles. Do URLs or page titles carry more relevance than the other in search engines? Obviously the ideal would be to have both to maximise search impact, but does either carry more importance? Thanks
Web Design | bwfc77 -
Correct use for Robots.txt
I'm in the process of building a website and am experimenting with some new pages. I don't want search engines to begin crawling the site yet. I would like to add robots.txt rules for the pages I don't want them to crawl. If I do this, can I remove the rules later and get them to crawl those pages?
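As a sketch (hypothetical paths): robots.txt is a single file at the site root rather than something added to individual pages, and the rules can simply be deleted later, after which crawlers will fetch those pages again on subsequent visits.

```
# Hypothetical robots.txt served at http://www.example.com/robots.txt
User-agent: *
Disallow: /experimental/
```

One caveat: robots.txt blocks crawling, not indexing, so a blocked page that is linked from elsewhere can still show up in results without a snippet.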
Web Design | EricVallee34