Best URL structure
-
I am making a new site for a company that services many cities.
I was thinking of a URL structure like this:
website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5
Will this be the best approach to optimize the site for the keyword plus five different cities, as long as I keep the total URL under the SEOmoz-recommended 115 characters?
Or would it be better to build separate pages for each city, rewording the main services to try to avoid duplicate content?
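To sanity-check that 115-character guideline, here is a quick hypothetical sketch (the service slug and city names are made-up placeholders, not the real keywords) comparing the single combined URL against one short URL per city:

```python
# Hypothetical example: the service slug and city names below are
# placeholders, not the client's real keywords.
GUIDELINE_MAX = 115  # the SEOmoz-recommended ceiling mentioned above

def url_length(domain: str, *segments: str) -> int:
    """Character count of a URL assembled from path segments."""
    return len(domain + "/" + "/".join(segments))

service = "shiny-blue-widget-repair"
cities = ["springfield", "shelbyville", "ogdenville",
          "north-haverbrook", "capital-city"]

# Option A: one page with every city crammed into the URL
option_a = url_length("website.com", service, "-".join(cities))

# Option B: a separate, short URL for each city
option_b = [url_length("website.com", service, city) for city in cities]

print(option_a, option_a <= GUIDELINE_MAX)
print(max(option_b), all(n <= GUIDELINE_MAX for n in option_b))
```

Even when the combined URL squeezes under the limit, each per-city URL stays far shorter, which the answers in this thread suggest matters more than the raw character count.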
-
Joseph
I'd make each page totally unique to the city. Don't worry about or focus on whether you'll get a penalty - make the pages so useful for the user that a sentence or two of similarity won't earn one. Useful, to me, means information that's entirely specific to what the page is "about". A page with some testimonials, an in-depth case study, photos, and useful info about that location should really deliver and give people what they need to know about the services in that location!
-Dan
-
Hello Handcrafter,
Thank you for your reply.
We have one long-tail keyword per page but service about 15 cities from only 3 offices, so my dilemma is that we need to attract business from multiple cities. I'm just trying to get advice on the best way to do this:
have one main service page and list all the cities served in the description and content, or
make a page for each city with case studies and testimonials unique to that city but the same basic description of the service?
In your experience, how much shared content would make it seem like duplicate content?
-
Hello Dan,
Thank you.
We do have many testimonials and case studies for each city, which would be unique content for that city.
So we could describe the service in general, which would have to be basically the same on every page, but could add 2 or 3 testimonials and 5 or 6 case studies related to that service but for that city's customers.
Is that first description going to trigger a duplicate content issue?
Would that be based on the percentage of total content, or would having one paragraph that's basically the same be penalized?
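There is no published percentage threshold for duplicate content, but you can get a rough feel for how much two city pages overlap with a word-shingle comparison. A hypothetical sketch (an illustrative heuristic only, not how Google actually measures duplication):

```python
# Illustrative heuristic only - Google publishes no duplicate-content
# threshold, and this is not its actual algorithm.

def shingles(text: str, n: int = 3) -> set:
    """All n-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of 3-word shingles: 0.0 (distinct) to 1.0 (identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two city pages sharing the same opening description
page_a = "We repair shiny blue widgets. Testimonial from a Springfield customer."
page_b = "We repair shiny blue widgets. Case study from a Shelbyville customer."

print(round(similarity(page_a, page_b), 2))
```

The more city-specific testimonials and case studies you add around the shared description, the lower this overlap drops, which matches Dan's advice above.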
-
Thank you for all the input - some great help for sure.
After reading some answers, I was thinking to myself, what was I thinking... lol.
I should have been a little clearer about the keywords:
keyword 1, 2, and 3 are one long-tail keyword, not 3 separate keywords.
Example: "shiny blue widget".
My client has a company that travels across a big area to do service calls for this shiny blue widget.
So we want the company to rank in multiple cities. There are 3 offices servicing 15 cities.
@Matt - I totally agree about the URL.
-
Right on the money Matt, thank you!
So now the challenge is somehow creating separate pages about each location that are not duplicates and are actually useful.
I'm not sure what the niche is, but let's say you're targeting a city the business is not actually located in. How could you make a page about that? You could make a page all about that town, with case studies of clients from that town and testimonials from customers there. The company could hold or sponsor an event in that town, or publish a piece like "The State of [Company Type] in [Town]". Perhaps that will get some ideas going.
-Dan
-
Hi Joseph - What will people be searching for? Does each keyword have its own unique focus? It seems you may want to create unique, standalone pages covering each keyword or group of keywords. Within that content you might list locations or services unique to each city. If the services are all the same, could you make one unique page describing those services? That way you could simply list the cities and use text links to the services page. Good luck - Handcrafter
-
If you saw that address in a search result, would you actually click it?
I'd say cramming that many keywords into a URL sends a bad signal anyway. More to the point, it's going to look like someone jumped into Marty McFly's DeLorean and came back with a boot full of spam from the late 1990s.
Ranking is NOT the most important thing (even if this would help, which I doubt). If the listing looks poor quality, that ranking will bring less traffic, and less traffic means less money.
I would much rather see a short URL without the keywords, and use the keywords in the title. Better still, break it up into every page that makes logical sense, with an appropriate URL and matching content for each. There is no "trying" to avoid duplicate content, though - you have to avoid it.
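As a rough sketch of what Matt describes - a short per-city URL with the keywords carried in the title instead of the path - something like this (the service name, city, and `/services/` path are hypothetical placeholders):

```python
import re

# Hypothetical sketch: the service name, city, and "/services/" path
# are placeholders, not a real site's structure.

def slugify(text: str) -> str:
    """Lowercase, replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def city_page(service: str, city: str) -> dict:
    """A short URL plus a keyword-rich title, per the suggestion above."""
    return {
        "url": f"/services/{slugify(service)}/{slugify(city)}",
        "title": f"{service} in {city} | Example Co.",
    }

page = city_page("Shiny Blue Widget Repair", "Springfield")
print(page["url"])    # /services/shiny-blue-widget-repair/springfield
print(page["title"])  # Shiny Blue Widget Repair in Springfield | Example Co.
```

Each such page then needs genuinely city-specific content behind it - the case studies and testimonials discussed above - rather than a reworded copy of the same service description.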
Related Questions
-
Help with facet URLs in Magento
Hi Guys, wondering if I can get some technical help here... We have our site britishbraces.co.uk, built in Magento. As per eCommerce sites, we have paginated pages throughout. These have rel=next/prev implemented, but not correctly (it is not in the head) - this fix is in process.
Our canonicals are currently incorrect as far as I believe, as even when content is filtered, the canonical takes you back to the first page URL. For example, http://www.britishbraces.co.uk/braces/x-style.html?ajaxcatalog=true&brand=380&max=51.19&min=31.19 canonicals to http://www.britishbraces.co.uk/braces/x-style.html, which I understand to be incorrect. I want the colour-filtered pages to be indexed (due to search volume for colour-related queries), but I don't want the price-filtered pages to be indexed, and I am unsure how to implement the solution.
As I understand it, because rel=next/prev is implemented (with no View All page), the rel=canonical is not necessary, as Google understands page 1 is the first page in the series. Therefore, once a user has filtered by colour, there should be a canonical pointing to the colour-filter URL (e.g. /product/black)? But when a user filters by price, there should be noindex on those URLs? Or can this be blocked in robots.txt beforehand? My head is a little confused here, and I know we have an issue because our number of indexed pages is increasing day by day with no solution for the facet URLs. Can anybody help - apologies in advance if I have confused the matter. Thanks
Intermediate & Advanced SEO | HappyJackJr
URL Construction
Working on an old site that currently has category URLs (that productively rank) like this example: LakeNameBoating.com/category/705687/rentals. I want to enhance the existing mid-page-one rank for terms related to "Lake Name Boat Rentals", 301ing the old URLs to the new. Would you construct the new URLs as LakeNameBoating.com/lake-name-boat-rentals or LakeNameBoating.com/boat-rentals? And why? It's all for one particular lake, with "Name" being just an anonymous placeholder example. Thanks!
Intermediate & Advanced SEO | 945010
What are the best practices for microdata?
Not too long ago, Dublin Core was all the rage. Then Open Graph data exploded, and Schema seems to be highly regarded. In a best-case scenario, on a site that's already got the basics like good content, clean URLs, rich and useful page titles and meta descriptions, well-named and alt-tagged images and document outlines, what are today's best practices for microdata? Should Open Graph information be added? Should the old Dublin Core be resurrected? I'm trying to find a way to keep markup light and minimal, but include enough microdata for crawlers to get a better sense of the content and its relationships to other subdomains and sites.
Intermediate & Advanced SEO | WebElaine
URL Optimisation Dilemma
First of all, I fully appreciate that I may be over analysing this, so feel free to highlight if you think I’m going overboard on this one. I’m currently trying to optimise the URLs for a group of new pages that we have recently launched. I would usually err on the side of leaving the urls as they are so that any incoming links are not diluted through the 301 re-direct. In this case, however, there are very few links to these pages, so I don’t think that changing URLs will harm them. My main question is between short URLs vs. long URLs (I have already read Dr. Pete’s post on this). Note: the URLs I have listed below are not the actual URLs, but very similar examples that I have created. The URLs currently exist in a similar format to the examples below: http://www.company.com/products/dlm/hire-ca My first response was that we could put a few descriptive keywords in the url, with something like the following: http://www.company/products/debt-lifecycle-management/hire-collection-agents - I’m worried though that the URL will get too long for any pages sitting under this. As a compromise, I am considering the following: http://www.company/products/dlm/hire-collection-agents My feeling is that the second approach will give the best balance between having the keywords for the products and trying to ensure good user experience. My only concern is whether the /dlm/ category page would suffer slightly, but this would have ‘debt-lifecycle-management’ in the title tag. Does this sound like a good approach to people? Or do you think I’m being a little obsessive about this? Any help would be appreciated 🙂
Intermediate & Advanced SEO | RG_SEO
301 forwarding old urls to new urls - when should you update sitemap?
Hello Mozzers, If you are amending your urls - 301ing to new URLs - when in the process should you update your sitemap to reflect the new urls? I have heard some suggest you should submit a new sitemap alongside old sitemap to support indexing of new URLs, but I've no idea whether that advice is valid or not. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
What is the best way to link between all my portals?
Hi, I own 12 different portals within gambling; they more or less work and feel like this one, Casinotopplisten. What is the best way for me to link between all of them? Since there is a lot going on in Google these days I haven't linked between the sites at all, but I feel that to be somewhat of a waste. So here are my ideas so far, in ranked order: Add a menu at the top right of the site, or footer, that links to the different sites in different languages; the text link would then only be "Norwegian, Swedish, English, etc." Basically the same as above, but in addition linking to the "same page" in the other languages - as all pages have the same article set to start with, this can be done. Don't do any linking between the sites and only link to them separately from our company blog/site. Don't link at all. I should add that all of these sites are on different IPs with different domains and all in different languages. Hope someone can add their 2c on this one. Thanks!
Intermediate & Advanced SEO | MortenBratli
What are the best suites of SEO tools?
I normally use SEOmoz and a bit of SEMrush, but I don't really know much outside of those two. I'm looking to do a review of the big, trustworthy ones - along the lines of: free trial, price vs. value, rank tracking, link-building help, on-page analysis and help, competitor analysis, reports. I heard good things about Raven Tools and Web CEO. I've seen mention of SEO PowerSuite on this forum, but the site looks spammy as hell. Anyone have a view on those 5 tools or any others in a similar vein? Or any other top-line criteria I should be looking at? Cheers
Stephen
Intermediate & Advanced SEO | firstconversion
New AddThis URL Sharing
So, AddThis just added a cool feature that attempts to track when people share URLs by cutting and pasting the address from the browser. It appears to do so by adding a URL fragment to the end of the URL, hoping that the person sharing will cut and paste the entire thing. That seems like a reasonable assumption to me. Unless I misunderstand, it seems like it will add a fragment to every URL (since it's trying to track all of them). Probably not a huge issue for the search engines when they crawl, as they'll hopefully discard the fragment, or discard the JS that appends it. But what about backlinks? Natural backlinks that someone might post to, say, their blog, by doing exactly what AddThis is attempting to track - cutting and pasting the link. What are people's thoughts on what will happen when the search engines crawl that link, fragment included?
Intermediate & Advanced SEO | BedeFahey