Best URL structure
-
I am making a new site for a company that services many cities.
I was thinking of a URL structure like this:
website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5.
Would this be the best approach to optimize the site for the keyword plus five different cities,
as long as I keep the total URL under the SEOmoz-recommended 115 characters?
Or would it be better to build separate pages for each city, rewording the main service descriptions to avoid duplicate content?
-
Joseph
I'd make each page totally unique to the city. Don't worry about or focus on whether you'll get a penalty. Make the pages so useful for the user that a sentence or two of overlap won't hurt you. Useful, to me, means information that's entirely specific to what the page is "about". A page with some testimonials, an in-depth case study, photos, and practical information about that location should really deliver and give people what they need to know about the services there!
-Dan
-
Hello Handcrafter,
Thank you for your reply.
We have one long-tail keyword per page but service about 15 cities from only 3 offices,
so my dilemma is that we need to attract business from multiple cities.
I'm just trying to get advice on the best way to do this:
having one main service page and listing all the cities served in the description and content,
or making a page for each city with case studies and testimonials unique to that city but the same basic description of the service.
In your experience, how much shared content would make it look like duplicate content?
-
Hello Dan,
Thank you.
We do have many testimonials and case studies for each city, which would be unique content for that city.
So we could describe the service in general terms, which would have to be basically the same on each page, but add 2 or 3 testimonials and 5 or 6 case studies related to that service for that city's customers.
Is that shared description going to trigger a duplicate content issue?
Would that be judged on the percentage of total content, or would having one paragraph that's basically the same get us penalized?
-
Thank you for all the input - some great help for sure.
After reading some answers, I was thinking to myself, what was I thinking...lol
I should have been a little clearer about the keywords.
Keywords 1, 2, and 3 are one long-tail keyword, not 3 separate keywords.
Example: shiny blue widget.
My client has a company that travels across a big area to do service calls for this shiny blue widget.
So we want the company to rank in multiple cities. There are 3 offices servicing 15 cities.
@Matt - I totally agree about the URL.
-
Right on the money, Matt, thank you!
So now the challenge is somehow creating separate pages about each location that are not duplicates and are actually useful.
Not sure what the niche is, but let's say you're targeting a city the business is not actually located in. How could you make a page about that? You could make a page all about that town, with case studies of clients from that town and testimonials from customers there. The company could hold or sponsor an event in that town, or publish a piece like "The State of [company type] in [town]". Perhaps that will get some ideas going.
-Dan
-
Hi Joseph - What will people be searching for? Does each keyword have its own unique focus? It seems you may want to create unique pages covering each keyword or group of keywords that stand alone. Within that content you might list locations or services unique to each city. If the services are all the same, could you make one unique page describing those services? That way you could simply list the cities and use text links to the services page. Good luck - Handcrafter
-
If you saw that address in a search result, would you actually click it?
I'd say cramming that many keywords into a URL would send a bad signal anyway. More to the point, though, it's going to look like someone jumped into Marty McFly's DeLorean and came back with a boot full of spam from the late 1990s.
Ranking is NOT the most important thing (even if this would help, which I doubt). If the listing looks poor quality, that ranking will bring less traffic. Less traffic means less money.
I would much rather see a short URL without the keywords, and use the keywords in the title. Better still, break it up into every page that makes logical sense, and give each one an appropriate URL and matching content. There is no "trying" to avoid duplicate content, though - you have to avoid it.
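To make that concrete - purely as an illustration, reusing the "shiny blue widget" example from this thread and invented city names rather than anything from the actual site - a structure along these lines keeps each URL short and gives every page one clear job:
website.com/shiny-blue-widget/ (main service page, with the long-tail keyword in the title and content)
website.com/shiny-blue-widget/springfield/ (city page: that city's case studies, testimonials, and coverage details)
website.com/shiny-blue-widget/rivertown/ (and so on for each of the 15 cities)
Each city page would share only a short service summary; everything else on it stays specific to that location.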