Use of Location Folders
-
I'd like to understand the pros and cons of using a location subfolder as an SEO strategy (example: http://sqmedia.us/Dallas/content-marketing.html), where the /Dallas folder holds all of my keyword-rich page titles. The strategy is to get local-SEO benefits from the use of the folder titled /Dallas (a folder which is unnecessary in the overall structure of this site), but how much is this strategy taking away from the page-title keyword effectiveness?
-
Hello SEO5,
Since I'm a virtual business wanting to rank well here in Dallas, as well as build my ranking nationally over time,
your feedback on these two examples would be interesting:
http://sqmedia.us/dallas-tx/customer-experience-optimization.html
http://sqmedia.us/customer-experience-optimization-dallas-tx.html
The first address is the easiest to work with, as I have a keyword in each page URL, although having the keyword closer to the front of the URL might be better for keyword ranking. Any thoughts?
Thanks,
Steve
-
Hi Miriam,
Yes, I have a local number and a unique physical address. I'll change the phone number on the site to the local number instead of the 888 number, and forward to my service. I'll also aim for organic results. I'm a virtual business wanting to rank well here in Dallas where I'm located. Many thanks, Steve
-
Hi Steve,
Do you have a unique physical office and dedicated local phone number in Dallas? This is the only way to go after true local rankings for any service. Without this, your SEO efforts will need to have organic, rather than local, results as a goal.
If you are a virtual business attempting to get some organic traffic for different cities where you have clients, which is what I'm guessing the hope is here, it would be most typical simply to create a city landing page high in the architecture of the site for each target city. So you'd have: mysite.com/dallas-content-marketing-services or mysite.com/austin-content-marketing-services
I don't see a need to put these things in different folders, but I definitely do see a need to be sure you are creating totally unique content for each of these landing pages. That is critical.
Does this help answer your question? If not, feel free to provide further details.
-
Hi Steve,
I'd like to reference the site www.cityfeet.com that I mentioned in one of my earlier posts. This site ranks extremely well for any geo-targeted search for the keyword "office space for lease". Type in "office space for lease dallas" or "office space for lease new york" and they are always in the top search results. Their URL structure is below:
http://www.cityfeet.com/cont/new-york-office-space
http://www.cityfeet.com/cont/tx/dallas-office-space
So it seems like you may have to adopt a dual structure on the site. For the more competitive search terms, you can have the keyword close to the domain name. For the others, you can have a state abbreviation first and the city keyword next.
Related Questions
-
Using # in parameters?
I am trying to understand why a website would use # instead of ? for its parameters. I have put an example of the URL below: http://www.warehousestationery.co.nz/office-supplies/adhesives-tapes-and-fastenings#prefn1=brand&prefn2=colour&prefv1=Command&prefv2=Clear Any help would be much appreciated.
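One practical reason a site uses # here: everything after the hash is a URL fragment, which browsers keep client-side and never send to the server, so faceted filters don't multiply crawlable URLs. A quick illustration with Python's standard `urllib.parse`, using the URL from the question:

```python
from urllib.parse import urlparse, parse_qs

url = ("http://www.warehousestationery.co.nz/office-supplies/"
       "adhesives-tapes-and-fastenings"
       "#prefn1=brand&prefn2=colour&prefv1=Command&prefv2=Clear")

parts = urlparse(url)

# Everything after "#" lands in the fragment, not the query string.
# The fragment is never sent to the server, so the server (and a
# crawler fetching the page) sees only one URL regardless of filters.
print(parts.query)     # "" (empty - there are no ? parameters at all)
print(parts.fragment)  # "prefn1=brand&prefn2=colour&prefv1=Command&prefv2=Clear"

# The fragment still parses like a query string if you need the values:
print(parse_qs(parts.fragment)["prefn1"])  # ['brand']
```

If the same filters were expressed with `?`, each combination would be a distinct server-side URL that crawlers could request and index separately.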
Technical SEO | CaitlinDW
-
Can you use Screaming Frog to find all instances of relative or absolute linking?
My client wants to pull every instance of an absolute URL on their site so that they can update them for an upcoming migration to HTTPS (the majority of the site uses relative linking). Is there a way to use the extraction tool in Screaming Frog to crawl one page at a time and extract every occurrence of href="http://"? I have gone back and forth between using an XPath extractor as well as a regex and have had no luck with either. Ex. XPath: //*[starts-with(@href, "http://")][1] Ex. Regex: href=\"//
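As an illustration of what the extraction needs to match (plain Python here, not Screaming Frog itself), a regex over the fetched HTML can pull every absolute href. The sample HTML below is made up, and the pattern is an assumption about the goal, namely catching both http:// and https:// links while ignoring relative ones:

```python
import re

# Hypothetical page snippet; in practice this would be the HTML
# the crawler fetched for each page.
html = '''
<a href="http://example.com/page">absolute</a>
<a href="https://example.com/secure">absolute https</a>
<a href="/relative/path">relative</a>
<a href="//example.com/protocol-relative">protocol-relative</a>
'''

# Capture href values that start with http:// or https://
absolute = re.findall(r'href="(https?://[^"]+)"', html)
print(absolute)
# ['http://example.com/page', 'https://example.com/secure']
```

On the XPath side, a shape along the lines of `//a[starts-with(@href, 'http://')]/@href` (dropping the `[1]`, which limits the result to the first match) is closer to what a custom extraction usually needs, though the exact syntax accepted is best checked against Screaming Frog's own documentation.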
Technical SEO | Merkle-Impaqt
-
Using the same domain for two websites (for different geographical locations)
Hi all, My client has a new e-commerce site coming out in a few months. His requirement is to use the same domain (let's call it www.domain.com for now) for two separate websites:
The first site, for users with IP addresses from the USA, will include prices in US dollars.
The second site, for users outside of the US, will not include any prices and will have different pages and design. Now, let's say that Googlebot crawls the websites from different IP ranges. How can I make sure a user from France, for example, won't see crawled pages from the US? Sure, once he clicks the result, I can redirect him to a "Sorry, but this content is unavailable in your country" page. The problem is, I don't want a user from France to see in the search results the meta description snippets of pages relevant only to users in the US (in some cases, the snippets may include the prices in $).
Can geotargeting through Webmaster Tools help in this case? I know I can target a part of the website to a specific country (e.g. www.domain.com/us/), but how can I make sure global users won't see the US-targeted pages in the search results? Thanks in advance.
Technical SEO | skifr
-
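Since Webmaster Tools geotargeting works at the folder or subdomain level, a common pattern is to give each audience its own path and pick the variant server-side. A minimal sketch of that selection step, with entirely hypothetical URLs and helper names (a real deployment would resolve the visitor's IP with a geo-IP database such as MaxMind and sit behind a cache that varies on the result):

```python
# Hypothetical folder-per-region variants of the example domain.
US_VARIANT = "https://www.domain.com/us/"
INTL_VARIANT = "https://www.domain.com/intl/"

def choose_variant(country_code: str) -> str:
    """Return the site variant for a visitor's two-letter country code.

    The country code itself would come from a geo-IP lookup on the
    visitor's address; only US visitors get the priced US variant.
    """
    return US_VARIANT if country_code.upper() == "US" else INTL_VARIANT

print(choose_variant("us"))  # https://www.domain.com/us/
print(choose_variant("fr"))  # https://www.domain.com/intl/
```

Keeping each variant on its own crawlable path, rather than swapping content on one URL by IP, is what lets you geotarget the folders separately and avoids showing US-only snippets to everyone.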
Should we re-use our old redirected domain for a new website?
Hi One of our clients has an old domain that has been redirected to another website of his. Now he is asking us to build a new website for that domain and direct it back. This new website will be closely related to the domain's old content and to the site it has recently been redirected to. I guess the only benefit of this would be taking advantage of the age of the domain. Do you recommend doing that, or getting him a new domain?
Technical SEO | Dynamo-Web
-
Is it appropriate to use canonical for a yearly post with similar content?
I've begun writing an annual review of local business directories. The post from 2012 is here: http://web.servicecrowd.com.au/blog/top-10-australian-business-directories-in-2012/ The new 2014 post is here: http://web.servicecrowd.com.au/blog/top-10-australian-business-directories-2014/ Is this an appropriate use? Next year's post will be similar, but with different metrics reported and a slightly different review. Side note: for some reason the post hasn't been indexed by Google yet. Usually new posts are indexed as soon as they are shared on social media.
Technical SEO | ServiceCrowd_AU
-
Sub-domain or sub-folder for a blog?
Traditional thinking suggests sub-domains are treated as separate sites and so don't pass on link juice, but I've heard mixed opinions. I'm very much a believer in sub-folders but I'm interested to hear some other opinions. Thoughts?
Technical SEO | underscorelive
Does using parentheses affect the crawlers?
Quick question: if you use parentheses around a keyword, do search bots still recognize the keyword? For ex: Welcome to a website about the National Basketball Association (NBA). Will the bots recognize that I'm trying to optimize for NBA and not (NBA)? Is this different for tags vs. actual body copy?
Technical SEO | BPIAnalytics
Using Thesis as blog platform vs. Tumblr
I've read about a lot of advantages to using Thesis as a platform for blogging, but I like the themes and other plugins from Tumblr. Are there equivalents at Tumblr to the Thesis benefits, so I can go ahead and go with Tumblr?
Technical SEO | HyperOffice