Use of Location Folders
-
I'd like to understand the pros and cons of using a location subfolder as an SEO strategy (example: http://sqmedia.us/Dallas/content-marketing.html), where the /Dallas folder holds all of my keyword-rich page titles. The strategy is to get local-SEO benefits from the folder titled /Dallas (a folder which is unnecessary in the overall structure of this site), but how much is this strategy taking away from the page-title keyword effectiveness?
-
Hello SEO5,
Since I'm a virtual business wanting to rank well here in Dallas, as well as build my ranking nationally over time,
your feedback on these two examples would be interesting:
http://sqmedia.us/dallas-tx/customer-experience-optimization.html
http://sqmedia.us/customer-experience-optimization-dallas-tx.html
The first URL is the easiest to work with, as I still have a keyword in each page URL, although placing the keyword closer to the front of the URL might be better for keyword ranking. Any thoughts?
Thanks,
Steve
-
Hi Miriam,
Yes, I have a local number and a unique physical address. I'll change the phone number on the site to the local number instead of the 888 number, and forward to my service. I'll also aim for organic results. I'm a virtual business wanting to rank well here in Dallas where I'm located. Many thanks, Steve
-
Hi Steve,
Do you have a unique physical office and dedicated local phone number in Dallas? This is the only way to go after true local rankings for any service. Without this, your SEO efforts will need to have organic, rather than local, results as a goal.
If you are a virtual business attempting to get some organic traffic for different cities where you have clients, which is what I'm guessing the hope is here, it would be most typical simply to create a city landing page high in the architecture of the site for each target city. So you'd have: mysite.com/dallas-content-marketing-services or mysite.com/austin-content-marketing-services
I don't see a need to put these things in different folders, but I definitely do see a need to be sure you are creating totally unique content for each of these landing pages. That is critical.
Does this help answer your question? If not, feel free to provide further details.
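To make the structure above concrete, here is a minimal sketch of how those flat, city-specific landing-page URLs could be generated. The city list, service name, and `city_landing_slug` helper are hypothetical illustrations, not anything from Moz or the original site; only mysite.com and the Dallas/Austin examples come from the answer above.

```python
# Hypothetical sketch: one keyword-rich landing page per target city,
# kept high in the site architecture (a single path segment deep),
# matching the mysite.com/dallas-content-marketing-services pattern.

def city_landing_slug(city: str, service: str) -> str:
    """Build a flat slug like 'dallas-content-marketing-services'."""
    parts = city.lower().split() + service.lower().split()
    return "-".join(parts)

cities = ["Dallas", "Austin", "Fort Worth"]  # invented target-city list
urls = [
    f"https://mysite.com/{city_landing_slug(city, 'content marketing services')}"
    for city in cities
]
# urls[0] -> "https://mysite.com/dallas-content-marketing-services"
```

The slugs are interchangeable templates, but as noted above, the page content behind each one must be written uniquely per city.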
-
Hi Steve,
I'd like to reference www.cityfeet.com, a site I mentioned in one of my earlier posts. This site ranks extremely well for any geo-targeted search for the keyword "office space for lease". Type in "office space for lease dallas" or "office space for lease new york" and they are always in the top search results. Their URL structure is below:
http://www.cityfeet.com/cont/new-york-office-space
http://www.cityfeet.com/cont/tx/dallas-office-space
So it seems like you may have to adopt a dual structure on the site. For the more competitive search terms, you can place the keyword close to the domain name. For the others, you can use a state abbreviation followed by the city keyword.
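That dual structure can be sketched in a few lines. This is only an illustration of the idea: the `page_path` helper and the `competitive` flag are invented for this example, and how you decide which terms count as competitive (search volume, keyword-difficulty tools, etc.) is up to you. The two path shapes mirror the cityfeet.com URLs quoted above.

```python
# Hypothetical sketch of the "dual structure" idea: competitive terms get a
# keyword-first slug close to the domain, while less competitive terms get a
# state-abbreviation folder followed by the city slug.

def page_path(city: str, state: str, keyword: str, competitive: bool) -> str:
    slug = "-".join(f"{city} {keyword}".lower().split())
    if competitive:
        return f"/cont/{slug}"               # e.g. /cont/new-york-office-space
    return f"/cont/{state.lower()}/{slug}"   # e.g. /cont/tx/dallas-office-space

print(page_path("New York", "NY", "office space", competitive=True))
print(page_path("Dallas", "TX", "office space", competitive=False))
```

Running the two calls prints `/cont/new-york-office-space` and `/cont/tx/dallas-office-space`, reproducing both shapes of the cityfeet.com structure.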