Link Building: Location-specific pages
-
Hi! I've technically been a member for a few years, but just recently decided to go Pro (and I gotta say, I'm glad I did!).
Anyway, as I've been researching and analyzing, one thing I noticed a competitor is doing is creating location-specific pages. For example, they've created a page that has a URL similar to this: www.theirdomain.com/seattle-keyword-phrase
They have a few of these for specific cities, and in most cases they rank well for the city-keyword combo. Each city-specific page looks the same, and the content is nearly identical except that they drop in the "seattle keyword phrase" bit here and there.
I noticed that they link to these pages from their site map page, which, if I were to guess, is how SEs are getting to those pages. I've seen this done before on other sites outside my industry too. So my question is, is this good practice or is it something that should be avoided?
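For context on how those pages get discovered: an HTML sitemap page is one way, but an XML sitemap submitted through Webmaster Tools does the same job more directly. A minimal sketch of what that might look like, using hypothetical city URLs modeled on the example above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per city-specific landing page (hypothetical URLs) -->
  <url>
    <loc>http://www.theirdomain.com/seattle-keyword-phrase</loc>
  </url>
  <url>
    <loc>http://www.theirdomain.com/portland-keyword-phrase</loc>
  </url>
</urlset>
```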
-
As stated, having a subdirectory works, but I don't think it gives that much of a benefit over the example you gave. But yes, location and geo-targeting with specific pages can be a great strategy. It works well for me, but I'm a local business, so everything I do is defined by location. What you want to avoid is creating pages with duplicate content just to appear local. Simply swapping out location keywords in the content is not going to give you a sustainable advantage. If you are going to create geo-specific pages, make the content unique to each location. This isn't just good for SEO; it's good for selling and converting as well.
-
Sub domains can also turn into a real mess!
-
That's the right bias to have!
-
Ah, I do see what you mean. Thanks for the input. I tend to stay away from subdomains as a general practice anyway. My own personal bias as a web designer/dev, I think.
-
I agree!
-
Yikes! Who would want to start over with link building to a subdomain!?
-
Angie,
I would have to say this is not a "bad practice." Matt does not say it is bad or spammy, nor does Google. It also really depends on your site structure as to the best way to do this. My site is structured just like this, as are all of my major competitors except for one.
They do use subdomains, for example: Seattle.mydomain.com
And I have to tell you, in my opinion it is not as effective as the way I and many others do it. A good example of what I am saying is the real estate industry. Go to Google and search "seattle homes for rent" or "seattle homes for sale" and you will see what I am talking about. You will also see that one company uses a subdomain plus a directory to target the location for the user's search. The result looks like this:
washington.theirdomain.com/Seattle. In this instance it does work well, but if you do some searches in other major markets, or just some different terms for this industry, you will see that all the big sites have the structure www.theirdomain.com/target-city
And it works well, and has for years. But who knows if Google wakes up tomorrow in a bad mood or not? Good luck!
-
Glad I could help
-
That. Is. Awesome. Thank you. Somehow I missed that video this summer (I subscribe to those Google Webmaster videos).
-
From the Matt Cutts video I saw earlier: http://www.youtube.com/watch?v=c9vD9KGK7G8&feature=player_embedded
It seems like it would be better to put the geo-specific pages in a subdirectory of your website and geo-target it with Webmaster Tools. Then you can start building local, relevant links to that page or directory.