Link Building: Location-specific pages
-
Hi! I've technically been a member for a few years, but just recently decided to go Pro (and I gotta say, I'm glad I did!).
Anyway, as I've been researching and analyzing, one thing I noticed a competitor is doing is creating location-specific pages. For example, they've created a page that has a URL similar to this: www.theirdomain.com/seattle-keyword-phrase
They have a few of these for specific cities. They rank well for the city-keyword combo in most cases. Each city-specific page looks the same and the content is close to being the same except that they drop in the "seattle keyword phrase" bit here and there.
I noticed that they link to these pages from their site map page, which, if I were to guess, is how SEs are getting to those pages. I've seen this done before on other sites outside my industry too. So my question is, is this good practice or is it something that should be avoided?
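For reference, whether it's an HTML sitemap page or an XML sitemap submitted to search engines, those city pages would typically be listed one per URL. A minimal sketch of the XML version (the domain and slugs here are made up, mirroring your example) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per city-specific landing page (hypothetical URLs) -->
  <url><loc>http://www.theirdomain.com/seattle-keyword-phrase</loc></url>
  <url><loc>http://www.theirdomain.com/portland-keyword-phrase</loc></url>
  <url><loc>http://www.theirdomain.com/tacoma-keyword-phrase</loc></url>
</urlset>
```

Once the pages are in a sitemap like this, crawlers can reach them even if they aren't linked prominently in the main navigation, which matches what you're seeing.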
-
As stated, having a subdirectory works, but I don't think it gives much of a benefit over the example you gave. That said, location and geo-targeting with specific pages can be a great strategy. It works well for me, but I'm a local business, so everything I do is defined by location. What you want to avoid is creating pages with duplicate content just to appear local. Simply swapping out the location keywords in the content is not going to give you a sustainable advantage. If you are going to create geo-specific pages, make the content unique to each location. That isn't just good for SEO; it's good for selling and converting as well.
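To put a rough number on "close to being the same": a quick way to gauge how near-duplicate two city pages are is a plain sequence-similarity check. This is just a sketch with made-up page copy, not anyone's actual content or a substitute for how Google measures duplication:

```python
from difflib import SequenceMatcher

def page_similarity(text_a: str, text_b: str) -> float:
    """Rough 0.0-1.0 similarity between two pages' body text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical city-page copy that differs only by the city name
seattle = "Looking for fast, affordable widget repair in Seattle? Our Seattle team offers same-day widget repair and free quotes."
portland = "Looking for fast, affordable widget repair in Portland? Our Portland team offers same-day widget repair and free quotes."

print(f"{page_similarity(seattle, portland):.2f}")  # very high: these pages are near-duplicates
```

If a pair of your geo pages scores close to 1.0 like this, that's a sign the content is templated rather than genuinely local.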
-
Subdomains can also turn into a real mess!
-
That's the right bias to have!
-
Ah, I do see what you mean. Thanks for the input. I tend to stay away from subdomains as general practice anyway. My own personal bias as a web designer/dev I think.
-
I agree!
-
Yikes! Who would want to start over with link building to a subdomain!?
-
Angie,
I would have to say this is not a "bad practice". Matt doesn't call it bad or spammy, and neither does Google. The best way to do it also really depends on your site structure. My site is structured just like this, as are all of my major competitors except for one.
They use subdomains, for example: Seattle.mydomain.com
And I have to tell you, in my opinion that is not as effective as the way I and many others do it. A good example of what I mean is the real estate industry. Go to Google and search "seattle homes for rent" or "seattle homes for sale" and you will see what I'm talking about. You will also see that one company uses a subdomain plus a directory to target the location in the user's search. The result looks like this:
washington.theirdomain.com/Seattle. In this instance it does work well, but if you do some searches in other major markets, or try some different terms in this industry, you will see that all the big sites use the structure www.theirdomain.com/target-city
It works well, and it has for years. But who knows whether Google wakes up tomorrow in a bad mood? Good luck!
-
Glad I could help
-
That. Is. Awesome. Thank you. Somehow I missed that video this summer (I subscribe to those Google Webmaster videos).
-
From the Matt Cutts video I saw earlier: http://www.youtube.com/watch?v=c9vD9KGK7G8&feature=player_embedded
It seems like it would be better to put the geo-specific pages in a subdirectory of your website and geo-target it with Webmaster Tools. Then you can start building local, relevant links to that page or directory.