Link Building: Location-specific pages
-
Hi! I've technically been a member for a few years, but just recently decided to go Pro (and I gotta say, I'm glad I did!).
Anyway, as I've been researching and analyzing, one thing I noticed a competitor is doing is creating location-specific pages. For example, they've created a page that has a URL similar to this: www.theirdomain.com/seattle-keyword-phrase
They have a few of these for specific cities. They rank well for the city-keyword combo in most cases. Each city-specific page looks the same and the content is close to being the same except that they drop in the "seattle keyword phrase" bit here and there.
I noticed that they link to these pages from their site map page, which, if I were to guess, is how SEs are getting to those pages. I've seen this done before on other sites outside my industry too. So my question is, is this good practice or is it something that should be avoided?
-
As stated, having a subdirectory works, but I don't think it gives that much of a benefit over the example you gave. That said, location and geo targeting with specific pages can be a great strategy. It works well for me, but I'm a local business, so everything I do is defined by location. What you want to avoid is creating pages with duplicate content just to appear local. Simply swapping the location keywords in and out of the content is not going to give you a sustainable advantage. If you are going to create geo-specific pages, make the content unique to that location. That isn't just good for SEO; it's good for selling and converting as well.
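To make the URL pattern from the original question concrete, here is a minimal sketch (the city list and keyword phrase are hypothetical placeholders, not anything from a real site) of how www.theirdomain.com/seattle-keyword-phrase style slugs might be generated:

```python
import re

def city_page_slug(city: str, keyword_phrase: str) -> str:
    """Build a URL slug like 'seattle-keyword-phrase' from a city and keyword phrase."""
    raw = f"{city} {keyword_phrase}"
    # Lowercase, then collapse any run of non-alphanumeric characters into a hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", raw.lower()).strip("-")
    return slug

# Hypothetical city list for illustration
cities = ["Seattle", "Portland", "San Francisco"]
pages = {city: f"/{city_page_slug(city, 'keyword phrase')}" for city in cities}
```

The slug generation is the easy part; per the answer above, the content behind each of those URLs is what has to be genuinely unique per location.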
-
Sub domains can also turn into a real mess!
-
That's the right bias to have!
-
Ah, I do see what you mean. Thanks for the input. I tend to stay away from subdomains as general practice anyway. My own personal bias as a web designer/dev I think.
-
I agree!
-
Yikes! Who would want to start over with link building to a subdomain!?
-
Angie,
I would have to say this is not a "bad practice." Matt does not say it is bad or spammy, and neither does Google. The best way to do it also really depends on your site structure. My site is structured just like this, as are all of my major competitors except for one.
That one uses subdomains, for example: Seattle.mydomain.com
And I have to tell you, in my opinion it is not as effective as the way I and many others do it. A good example of what I am saying is in the real estate industry. Go to Google and search "seattle homes for rent" or "seattle homes for sale" and you will see what I am talking about. You will also see that one company uses a subdomain plus a directory to target the location of the user's search. The result looks like this:
washington.theirdomain.com/Seattle. In this instance it does work well, but if you do some searches in other major markets, or for some different terms in this industry, you will see that all the big sites use the structure www.theirdomain.com/target-city
And it works well, and always has for years. But who knows whether Google wakes up in a bad mood tomorrow? Good luck!
-
Glad I could help
-
That. Is. Awesome. Thank you. Somehow I missed that video this summer (I subscribe to those Google Webmaster videos).
-
From the Matt Cutts video I saw earlier: http://www.youtube.com/watch?v=c9vD9KGK7G8&feature=player_embedded
It seems like it would be better to put the geo-specific pages in a subdirectory of your website and geo-target it with Webmaster Tools. Then you can start building local, relevant links to that page or directory.
Related Questions
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing (new coupons are added and expired deals are removed automatically). I wish to create a sitemap, but I realised there is not much point in creating one for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include those in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap to get expanded sitelinks.
White Hat / Black Hat SEO | shopperlocal_DM
-
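For the coupon-site sitemap question above: a sitemap is a hint, not an exclusion list, so listing only the static pages does not stop Google from crawling and indexing the rest. A minimal sketch of generating such a sitemap, with hypothetical stand-ins for the static URLs:

```python
from xml.etree import ElementTree as ET

# Hypothetical stand-ins for the ~9 static pages mentioned in the question
STATIC_URLS = [
    "http://couponeasy.com/",
    "http://couponeasy.com/about",
    "http://couponeasy.com/contact",
]

def build_sitemap(urls):
    """Build a minimal sitemap.xml document for a fixed list of URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(STATIC_URLS)
```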
Disavow links leading to 404
Looking at the link-profile anchor text of a site I'm working on, new links keep popping up in the reports with, let's say, very distasteful anchor text. These links are obviously spam and point to old forum pages on the site that don't exist any more, so the majority seem to trigger the 404 page. I understand that a 404 response does not pass any link power, or damage, but given the nature and volume of the sites linking to the domain, would it be a good idea to completely disassociate from and disavow these domains?
White Hat / Black Hat SEO | MickEdwards
-
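For the disavow question above: if you do decide to disavow whole domains, the file Google accepts is plain text with one entry per line. A hypothetical example (the domains are placeholders, not real offenders):

```text
# Spam domains linking to removed forum pages (now 404s)
domain:spam-example-one.com
domain:spam-example-two.net
# Individual URLs can also be listed
http://spam-example-three.org/distasteful-anchor-page.html
```

Whether to disavow at all is a separate judgment call; as the question itself notes, a 404 already stops those links from passing anything.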
How to ignore spam links to page?
Hey Moz pals, for some reason someone is building thousands of links to my website (all spam), likely someone doing negative SEO on my site. All these links point to one sub-URL on my domain. That URL didn't have anything on it, so I deleted the page and now it returns a 404. Is there a way to reject any link that ever gets built to that old page? I don't want all this spam to hurt my website. What do you suggest?
White Hat / Black Hat SEO | WongNs
-
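For the deleted-page question above: one option sometimes suggested is returning "410 Gone" instead of 404 for the removed URL, signalling that the page is permanently gone. A hypothetical Apache .htaccess snippet (the path is a placeholder):

```apache
# Return "410 Gone" for the deleted spam-target page
Redirect gone /old-spam-target-page
```

The disavow tool is the other common answer when the volume of spam links is worrying.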
When to NOT USE the disavow link tool
I'm not here to say this is concrete and should never be done, and if you disagree with me then let's discuss. One of the biggest things out there today, especially after the second wave of Penguin (2.0), is fear-stricken webmasters running straight to the disavow tool after they have been hit with Penguin, or noticed a drop shortly after.

I had a friend whose site never felt the effects of Penguin 1.0, so he thought everything was peachy. Then P2.0 hit and his rankings dropped off the map. I got a call from him that night; he was desperately asking me to review his site and guess what might have happened. He told me the first thing he did was compile a list of websites backlinking to him that might be the issue, create his disavow list, and submit it.

I asked him, "How long did you research these sites before you came to the conclusion they were the problem?"
He said, "About an hour."
Then I asked, "Did you receive a message in your Google Webmaster Tools about unnatural linking?"
He said, "No."
I said, "Then why are you disavowing anything?"
He said, "Um... I don't understand what you are saying."

In reading articles, forums, and even the Moz Q&A here, I tend to think there are some misconceptions about Google's disavow tool that are not clearly explained. Some of my findings on when to use it are purely based on logic, IMO. Let me explain.

When NOT to use the tool:
- You spent an hour reviewing your backlink profile and are too eager to wait any longer to upload your list. Unless you have fewer than 20 root domains linking to you, you should spend a lot more than an hour on that review.
- You DID NOT receive a message from GWT informing you that you had some "unnatural" links (I'll explain why below).
- You did not look at each individual site linking to you and every link that exists. If you spent only a very short time on your backlink profile, you might be using the tool WAY TOO SOON.

The last thing you want to do is disavow a link that is actually helping you. Take the time to really look at each link and ask yourself this question (straight from the Google guidelines): "A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee."

Studying your backlink profile:
We all know when we have cheated; I'm sure 99.9% of us can admit to it at one point. Most of the time I can find backlinks from sites, look right at the owner, and ask, "You placed this backlink, didn't you?" I can see the guilt immediately in their eyes 🙂 Remember, not ALL the backlinks you generate are bad or wrong just because you own the linking site. Before each link you place, ask yourself, "Was this link necessary, and does it apply to the topic at hand?", "Was it relevant?", and, most important, "Is this going to help other users?"

You DID NOT receive a message about unnatural linking:
This is where I think most of the confusion takes place (and please correct me if I am wrong on this). If you did not receive a message in GWT about unnatural linking, then we can safely say that Google has not determined any of your links to be of a spammy nature. So if you did not receive any message, yet your rankings dropped, what could it be? Most likely it is still your backlinks, but more likely the "value" of previous links, which now hold less value or none at all. When that value drops, so does your rank. So what do you do? Build more quality links... and watch your rankings come back 🙂
White Hat / Black Hat SEO | cbielich
-
Are links from directories good or bad?
I've done a lot of competitive link analysis lately and found that a lot of my competitors' links for a certain keyword come from low-quality directory sites, and yet they outrank my site. This leads me to my question, which may or may not have an answer (I at least hope it fuels a good discussion): are links from directory sites good or bad for SEO?
White Hat / Black Hat SEO | TylerReardon
-
Best way to build links?
I want to build high-priority links and some high-PR ones. What tool should I use? I was thinking of using ScrapeBox. Any insights? I already have two high ones, from YouTube and Google +1.
White Hat / Black Hat SEO | Radomski
-
Methods for getting links to my site indexed?
What are the best practices for getting links to my site indexed in search engines? We have been creating content and acquiring backlinks for the last few months, but they are not being found in the backlink checkers or in Open Site Explorer. What are the tricks of the trade for improving the time to indexing of these links? I have read about some RSS methods using WordPress sites, but that seems a little shady and I am sure Google is watching for that now. I look forward to your advice.
White Hat / Black Hat SEO | devonkrusich
-
How Do You Determine If A Link Is Quality?
What tools and signals do you use to determine if a link is quality or not? How can you tell if a link is going to hurt your ranking?
White Hat / Black Hat SEO | anchorwave