Large-scale geo-targeting?
-
Hi there. We are an internet marketing agency and recently did a fair amount of work trying to optimise for a number of different locations. Although we are based in Preston (UK), we would like to attract clients from Manchester, Liverpool, etc.
We created landing pages for each of the locations that we wanted to target and each of the services - so we had an SEO Manchester page and a Web Design Manchester page for example. These were all written individually by a copywriter in order to avoid duplicate content. An example of one of the first of these pages is here: http://www.piranha-internet.co.uk/places/seo-blackpool.php
We created a 'where we cover' page and used a clickable map, rather than a huge list of text links (which we felt would look spammy), to link through to these pages. You can see this page here: http://www.piranha-internet.co.uk/where-we-cover.php
Initially we gained a great deal of success from this method - with the above Blackpool page ranking #7 for "SEO Blackpool" within a week. However, these results quickly disappeared and now we don't rank at all, though the pages remain in the index. I'm aware that we don't have many external links pointing to these pages, but this cannot explain why these pages don't rank at all, as some of the terms are relatively non-competitive.
A number of our competitors rank for almost all of these terms, despite their pages being exact duplicates with simply the city/town name being changed. Any ideas where we've gone wrong?
-
I'm from Burnley originally and I've worked in Blackburn and Manchester previously, but now I live and work in Dublin, Ireland. It's nice to see somebody local on here.
I would suggest social bookmarking the new pages that you have created - it's such a simple thing, but I think you'll be surprised at what happens. Have you updated your sitemap as well?
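If it helps, here's a rough Python sketch of the sitemap check I mean - it pulls the sitemap and flags any of the new landing pages that aren't listed. Only the Blackpool URL comes from your post; the sitemap location and the other page URLs are guesses based on the same /places/ pattern, so swap in your real ones.

```python
# Rough sketch: confirm the new landing pages are listed in the XML sitemap.
# Assumptions: the sitemap lives at /sitemap.xml and the other location pages
# follow the same /places/ pattern as the Blackpool one - adjust to suit.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.piranha-internet.co.uk/sitemap.xml"  # assumed location

LANDING_PAGES = [
    "http://www.piranha-internet.co.uk/places/seo-blackpool.php",
    "http://www.piranha-internet.co.uk/places/seo-manchester.php",        # hypothetical
    "http://www.piranha-internet.co.uk/places/web-design-manchester.php", # hypothetical
]

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
listed = {loc.text.strip() for loc in tree.getroot().findall("sm:url/sm:loc", ns) if loc.text}

for url in LANDING_PAGES:
    print(f"{url}: {'in sitemap' if url in listed else 'MISSING from sitemap'}")
```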
-
Thanks for the reply Glenn. I really can't see why we would have been penalised as everything we do is above board, although it does seem as if that might be the case. I certainly think that the QDF point you make is a valid one, although it could also have been around the time of the latest Panda update, so perhaps that flagged something up.
I think our next step might be to recreate the pages from scratch on entirely new URLs and see if that has any effect. We will certainly try to poach some of our competitors' links too!
-
It's possible that your site has been penalized, though in reviewing your OSE report I don't see many reasons why it would be. From a cursory investigation, I'd say you've done a great job earning the links pointing to your site... though if any trickery was involved, you may have been penalized, and you'll want to investigate how to get out of that trap.
I suggest you investigate the link profiles of the competitors who rank for almost all of your targeted terms. If your on-page SEO is truly better than theirs, it's likely that their external link profile is earning them the rankings you desire. Learn from their strategy.
Your initial high rankings could have been related to QDF (Query Deserves Freshness) - the temporary boost fresh pages sometimes get, which fades once the freshness wears off.
Related Questions
-
At-scale way to check content in Google?
Are there any tools people know about that would let me verify Google is seeing all of our content at scale? I know I can take snippets and plug them into Google to see if we are showing up, but this is very time consuming and I want to check across a bulk of pages.
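A minimal sketch of how the snippet check could be scripted rather than done by hand: it fetches each URL, pulls a run of words from the middle of the body copy, and prints the exact-match Google query to run for it. The URL list is a placeholder, and checking the actual results still needs a person or a SERP API, since scraping Google's result pages directly tends to get blocked.

```python
# Minimal sketch: automate the "plug a snippet into Google" check across many pages.
import re
import urllib.parse
import urllib.request

URLS = [
    "https://www.example.com/page-1/",  # hypothetical - replace with your own list
    "https://www.example.com/page-2/",
]

def unique_snippet(html, length=12):
    """Take ~12 words from the middle of the page copy to use as an exact-match query."""
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)   # strip remaining tags
    words = re.sub(r"\s+", " ", text).split()
    middle = len(words) // 2               # body copy in the middle is usually unique
    return " ".join(words[middle:middle + length])

for url in URLS:
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    query = '"' + unique_snippet(html) + '"'
    print(url)
    print("  https://www.google.com/search?q=" + urllib.parse.quote(query))
```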
Intermediate & Advanced SEO | HashtagHustler
-
Large robots.txt file
We're looking at potentially creating a robots.txt with 1,450 lines in it. This will remove 100k+ pages from the crawl that are all old pages (I know the ideal would be to delete/noindex them, but that's not viable unfortunately). Now the issue I'm thinking of is that a large robots.txt will either stop the robots.txt from being followed or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?
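For what it's worth, a quick sanity check before deploying a file that size might look like the sketch below (Python standard library only; the filename and test paths are made up). Two caveats: Google documents a 500 KiB robots.txt size limit, after which rules are ignored - 1,450 short lines should sit well under that - and Python's built-in parser doesn't expand * wildcards and matches rules in file order, so treat this as a rough check for a plain list of Disallow prefixes rather than a perfect emulation of Googlebot.

```python
# Sketch: sanity-check a big robots.txt draft before it goes live.
import os
from urllib.robotparser import RobotFileParser

ROBOTS_PATH = "robots_proposed.txt"  # hypothetical filename for the 1,450-line draft

parser = RobotFileParser()
with open(ROBOTS_PATH) as f:
    parser.parse(f.read().splitlines())

size_kib = os.path.getsize(ROBOTS_PATH) / 1024
print(f"File size: {size_kib:.1f} KiB (Google ignores anything past 500 KiB)")

# A few URLs that should be blocked, and a few that must stay crawlable (all hypothetical).
should_block = ["/old-section/page-1.html", "/old-section/page-2.html"]
must_allow = ["/", "/products/current-range/"]

for path in should_block:
    ok = not parser.can_fetch("Googlebot", path)
    print(f"{path}: {'blocked as intended' if ok else 'WARNING - still crawlable'}")
for path in must_allow:
    ok = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'still crawlable' if ok else 'WARNING - blocked'}")
```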
Intermediate & Advanced SEO | ThomasHarvey
-
Targeting two search terms with same intent - one or more pages for SEO benefits?
I'd like some professional opinions on this topic. I'm looking after the SEO for my friend's site, and there are two main search terms we are looking to boost in search engines. The company sells billboard advertising space to businesses in the UK. Here are the two search terms we're looking to target:
Billboard Advertising - 880 searches P/M
Outdoor Advertising - 720 searches P/M
It would usually make sense to target the keyword "billboard advertising" with its own fully optimised landing page, with more information on the topic and a targeted URL (www.website.com/billboard-advertising/), and to have the homepage target "outdoor advertising", as it's an outdoor advertising agency. But there's a problem: as both search terms are highly related and have the same intent, I'm worried that if we create a separate page to target billboard advertising, it will conflict with the homepage targeting outdoor advertising. Also, the main competitors currently ranked in positions 1-3 are ranking with their home pages, not optimised landing pages targeting the exact search term "billboard advertising". Any advice on this?
Intermediate & Advanced SEO | Jseddon92
-
How many keywords should each of my pages realistically be targeting?
Hi All, I run a small bank's website and we're currently in the process of organising a site rebuild. Whilst this will be extensive and have many SEO factors to tick off, my concern now is to get a "realistic" number of keywords each of my pages should be targeting. For instance, for my car loan page I've done a review with Moz's keyword tool and picked 3 or 4 good keywords - but the problem is there are realistically 7-8 that would suit. Also, this is based on Bing's info only. Can anybody point me in the right direction (or quote me some Google-confirmed resource)? Cheers as always 🙂 Dave
Intermediate & Advanced SEO | CFCU
-
ECommerce keyword targeting: Blog post vs Category page
I'm targeting short-head and chunky-middle keywords to generate traffic to an ecommerce website. I guess I have two options, both with great content: blog posts, or category pages with content (essentially the blog post). On the basis that it is great content that gets links, I would hope that I could garner links into the heart of the ecommerce website by going with option 2: category pages. Any thoughts on blog posts vs ecommerce category pages for targeting keywords?
Intermediate & Advanced SEO | BruceMcG
-
How do you find a truly knowledgeable SEO person to analyze a large site?
We are a large site - 5,600 pages, with local pages in almost every city across the US. We are struggling with page rank on some pages; I don't think it's as simple as backlinks, and it's definitely not poor on-page SEO. I think we might have some truly technical issues that are causing us to get penalized in the SERPs. Are there any agencies that analyze sites? This is NOT a job posting, so please don't send me messages... I truly want to know how/where to find a solution to our problem. Thanks
Intermediate & Advanced SEO | CTSupp
-
Which page to target? Home or /landing-page
I have optimized my home page for the keyword "computer repairs". Would I be better off targeting my links at this page, or at an additional page (which already exists) called /repairs? It's possible to rename and 301 this page to /computer-repairs. The only advantage I can see from targeting /computer-repairs is that the keywords are in the target URL.
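If the page does get renamed and 301'd, a quick scripted check that the redirect is a true 301 pointing at the new URL might look like this sketch (the hostname is a placeholder):

```python
# Sketch: verify that the old /repairs URL now returns a 301 pointing at /computer-repairs.
import http.client

HOST = "www.example.com"  # hypothetical domain - swap in the real site
OLD_PATH = "/repairs"
NEW_PATH = "/computer-repairs"

conn = http.client.HTTPConnection(HOST)
conn.request("GET", OLD_PATH)
response = conn.getresponse()
location = response.getheader("Location", "")
print(f"{OLD_PATH} -> {response.status} {response.reason}")

if response.status == 301 and location.rstrip("/").endswith(NEW_PATH):
    print(f"OK: permanent redirect to {location}")
else:
    print(f"Check this: expected a 301 to {NEW_PATH}, got {response.status} -> {location or '(no Location header)'}")
conn.close()
```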
Intermediate & Advanced SEO | SEOKeith
-
Targeting a site in 3 countries
I have read the SEOmoz post at http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday before asking this question. We received a query from one of our clients regarding targeting his site in 3 different countries, namely the US, UK and Australia. Specifically, he has asked us:
1. Whether he should buy ccTLDs like www.example.co.uk, www.example.com.au and www.example.com, and write unique content for each of the above.
2. Or go for a subfolder approach: www.example.com/UK, www.example.com/AU. Will it affect SEO if the subfolders are in caps?
We would like the advice of the Moz community on which approach will be best. Thanks
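Whichever of the two structures is chosen, each country version should point at the others with hreflang annotations, and lowercase subfolders are the safer habit since URL paths are case-sensitive (so /UK and /uk are technically different URLs). Below is a minimal sketch, assuming the subfolder approach on a placeholder domain, that prints the alternate-link tags one page would need:

```python
# Minimal sketch: generate the hreflang <link> tags for one page under the
# lowercase-subfolder approach. The domain and folder names are placeholders.
BASE = "https://www.example.com"  # hypothetical domain

VERSIONS = {          # hreflang code -> subfolder
    "en-us": "us",
    "en-gb": "uk",
    "en-au": "au",
}

def hreflang_tags(page_path, default_version="en-us"):
    """Every country version of `page_path` should carry this full set of tags."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{BASE}/{folder}{page_path}" />'
        for code, folder in VERSIONS.items()
    ]
    # x-default tells search engines which version to show searchers outside the targeted countries.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{BASE}/{VERSIONS[default_version]}{page_path}" />'
    )
    return tags

for tag in hreflang_tags("/services/"):
    print(tag)
```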
Intermediate & Advanced SEO | seoug_2005