Large-scale geo-targeting?
-
Hi there. We are an internet marketing agency and recently did a fair amount of work trying to optimise for a number of different locations. Although we are based in Preston (UK), we would like to attract clients from Manchester, Liverpool, etc.
We created landing pages for each of the locations that we wanted to target and each of the services - so we had an SEO Manchester page and a Web Design Manchester page for example. These were all written individually by a copywriter in order to avoid duplicate content. An example of one of the first of these pages is here: http://www.piranha-internet.co.uk/places/seo-blackpool.php
We created a 'where we cover' page and used a clickable map, rather than a huge list of text links (which we felt would be spammy), to link through to these pages. You can see this page here: http://www.piranha-internet.co.uk/where-we-cover.php
Initially we gained a great deal of success from this method, with the above Blackpool page ranking #7 for "SEO Blackpool" within a week. However, these results quickly disappeared and now we don't rank at all, though the pages remain in the index. I'm aware that we don't have many external links pointing to these pages, but that alone can't explain why these pages don't rank at all, as some of the terms are relatively non-competitive.
A number of our competitors rank for almost all of these terms, despite their pages being exact duplicates with only the city/town name changed. Any ideas where we've gone wrong?
-
I'm from Burnley originally and I've worked in Blackburn and Manchester previously, but now I live and work in Dublin, Ireland. It's nice to see somebody local on here.
I would suggest social bookmarking the new pages that you have created; it's something so simple, but I think you'll be surprised at what happens. Have you updated your sitemap as well?
-
Thanks for the reply, Glenn. I really can't see why we would have been penalised, as everything we do is above board, although it does seem as if that might be the case. I certainly think the QDF point you make is a valid one, although it could have been around the time of the latest Panda update too, so perhaps that might have flagged something up.
I think our next step might be to recreate the pages from scratch on entirely new URLs and see if that has any effect. We will certainly try and poach some of our competitors' links too!
-
It's possible that your site has been penalized, though in reviewing your OSE (Open Site Explorer) report I don't see many reasons why it would be. From a cursory investigation, I'd say you've done a great job earning the links pointing to your site... though if any trickery was involved, you may have been penalized, so you may want to investigate how to get out of that trap.
I suggest you investigate the link profiles of the competitors who rank for almost all of your targeted terms. If your on-page SEO is truly better than theirs, it's likely that their external link profile is earning them the rankings you desire. Learn from their strategy.
Your initial high rankings could have been related to QDF (Query Deserves Freshness), the temporary boost that freshly published pages can get before rankings settle back to what their links support.
Related Questions
-
Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
Hey guys. Wondering if someone can help diagnose a problem for me. Here's our site: https://www.flagandbanner.com/
We have a fairly large e-commerce site - roughly 23,000 URLs according to crawls using both Moz and Screaming Frog. I have created an XML sitemap (using Screaming Frog) and uploaded it to Webmaster Tools. WMT is only showing about 2,500 URLs from the sitemap as indexed. Further, WMT is showing that Google is indexing only about half (approx. 11,000) of the URLs. Finally, to add even more confusion, a site search on Google (site:) shows only about 5,400 URLs found. The numbers are all over the place!
Here's the robots.txt file:

    User-agent: *
    Allow: /
    Disallow: /aspnet_client/
    Disallow: /httperrors/
    Disallow: /HTTPErrors/
    Disallow: /temp/
    Disallow: /test/
    Disallow: /i_i_email_friend_request
    Disallow: /i_i_narrow_your_search
    Disallow: /shopping_cart
    Disallow: /add_product_to_favorites
    Disallow: /email_friend_request
    Disallow: /searchformaction
    Disallow: /search_keyword
    Disallow: /page=
    Disallow: /hid=
    Disallow: /fab/*

    Sitemap: https://www.flagandbanner.com/images/sitemap.xml

Anyone have any thoughts as to what our problems are? Mike
-
Geo-Targeted Sub-Domains & Duplicate Content/Canonical
For background, the sub-domain structure here is inherited and committed to, due to tech restrictions with some of our platforms. The brand I work with is splitting out their global site into regional sub-sites (not too relevant, but this is in order to display seasonal product in different hemispheres and to link to stores specific to the region). All sub-domains except EU will be geo-targeted to their relevant country. Regions and sub-domains for reference:

AU - Australia
CA - Canada
CH - Switzerland
EU - All Eurozone countries
NZ - New Zealand
US - United States

This will be done with WordPress multisite. The set-up allows us to publish content on one 'master' sub-site and then decide which other sub-sites to 'broadcast' to. Some content is specific to a sub-domain/region, so there's no duplicate issue and we can set the sub-site version as canonical. However, some content will appear on all sub-domains:

au.example.com/awesome-content/
nz.example.com/awesome-content/

Now, the first question: since these domains are geo-targeted, should I just have them all canonical to the version on that sub-domain, or should I still signal the duplicate content with one canonical version?
Essentially the top-level example.com exists as a site only for publishing purposes - if a user lands on the top-level example.com/awesome-content/ they are given a pop-up to select a region and redirected to the relevant sub-domain version. So I'm also unsure whether I want that content indexed at all. I could make the top-level example.com versions of all content the canonical that all others point to, and rely on geo-targeting to have the right links show in the right search locations.
I hope that's kind of clear? Obviously I find it confusing and therefore hard to relay! Any feedback at all gratefully received. Cheers, Steve
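For illustration, the two options come down to where the rel="canonical" link element points on each regional copy. A sketch using the placeholder example.com URLs from the question:

    <!-- Option 1: each geo-targeted sub-domain canonicalises to itself -->
    <!-- on au.example.com/awesome-content/ -->
    <link rel="canonical" href="http://au.example.com/awesome-content/" />

    <!-- Option 2: every regional copy points at one master version -->
    <!-- on both au.example.com/awesome-content/ and nz.example.com/awesome-content/ -->
    <link rel="canonical" href="http://example.com/awesome-content/" />

With option 2, only the canonical target would normally stay indexed, so the regional URLs would tend to drop out of region-specific results, which works against the geo-targeting.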
-
How many keywords should each of my pages realistically be targeting?
Hi all, I run a small bank's website and we're currently in the process of organising a site rebuild. Whilst this will be extensive and have many SEO factors to tick off, my concern right now is to get a "realistic" number of keywords each of my pages should be targeting. For instance, for my car loan page I've done a review in Moz's keyword tool and picked 3 or 4 good keywords - but the problem is there are realistically 7-8 that would suit. Also, this is based on Bing's info only. Can anybody point me in the right direction (or quote me some Google-confirmed resource)? Cheers as always 🙂 Dave
-
Best way to target multiple geographic locations
Hello Mozzers! If you are a service provider wanting to target geographic locations outside of the region where you're physically located, what's the best approach? For example, I have a service provider whose main market is not where they're located - they're based in Devon, UK, yet their main markets are London, Birmingham, Newcastle and Edinburgh. They have clients in all these cities, so I could definitely provide content relevant to each city - perhaps a page for each city detailing work and services (and possibly listing clients). However, does the lack of a physical presence (and local phone number) in these cities make such city pages virtually impossible to rank these days? Does Google require a physical presence/phone number? Thanks in advance, Luke
-
How to add Geo Meta Tags, Dublin Core and Microformats to a WordPress website?
Please let me know how to add geo meta tags, Dublin Core metadata and microformats to a WordPress site, and what to include in each.
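Geo meta tags and Dublin Core are both plain meta elements in the page head, so on a WordPress site they are typically added via the theme's header.php or a plugin that can inject head markup. A minimal sketch - the region code, place name and coordinates below are placeholder values, not something from the question:

    <head>
      <!-- geo meta tags: region is an ISO 3166-2 code, position is "lat;long" -->
      <meta name="geo.region" content="GB-LAN" />
      <meta name="geo.placename" content="Preston" />
      <meta name="geo.position" content="53.7632;-2.7031" />
      <meta name="ICBM" content="53.7632, -2.7031" />
      <!-- Dublin Core elements use the DC. prefix -->
      <meta name="DC.title" content="Example page title" />
      <meta name="DC.description" content="Example page description" />
    </head>

Microformats are different: they live in the body markup (e.g. an hCard wrapped around your address details) rather than in meta tags.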
-
How Do I Generate a Sitemap for a Large WordPress Site?
Hello everyone! I am working with a WordPress site that is in Google News (i.e. every day we have about 30 new URLs to add to our sitemap). The site has years of articles, resulting in about 200,000 pages on the site.
Our strategy so far has been to use a sitemap plugin that only generates the last few months of posts, but we want to improve our SEO and submit all the URLs on our site to search engines. The issue is that the plugins we've looked at generate the sitemap on the fly, i.e. when you request the sitemap, the plugin dynamically generates it. Our site is so large that even a single request for our sitemap.xml ties up tons of server resources and takes an extremely long time (if the page doesn't time out in the process).
Does anyone have a solution? Thanks, Aaron
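One common approach at this scale - a sketch of the general pattern rather than a specific plugin recommendation - is to pre-generate static sitemap files on a schedule (e.g. a nightly cron job) instead of building them per request, splitting the URLs into chunks of up to 50,000 per file (the sitemap protocol's limit) and tying the chunks together with a sitemap index. The filenames below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one child sitemap per chunk of up to 50,000 URLs -->
      <sitemap>
        <loc>https://www.example.com/sitemap-posts-1.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-posts-2.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-news.xml</loc>
      </sitemap>
    </sitemapindex>

Only the newest chunk changes when posts are published, so regeneration stays cheap, and the ~30 daily Google News URLs can live in their own small sitemap.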
-
Could an HTML <select> with large numbers of <option value="<url>"> entries affect my organic rankings?
Hi there, I'm currently redesigning my website, and one particular page lists hotels in New York. Some functionality I'm thinking of adding is to let the user find hotels close to specific concert venues in New York. My current thinking is to provide the following select element on the page - selecting any one of the options will automatically redirect to my page for that concert venue. The purpose of this isn't to affect the organic traffic - I'm simply introducing it as a tool to help customers find the right hotel - but I certainly don't want it to have an adverse effect on my organic traffic. I'd love to know your thoughts on this. I must add that in certain cities, such as New York, there could be up to 450 different options in this select element.

    <select onchange="location=options[selectedIndex].value;">
      <option value="">Show convenient hotels for:</option>
      <option value="http://url1..">1492 New York</option>
      <option value="http://url2..">Abrons Arts Center</option>
      <option value="http://url3..">Ace of Clubs New York</option>
      <option value="http://url4..">Affairs Afloat</option>
      <option value="http://url5..">Affirmation Arts New York</option>
      <option value="http://url6..">Al Hirschfeld Theatre</option>
      <option value="http://url7..">Alice Tully Hall</option>
      ...
    </select>

Many thanks, Mike
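One common way to get the usability without risking the links - a sketch of the general pattern, not something from the thread - is to render the venues as ordinary anchor links and let a script swap them for the select, since crawlers follow href attributes but don't reliably treat option values as links:

    <ul id="venue-links">
      <li><a href="http://url1..">1492 New York</a></li>
      <li><a href="http://url2..">Abrons Arts Center</a></li>
      <!-- ...remaining venues... -->
    </ul>
    <script>
      // progressive enhancement: build the select from the crawlable list,
      // then replace the list with it
      var list = document.getElementById('venue-links');
      var select = document.createElement('select');
      select.appendChild(new Option('Show convenient hotels for:', ''));
      var links = list.getElementsByTagName('a');
      for (var i = 0; i < links.length; i++) {
        select.appendChild(new Option(links[i].textContent, links[i].href));
      }
      select.onchange = function () { if (this.value) location = this.value; };
      list.parentNode.replaceChild(select, list);
    </script>

Users without JavaScript (and crawlers) see the plain links; everyone else gets the dropdown.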
-
Best way to find broken links on a large site?
I've tried using Xenu, but this is a bit time-consuming because it only tells you if the link isn't found and doesn't tell you which pages link to the 404'd page. Webmaster Tools seems a bit dated and unreliable; several of the links it lists as broken aren't. Does anyone have any other suggestions for compiling a list of broken links on a large site?