Local landing pages struggling with rankings although I've done most things needed. Any ideas?
-
Hi Mozzers,
I am wondering if someone could advise whether there's anything obvious here as to why my local landing pages are struggling rankings-wise, even though I have done all of the following: http://goo.gl/Lr4HXa
I am trying to rank for "garden tool hire Bristol" on my landing page. The main category page is garden tool hire.
- Consistent NAP citations.
- Local branch address on the page, in the title tag and H1 tag; the address is also in the on-page content, which is unique.
- Schema.org markup has been set up with the address in it as well.
- Pagination is set up and the view-all page has a canonical tag pointing to page 1.
- Speed not an issue as this is a fast site.
- Currently all the product links on the page are H3 tags, but I've seen this on lots of other sites.
- All my NAP citations point to the parent branch pages, although I don't have any individual deep links pointing to this page.
- Unique Content
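For reference, the schema.org setup I mentioned is along the lines of the JSON-LD sketch below. The NAP details here are placeholders rather than my actual branch data:

```html
<!-- Hypothetical LocalBusiness JSON-LD for a branch landing page;
     all NAP values below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Tool Hire - Bristol",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Bristol",
    "postalCode": "BS1 1AA",
    "addressCountry": "GB"
  },
  "telephone": "+44 117 000 0000"
}
</script>
```

The name, address, and phone in the markup match the NAP used in my citations exactly.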
I currently don't have internal links to relevant articles from my blog on these pages, as I have those on my main category landing page, as you can see here - http://goo.gl/sO9A9U - but I can add them to all my location-specific landing pages as well if you think it would help.
Any thoughts greatly appreciated
Pete
-
Content needs to be vastly different, not just slightly varied.
This can be painful, and take a lot of time and creativity to figure out how to write the same 300-500 words in a different way.
I wrote a personal blog post not too long ago on ways to write content for location pages: http://doyledigital.com.au/content-for-location-pages/
-
My pleasure, Pete, and don't overlook the nice medium of the blog for continuing to showcase your involvement in various cities. I think Google is still very much in love with fresh content.
-
Many thanks Andy and Miriam, I think you may well be right. Whilst this technique was extremely successful for me a couple of years back, Google changes have stopped this method performing as well, so it may be perceived as spammy in their eyes. Whilst I've got unique content etc., I could still be getting affected by some form of algorithmic penalty.
Will have to look at how to restructure things.
thanks
Pete
-
Hi Peter,
My own preference for doing this type of Local SEO/copywriting is to structure sites like this:
- A page for every office, or a page for every city served
- A page for every service
I feel like once you get into trying to cover every possible city/service combo on landing pages, there can be some danger of creating thin or throwaway content. Unless your city A lawn mowing service is somehow totally different than your city B lawn mowing service, you probably shouldn't be creating these pages. Instead, have a page for city A, a page for city B and a page for lawn mowing.
I'm not saying that what I'm describing is the only way to do this - just that it's my personal preference for small-to-medium businesses.
-
-
My competitors do it and they do okay rankings-wise, hence I thought it might be a page-specific issue.
Thanks for your input though; I will try and see what can be changed.
Pete
-
"although the products are the same on the pages."
This is what is going to prevent you from reaching these goals. Google isn't going to rank two pages the same when the only difference between them is the basics. It isn't enough for them.
There is usually a lot more to look at in these circumstances if you want to start ranking for location phrases, because what you are doing there alone isn't enough, and it is more likely to cause problems than help you gain rankings.
-Andy
-
Hi Andy,
Yes, I see what you mean, but I have a main category page and then branch-specific landing pages for those categories, each with unique content.
The title tags etc. all have the location in them to make them unique, although the products are the same on the pages.
I don't see how else one can rank location-specific pages for their categories, as it's too competitive to try and compete with the majors, so local search for each branch is a better way forward if I can get it to work.
Thanks
Pete
-
Hi Pete,
Just a very quick observation: you might be running into pages that are quite similar:
- http://www.bestathire.co.uk/branches/bristol-tool-hire-shop
- http://www.bestathire.co.uk/garden-tools-bristol
Then you also have...
Pages 2 & 3 appear to be exactly the same, but with a different title, heading, etc. Google won't thank you for that.
I would spend some time thinking about the phrases you wish to rank for and making sure your pages are all suitably different in content and offerings.
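If two near-identical pages have to stay live while you restructure, one option is to pick the version you want to rank and point a canonical at it from the other. A sketch only, assuming (purely for illustration) that the garden-tools page is the one you'd keep:

```html
<!-- Illustrative only: placed in the <head> of the duplicate/secondary
     page, this declares the preferred page as canonical, consolidating
     signals onto it. Which page is "preferred" is your call. -->
<link rel="canonical" href="http://www.bestathire.co.uk/garden-tools-bristol" />
```

That's a stopgap rather than a fix, though - the differentiated content is what matters.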
-Andy