Local listings squeeze out my SERP :-(
-
Buongiorno from a 9°C, partly cloudy Wetherby, UK.
I've noticed that www.davidclick.com, optimised for "York wedding photographer", has been squeezed off page 1 by local listings, as illustrated here: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/squeezed-out-by-local-listings_zpsd506cd71.jpg
But I do have a local listing, and it's as good as any of the others that have squeezed the site out. So my question is, please:
"Why has davidclick.com been pushed off page 1 by a raft of local listings, despite being reasonably well optimised for local search?"
Any insights welcome :)
Grazie tanto, David
-
You are very welcome.
-
Big Thank you
-
Hi David,
In addition to the typical citation sources (like those listed in the Search Engine Land article), there may be additional places you can list yourself that are specific to your niche. For example, a photographer's association site, or a truly local small directory in your town...things like that. Each vertical may have opportunities that aren't going to be listed in generic lists.
Nyagoslav Zhekov has been doing some interesting pieces on citations at his site. Check this out:
Looks like that piece might be a great fit for you!
-
Thank you so much for this. But may I ask, what does "Additional citation building" mean?
-
Buongiorno to you, David,
Local rankings take a ton of different factors into account: not only your own, but those of all of your competitors as well. A strong local campaign typically involves:
- Developing an awesome, locally-optimized website
- Inclusion in the major local business indexes
- Additional citation building
- Review acquisition, over time
- In competitive arenas, link building
That's just a quick rundown, but you also have to take into account things like proximity to the centroid (how close you are to the center of business), web authority, and so on. Organic and purely local factors all work together.
And there is also the need to evaluate possible problems, such as duplicate or merged listings splitting your ranking power, inconsistent citations, penalties for over-optimization, Google bugs, and so on.
All of these things need to be taken into account, so, as you can imagine, there isn't a simple answer to your question without a couple of hours of deep digging to discover your weaknesses and potential opportunities for improvement.
I recommend that you read:
http://support.google.com/places/bin/answer.py?hl=en&answer=107528
http://www.davidmihm.com/local-search-ranking-factors.shtml
http://searchengineland.com/top-50-citation-sources-for-uk-us-local-businesses-104938
That's a lot to read, but it's a good place to start investigating your possible problems and areas of potential improvement. Hope this helps!
-
That's why it's important to be in the local top 10. Make sure your local listing connects to your website correctly...
Then build the citations... I think Google is going to push local more and more.
-
Thank you, good answer, and yes, I'll get the smiley icon scratched.
-
If you use the MozBar, the site's domain authority is lower than any other result on the first page.
A few things as a side note: I personally would remove the smiley from the title tag, and also work out why your manual meta description is not being pulled through to the search results.
Idea: maybe you could add social buttons to the galleries for each photo, the idea being that people from the wedding like/share/tweet pictures they're in, which generates exposure and social signals. Something I think would be very shareable.
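On the title and meta description points, here is a rough sketch of what the head of the home page might end up looking like; the wording is just a placeholder, not your actual copy:

<head>
  <!-- Title without the smiley, keeping the target phrase up front (placeholder wording) -->
  <title>York Wedding Photographer | David Click</title>
  <!-- A hand-written meta description that should be pulled through to the snippet -->
  <meta name="description" content="Wedding photography in York and across Yorkshire by David Click.">
</head>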
Related Questions
-
Added a rel=canonical tag and SERPs tanked, should we change it back?
My client's CMS uses an internal linking structure that includes index.php at the end of the URLs. The site also works using SEO-friendly URLs without index.php, so the SEO tool identified a duplicate content issue. Their marketing team thought the pages with index.php would have better link equity and rank higher, so they added a rel=canonical tag making the index.php version of the pages the canonical one. As a result, the site dropped in the rankings by a LOT and has not recovered in the last 3 months. It appears that Google had automatically selected the SEO-friendly URLs as canonical, and by switching, it re-indexed the entire site. The question we have is: should they change it back? Or will this cause the site to be re-indexed again, resulting in an even lower ranking?
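(For reference, the tag they added looks roughly like this, with a hypothetical domain and path standing in for the client's real URLs; it declares the index.php version canonical for its clean-URL twin:)

<!-- Placed in the head of http://www.example.com/widgets/ (hypothetical URLs) -->
<link rel="canonical" href="http://www.example.com/widgets/index.php">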
Technical SEO | TienB24
-
Changed all product titles, lost Google schema markup in listings. Temporary?
We changed all of our product titles to be way shorter and less keyword-stuffed last week. Apart from dropping a few places in rank for most keywords (we assume temporarily), that all went fine. What we didn't expect was to lose all the schema data in our Google listings for product pages. Price and review stars are missing. Has anyone seen this before?
Technical SEO | monkeyevil
-
Salvaging links from WMT “Crawl Errors” list?
When someone links to your website, but makes a typo while doing it, those broken inbound links will show up in Google Webmaster Tools in the Crawl Errors section as "Not Found". Often they are easy to salvage by just adding a 301 redirect in the htaccess file. But sometimes the typo is really weird, or the link source looks a little scary, and that's what I need your help with.
First, let's look at the weird typo problem. If it is something easy, like they just lost the last part of the URL (such as www.mydomain.com/pagenam), then I fix it in htaccess this way:
RewriteCond %{HTTP_HOST} ^mydomain.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.mydomain.com$
RewriteRule ^pagenam$ "http://www.mydomain.com/pagename.html" [R=301,L]
But what about when the last part of the URL is really screwed up? Especially with non-text characters, like these:
www.mydomain.com/pagename1.htmlsale
www.mydomain.com/pagename2.htmlhttp://
www.mydomain.com/pagename3.html"
www.mydomain.com/pagename4.html/
How is the htaccess RewriteRule typed up to send these oddballs to the individual pages they were supposed to go to, without the typo?
Second, is there a quick and easy method or tool to tell us if a linking domain is good or spammy? I have incoming broken links from sites like these:
www.webutation.net
titlesaurus.com
www.webstatsdomain.com
www.ericksontribune.com
www.addondashboard.com
search.wiki.gov.cn
www.mixeet.com
dinasdesignsgraphics.com
Your help is greatly appreciated. Thanks! Greg
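One possible shape for the rule being asked about, as a rough sketch using the hypothetical mydomain.com page names above (a starting point, not a definitive fix): match the real page name followed by any stray trailing characters, and 301 everything to the clean URL.

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.com$ [NC]
# pagename1.html-pagename4.html followed by one or more extra characters
# (e.g. "sale", "http://", a quote mark, or a stray slash) redirects to the clean page
RewriteRule ^(pagename[1-4])\.html(.+)$ http://www.mydomain.com/$1.html [R=301,L]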
Technical SEO | GregB123
-
Categories in Places Vs Local
Say you are listed with both Google Places and Google Local. Places still allows custom categories, while Local limits you to preset categories. Which is the better strategy: to build service pages following the custom services available in Places, or to build out service pages following the (allowed) preset categories in Local?
Technical SEO | waynekolenchuk
-
Homepage disappeared from Google SERP
I redirected my domain using this code in .htaccess:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^xxxx.com
RewriteRule (.*) http://www.xxxx.com/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*index\.(html?|php)(\?[^\ ]*)?\ HTTP/
RewriteRule ^(([^/]+/)*)index\.(html?|php)$ http://www.xxxx.com/$1 [R=301,L]
</IfModule>
A day after I did it, I got an error in GWMT ("Google can't find your site's robots.txt") and my homepage disappeared from the result pages. When I try to open the Google cache of the homepage I get a 404 error. I generated a new robots.txt and uploaded it; the error no longer shows, but my homepage is still not in the SERPs. It's been 3 days. What should I do? Thanks in advance.
Technical SEO | digitalkiddie
-
Sitemap.xml appearing in SERP
My sitemap.xml was appearing in the Google SERP for certain keywords (and not my actual on-site page). Please see the image. I recently blocked my sitemap.xml with a robots.txt exclusion, but now the sitemap.xml is not getting crawled in Google Webmaster Tools. Is this the correct method of excluding the sitemap.xml from the SERP?
User-agent: *
Disallow: /assets/cache/
Disallow: /assets/docs/
Disallow: /assets/export/
Disallow: /assets/import/
Disallow: /assets/modules/
Disallow: /assets/plugins/
Disallow: /assets/snippets/
Disallow: /manager/
Disallow: /sitemap.xml
Sitemap: http://bryansryan.ie/sitemap.xml
Any suggestions on what should be done here? Thanks.
Technical SEO | Socialdude
-
What's the website that analyzes all local business submissions?
I was recently looking at a blog post here, or a webinar, and it showed a website where you could see all of the local sites (Yelp, Google Places) where your business has been submitted. It was an automated tool. Does anyone remember the name of the site?
Technical SEO | beeneeb
-
Does having the local area name in a domain affect your results when branching out?
We have a domain which performs well in local search and has good authority and trust, but we are now moving further afield to rank for keywords country-wide. Our current domain contains our local area; does this affect your chances of ranking for broader searches? You don't seem to see many domains with location keywords in them come up for general searches.
Technical SEO | DragonsDesign