SEO for location outside major city
-
Hello,
I'm hoping to get some opinions on optimising a site for a client based 30 minutes outside of Dublin. Obviously there is higher search volume for "x in Dublin" than for "x in small town". What do you think the best strategies are for incorporating "Dublin" into keywords? For example, is it OK to use phrases like "x near Dublin" or "x in Greater Dublin", or do you think this is a bit misleading?
The client in question sells goods online, so the customer wouldn't physically have to visit the store.
Thanks!
-
Glad to be of help, Alice, and good luck!
Miriam
-
Thanks for your replies Adam and Miriam.
You have confirmed what I was thinking already - that I could optimise for phrases like "x near Dublin", but it will be very hard to rank higher than competitors who are physically based within the city limits.
-
Hello Alice,
If you were attempting to do Local SEO for this business, then the answer would be 'no, you should not attempt to optimize for Dublin as your main target.' The reason for this is that local results are based on the validity of the physical address and local phone number.
Your case sounds somewhat different, however. If no customers come to the business, then it won't qualify as a local business in the eyes of Google. Because of this, yes, you can certainly optimize for phrases like 'near Dublin' if you feel this will assist the client in some way. What you should not expect from the work, though, is to outrank true local businesses in local results, since they have physical locations in Dublin and are able to take advantage of the various local options, such as Google+ Local.
Hope this helps!
Miriam
-
Personally, I don't see any problem with you identifying your business as Dublin-based. You are clearly close enough to the city to service it, so I don't see a problem here.
You often see businesses citing their location as London, or as having a London office, when they are technically not in London but in one of the neighbouring counties such as Surrey, Middlesex, Essex or Kent. As long as you are close enough to the main city, I see no problem in claiming that city as your location. Others may disagree, but that's what I think, and in practice many businesses already do this.
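To make the "near Dublin" approach concrete, here is a hypothetical sketch of how such a phrase might be worked into the title tag and meta description. The business name, product and county are all made up for illustration:

```html
<head>
  <!-- Hypothetical example: working "near Dublin" into on-page elements -->
  <title>Handmade Furniture near Dublin | Example Co., Co. Kildare</title>
  <meta name="description"
        content="Example Co. makes handmade furniture 30 minutes from Dublin,
                 with free delivery across the Greater Dublin area.">
</head>
```

This states the real location honestly while still associating the page with the "near Dublin" phrase, which avoids the "misleading" concern raised in the question.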
Related Questions
-
Best seo benefit location ( main page text or h1 , h2)?
I have learned that h1 has more value than h2, and h2 has more than h3. But let's say I want to place my keywords somewhere: should I include them in the main body, or should I take advantage of the header tags?
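As an illustration of the trade-off in the question, a page can do both: put the keyword in a heading and repeat it naturally in the body copy. The phrases below are placeholders, not a recommendation for any specific site:

```html
<!-- Illustrative only: keyword placement in headings vs. body copy -->
<h1>Blue Widgets in Dublin</h1>
<p>Intro copy that mentions blue widgets naturally, not stuffed.</p>

<h2>Why choose our blue widgets?</h2>
<p>Supporting copy can repeat the phrase where it reads naturally.</p>
```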
White Hat / Black Hat SEO | Sam09schulz0
-
Would this be duplicate content or bad SEO?
Hi Guys, We have a blog for our e-commerce store. We have a full-time in-house writer producing content. As part of our process, we do content briefs, and as part of the brief we analyze competing pieces of content existing on the web. Most of the time, the sources are large publications (i.e. HGTV, elledecor, apartmenttherapy, Housebeautiful, NY Times, etc.). The analysis is basically a summary/breakdown of the article, and is sometimes 2-3 paragraphs long for longer pieces of content. The competing content analysis is used to create an outline of our article, and incorporates most important details/facts from competing pieces, but not all. Most of our articles run 1500-3000 words. Here are the questions: Would it be considered duplicate content, or bad SEO practice, if we list sources/links we used at the bottom of our blog post, with the summary from our content brief? Could this be beneficial as far as SEO? If we do this, should we nofollow the links, or use regular dofollow links? For example: "For your convenience, here are some articles we found helpful, along with brief summaries." I want to reuse as much of the content that we have spent time on as possible. TIA
White Hat / Black Hat SEO | kekepeche1
-
Negative SEO yes/no?
We receive links from fake websites. These websites are copies of real websites that link to us, but sometimes the links are changed; for example, one link is called 'tank weapon with hitler', and we are an insurance comparison website (a bit off topic). The real websites that link to us are copied and placed on .ga, .tk, etc. domains. For example: wahlrsinnsa.ga, loungihngsa.ga, pajapritosa.cf, rgeitsportsa.cf, sospesvoasa.tk. I have received spam links on other domains with comment spam etc. before, and that doesn't really do anything, but in this case we really suffer in our rankings (from position 1 to 5, etc.). Not sure if this is negative SEO and if this is really the reason we lost some rankings, but it's a bit of a coincidence that the domains appear in Google Webmaster Tools in the same period we suffer a downgrade in our rankings. My question: Is this negative SEO, or is it something automatic? And do I need to disavow the links/domains? The real versions of the websites (on other domains with .nl) give the website authority.
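If disavowing does turn out to be necessary, Google's disavow file is a plain-text list with one URL or `domain:` entry per line and `#` for comments. A sketch using the domains named in the question:

```text
# Disavow file sketch - domains listed in the question above
domain:wahlrsinnsa.ga
domain:loungihngsa.ga
domain:pajapritosa.cf
domain:rgeitsportsa.cf
domain:sospesvoasa.tk
```

Using `domain:` rather than individual URLs covers every page on each spam domain, which is usually what you want for sites like these.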
White Hat / Black Hat SEO | remkoallertz0
-
Old subdomains - what to do SEO-wise?
Hello, I wanted the community's advice on how to handle old subdomains. We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org. As these pages are not actively updated, they are triggering lots and lots of errors in the site crawl (missing meta descriptions, and much much more). We do not have particular intentions of keeping them up to date in terms of SEO. What do you guys think is the best option for handling these? I considered de-indexing, but the content of these pages is still relevant and may be useful - yet it is not up to date and never will be again. Many thanks in advance.
White Hat / Black Hat SEO | e.wel0
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus increase advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking in Google's cache and the errors flagged up in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article. My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages and it seems that Google also only reads the first article, which seems like an ideal solution. This obviously has the additional benefit of speeding up loading time of the page too. My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on how to implement infinite scrolling https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html by using prev and next tags for pagination https://support.google.com/webmasters/answer/1663744?hl=en. However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles? Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/ Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
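For reference, the pagination annotations discussed above (Google's rel prev/next guidance at the time) would look something like this on a middle page of a series; the URLs here are hypothetical, not VentureBeat's actual markup:

```html
<!-- Hypothetical <head> markup for page 3 of a paginated article feed -->
<link rel="prev" href="http://example.com/feed/page/2/">
<link rel="next" href="http://example.com/feed/page/4/">
```

The questioner's doubt is whether this markup, designed for an ordered series, is appropriate when the "next page" is simply an unrelated article in a feed.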
White Hat / Black Hat SEO | Daniel_Morgan1
-
What could go wrong? SEO on mobile site is different than desktop site.
We have a desktop site that has been getting worked on over the year regarding improving SEO. Since the mobile site is separate, the business decided to not spend the time to keep it updated and just turned it off. So any mobile user that finds a link to us in search engines, goes to a desktop site that is not responsive. Now that we're hearing Google is going to start incorporating mobile user friendliness into rankings, the business wants to turn the mobile site back on while we spend months making the desktop site responsive. The mobile site basically has no SEO. The title tag is uniform across the site, etc. How much will it hurt us to turn on that SEO horrid mobile site? Or how much will it hurt us to not turn it on?
White Hat / Black Hat SEO | CFSSEO0
-
Same content, different target area SEO
So ok, I have a gambling site that I want to target for Australia, Canada, the USA and England separately, and still have .com for worldwide (or not, read further). The website's content basically stays the same for all of them, with perhaps just small changes to layout and information order (a different order for the top 10 gambling rooms). My question 1 would be: How should I mark the content for Google and other search engines so that it would not be considered "duplicate content"? As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:
1. A separate Webmaster Tools account for every domain - we will need to set up the targeting to the specific country in it.
2. Use the hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains; more info about it: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the co.uk is from a different C-class than the one for the .com.
Is there anything I am missing here? Question 2: Should I target .com for the USA market, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
-
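For what it's worth, the hreflang annotations mentioned in point 2 of the multi-TLD question above would look something like this in the head of each country version; the domains are placeholders:

```html
<!-- Sketch of hreflang annotations across country-targeted duplicates -->
<link rel="alternate" hreflang="en-GB" href="http://example.co.uk/">
<link rel="alternate" hreflang="en-AU" href="http://example.com.au/">
<link rel="alternate" hreflang="en-CA" href="http://example.ca/">
<link rel="alternate" hreflang="en-US" href="http://example.us/">
<link rel="alternate" hreflang="x-default" href="http://example.com/">
```

The same full set of annotations would appear on every version, each page also pointing at itself, which is how the tags tell search engines the pages are deliberate country-targeted alternates rather than accidental duplicates.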
Is it outside of Google's search quality guidelines to use rel=author on the homepage?
I have recently seen a few competitors using rel=author to markup their homepage. I don't want to follow suit if it is outside of Google's search quality guidelines. But I've seen very little on this topic, so any advice would be helpful. Thanks!
White Hat / Black Hat SEO | smilingbunny0
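For context on the rel=author question above, authorship markup of that era pointed a page at an author's Google+ profile; a hypothetical example (the profile URL is a placeholder):

```html
<!-- Hypothetical rel=author markup linking a page to a Google+ profile -->
<link rel="author" href="https://plus.google.com/your-profile-id/">
```

The markup itself was the same wherever it was used; the question is whether applying it to a homepage, rather than an individual authored article, fits Google's intent for it.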