Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Can you set up a Google Local account under a PO Box?
-
I have a client who wants a Google Local listing in a town he serves but where he has no physical location. Is it an issue to share an address with an existing company? Is it better to use a P.O. Box, or is there an address-forwarding company? Is this considered a black hat local SEO tactic?
-
Along those same lines, does anyone else know of any good services that will provide physical business addresses in certain locations? I know of some virtual office providers, but have only found ones that cover major cities, not smaller ones.
-
Here are the quality guidelines for Google Local Listings. Regarding P.O. Boxes, Google says: "Do not create listings at locations where the business does not physically exist. P.O. Boxes are not considered accurate physical locations. Listings submitted with P.O. Box addresses will be removed."
http://www.google.com/support/places/bin/answer.py?hl=en&answer=107528&rd=1
However, they do now offer a way to list service areas for a business. Information about how to do this is available on the Google page for service areas.
http://www.google.com/support/places/bin/answer.py?hl=en&answer=177103
-
You can't go the P.O. Box route. Google is cracking down on local listings, and you need a physical address. It's not a problem to share the address if you have a different suite number or something similar. There are also companies you can pay for a virtual office address in major cities, and you could technically use that. Remember that you will need to verify the listing; sometimes Google will not let you do it by phone and will only offer the postcard route, so you will need to monitor the mail at that location.
-
http://www.google.com/support/forum/p/maps-archive/thread?tid=4fa32594aadb2e9f&hl=en
It looks like you'll have to use the real address!
Related Questions
-
Can the disavow tool INCREASE rankings?
Hi Mozzers, I have a new client whose link profile contains some spammy links that should be disavowed. They rank on the first page for some longer-tail keywords. However, we're aiming at shorter, well-known keywords where they aren't ranking. Will the disavow tool alone be able to increase rankings (assuming on-site/off-site signals are better than the competition's)? Thanks, Cole
White Hat / Black Hat SEO | ColeLusby -
How does Google determine if a link is paid or not?
We are currently doing some outreach to bloggers to review our products and provide us with backlinks (preferably followed). The bloggers get to keep the products (usually about $30 worth). According to Google's link schemes guidelines, this is a no-no. But my question is, how would Google ever know if the blogger was paid or given freebies for their content? This is the "best" article I could find on the subject: http://searchenginewatch.com/article/2332787/Matt-Cutts-Shares-4-Ways-Google-Evaluates-Paid-Links The article tells us what qualifies as a paid link, but it doesn't tell us how Google identifies whether links were paid or not. It also says that "loans" are okay, but "gifts" are not. How would Google know the difference? For all Google knows (maybe everything?), the blogger returned the products to us after reviewing them. Does anyone have any ideas on this? Maybe Google watches for terms like "this is a sponsored post" or "materials provided by 'x'". Even so, I hope that wouldn't be enough to warrant a penalty.
White Hat / Black Hat SEO | jampaper -
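On the "how would Google know" question above: disclosure phrases like the ones the poster mentions are trivially machine-detectable. A minimal sketch in Python of that kind of footprint scan - the phrase list is purely illustrative speculation, not Google's documented method:

```python
# Illustrative footprint scan: flags pages whose text contains common
# paid/sponsored disclosure phrases. This is speculation about the kind
# of signal a search engine COULD use, not Google's actual algorithm.
DISCLOSURE_PHRASES = [
    "this is a sponsored post",
    "materials provided by",
    "product was provided free of charge",
    "in exchange for a review",
]

def looks_sponsored(page_text: str) -> bool:
    """Return True if any disclosure phrase appears in the page text."""
    text = page_text.lower()
    return any(phrase in text for phrase in DISCLOSURE_PHRASES)

print(looks_sponsored("Disclosure: this is a Sponsored Post about widgets."))  # True
print(looks_sponsored("We bought this widget ourselves and loved it."))        # False
```

Phrase matching is only one plausible signal, of course; link-graph patterns (many unrelated blogs all linking with the same anchor text) would not need any on-page disclosure at all.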
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website from the homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567 Here's a sample category page: http://domain.com/city/area Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the product pages even though they are dynamic?
White Hat / Black Hat SEO | esiow2013 -
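A side note on the Screaming Frog test above: a crawler fetching a page only proves the URL is reachable and linked; it says nothing about indexing. One cheap extra check is whether robots.txt allows Googlebot to fetch the URL, which the Python standard library can evaluate offline. A hedged sketch, using the question's hypothetical URLs and an invented robots.txt:

```python
from urllib import robotparser

# Evaluate whether Googlebot may fetch the dynamic product URL under a
# hypothetical robots.txt. Parsing a local string avoids any network call.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Dynamic, ID-based URLs are crawlable like any other URL as long as
# they are linked internally and not disallowed.
print(rp.can_fetch("Googlebot", "http://domain.com/AB1234567"))   # True
print(rp.can_fetch("Googlebot", "http://domain.com/private/x"))   # False
```

Whether the page is then *indexed* also depends on meta robots tags, canonicals, and content quality, none of which a fetch test covers.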
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
I will temporarily remove a few pages from my old website and redirect them to the new domain, but on the staging subdomain. Once the redirection is verified, I will remove the redirection rules from my .htaccess and bring the removed pages back live. Thanks in advance!
White Hat / Black Hat SEO | esiow2013 -
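For what it's worth, since the scenario above is explicitly temporary, a 302 (temporary) redirect is the safer choice than a 301: crawlers may cache a 301 and treat it as a permanent signal. A minimal .htaccess sketch for the described setup - the page paths are placeholders, and it assumes mod_rewrite is enabled:

```apache
# Hypothetical rules for the scenario described: temporarily send a few
# old pages to a staging subdomain. R=302 marks the redirect temporary;
# L stops further rule processing for the matched request.
RewriteEngine On
RewriteRule ^old-page-1$ https://staging.newdomain.com/old-page-1 [R=302,L]
RewriteRule ^old-page-2$ https://staging.newdomain.com/old-page-2 [R=302,L]
```

It is also worth keeping the staging host itself out of the index (e.g. an `X-Robots-Tag: noindex` header on the staging vhost) so the test pages never compete with the live site.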
Website not listing in Google - Screaming Frog shows 500 error? What could the issue be?
Hey, http://www.interconnect.org.uk/ - the site seems to load fine, but for some reason it is not getting indexed. I tried running the site through Screaming Frog, and it gives a 500 error code, which suggests it can't access the site. I'm guessing Google is having the same problem. Do you have any ideas as to why this may be and how I can rectify it? Thanks, Andrew
White Hat / Black Hat SEO | Heehaw -
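A 500 for a crawler while the site "loads fine" in a browser often comes down to request headers: some servers or security modules reject unfamiliar User-Agent strings. A quick way to compare responses under different user agents, sketched in Python with the standard library (the User-Agent strings are illustrative):

```python
import urllib.error
import urllib.request

def build_request(url: str, user_agent: str) -> urllib.request.Request:
    # Attach a custom User-Agent so the same URL can be fetched as a
    # "browser" and as a "crawler" for comparison.
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def fetch_status(url: str, user_agent: str) -> int:
    # Return the HTTP status code; HTTPError carries codes like 500.
    try:
        with urllib.request.urlopen(build_request(url, user_agent), timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code

# Example (no network needed just to build the request):
req = build_request("http://www.interconnect.org.uk/", "Screaming Frog SEO Spider")
print(req.get_header("User-agent"))  # Screaming Frog SEO Spider
```

If `fetch_status` returns 200 for a browser-like string but 500 for the crawler string, a server or firewall rule keyed on User-Agent is the likely culprit, and Googlebot may well be hitting the same rule.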
A Branded Local Search Strategy utilizing Microsites?
Howdy Moz, Over and over we hear of folks using microsites in addition to their main brand to target keyword-specific niches. The main point of concern most folks have is either duplicate content or being penalized by Google, which is also our concern. However, in one of our niches we notice a lot of competitors have set up secondary websites to rank in addition to their main website (basically taking up more room in the SERPs). They are currently using different domains, on different IPs, on different servers, etc. We verified this because we called, and they all rang through to the same competitors. So our thought was: why not take the fight to them (so to speak), but with a branding and content strategy? The company has many good content pieces that we can utilize, like company mottos, mission statements, special projects, and community outreach, that can be turned into microsites with unique content. Our strategy idea is to take a company called "ACME Plumbing" and brand for specific keywords with locations, like sacramentoplumberwarranty.com, where the site's content revolves around plumber warranty info, measures of a good warranty, plumbing warranty news (newsworthy issues), blogs, RCS - you get the idea - and send both referral traffic and links to the main site. The idea is to then repeat the process with another company aspect, like napaplumbingprojects.com, where the content of the site is focused on cool projects, images, RCS, etc., again referring traffic and link juice to the main site. We realize this adds to the amount of RCS that needs to be done, but that's exactly why we're here. Also, any thoughts on intentionally tying the brand to the location so you get URLs like acmeplumbingsacramento.com?
White Hat / Black Hat SEO | AaronHenry -
Can a Page Title be all UPPER CASE?
My client wants to use UPPER CASE for all his page titles. Is this okay? Does Google react badly to it?
White Hat / Black Hat SEO | petewinter -
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOMOZ Community, Our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOMoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty. As quick background information: The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es <keyword_german>.de <keyword_us>.com</keyword_us></keyword_german></keyword_spanish> The content is highly targeted to each market, which means there are no duplicate content pages across the domains; all copy is translated, content reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same. An SEO agency of ours was using semi-automated link-building tools in mid-2010 to acquire link partnerships. There are some promotional one-way links to sports-betting and casino sites positioned on the pages. The external linking structure is very keyword- and main-page-focused, i.e. 90% of external links point to the front page with one particular keyword. All sites have strong domain authority and have been running under the same owner for over 5 years. As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are: Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools? What is the most likely cause of our penalty given the background information? Given the drops started in November 2010, we doubt the Panda updates had any correlation to this issue. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links. Are there any other factors/metrics we should look at to help troubleshoot the penalties? After all this time without resolution, should we move to two new domains and forward all content as 301s to the new pages? Are there things we should try first? Any help is greatly appreciated. SEOMoz rocks. /T
White Hat / Black Hat SEO | tomypro