Best approach for a client with another site for the same company
-
I have a client who has an old website; Company A handles the SEO campaign for that site.
My client wanted us to create a new website with unique content for the same company, aiming to double his chances of ranking on the first page of the SERPs and eventually dominating it.
So we created the new site for him and handled its SEO campaign. So far we are ranking decently in the search engines, but we feel we could do better. The site we are optimizing for him uses the same company name, the same tracking number, and a virtual address in the same city.
Do you think Google has a problem with this setup?
We have listed the new site in the citation directories, but I'm worried that we are sending Google mixed signals. The company has two listings in each directory: one for the old site and another for the new site.
Another thing: the Google+ Local page for the new site is created and verified, but it is not showing up in the local pack.
What is the best way to approach this mess?
We are looking to rank in both local and organic results.
-
Hi Adam,
Good thinking. Hope it works out well. Situations like these can be layers deep and murky - tough to sort out. Wishing you the best!
-
Thank you Miriam.
You are right, I'll need to have a heart-to-heart talk with my client to sort out these issues.
-
First, if the tracking number that you use for the citations varies from the number listed on the new website, the old website, and/or the old citations, this is a problem. Trust me: I am currently handling a client who wanted to rank locally for a city in which they did not have a business location. Without consulting us, they set up an ad for that city with a tracking number, BUT it was still associated with the same business name and address. I don't know if you are aware, but these different directories scrape information from each other, and as a result new listings get created with inconsistent business information. It's not pretty, and not only is this confusing to a potential client looking for your business, but you have significantly decreased your chances of appearing in Google's 7-pack.
Here are a few resources that I refer back to, but I would start with the Google Places Guidelines:
http://getlisted.org/resources/why-citations-are-important.aspx
http://www.davidmihm.com/local-search-ranking-factors.shtml
- Pay attention to questions 5 & 7
http://www.seomoz.org/blog/40-important-local-search-questions-answered
- This is a follow-up to a Mozinar that is totally worth watching, too.
Hope this helps!
-
Hi Adam,
We need to back up here to this:
"The site we are optimizing for him uses the same company name, the same tracking number, and a virtual address in the same city."
and this:
"We have listed the new site in the citation directories...Another thing, Google+ Local for the new site is created and verified but is not showing up in local pack."
You have some root issues going on here. Both virtual offices and call tracking numbers for Local businesses are taboo in Google's local products. For legitimate participation in Google+ Local, your client needs to have:
- Face-to-face transactions with customers, either at the place of business (like a restaurant) or at the customers' locations (like a plumber).
- A unique, physical street address (not a virtual office, P.O. box, or shared address).
- A unique local area code phone number (not a toll-free, call-tracking, or shared number).
If the client cannot meet all 3 of the above criteria, then he is not suitable for inclusion in Google's local products, and he is not appropriate for a local citation building campaign.
So, this is actually the issue that needs to be sorted out first. Whether your client's failure to show up in the local results is due to a penalty stemming from Google considering the listing to be spam, or stems from other issues, is somewhat moot here, because the business model you are describing does not sound truly local to me.
For further reading, I recommend that both you and the client study Google's Places Quality Guidelines:
http://support.google.com/places/bin/answer.py?hl=en&answer=107528
In those guidelines you will see precisely why the business model you are describing is problematic.
Regarding the call tracking phone number element, read:
http://searchengineland.com/for-local-seo-lack-of-call-tracking-solution-spawns-cloaking-70198
Read that post and all of the links in it, as well, for full information on the history of issues surrounding call tracking numbers in the world of Local SEO.
My feeling is that a wrong estimation of this client's opportunities may have been made here, and that Local SEO is being pursued in vain until he can meet those requirements. Having 2 websites now in existence for the client is only going to compound the issues.

I never recommend double sites for local business owners, but where there is some reason why they feel they MUST have more than one website, I advise them to make sure that their NAP (name-address-phone) is only published on one of the websites. Everything hangs on NAP in Local, and if you're telling Google that both www.johntheplumber.com and www.sandiegoplumber.com are located at 123 First St., San Diego, CA, this will confuse Google and potentially lead to duplicated listings and ranking drops.

Total clarity and consistency of data are vital to any Local SEO campaign, but in this case, your first step is going to be to assess the client's actual business model and then determine whether they have a legitimate place in the local index or need to pursue purely organic SEO due to a lack of the elements essential to local inclusion.
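To make the one-site NAP rule easy to audit, here is a minimal sketch in Python (the URLs, business name, and phone number are all hypothetical, and the phone regex only covers common US formats) that checks which pages actually publish a given NAP:

```python
import re
import requests

# Hypothetical canonical NAP -- substitute the client's real details.
CANONICAL_NAME = "John the Plumber"
CANONICAL_PHONE = "6195550123"  # digits only, for comparison

# Hypothetical pages to audit: both websites plus any citation pages.
PAGES = [
    "https://www.johntheplumber.com/contact",
    "https://www.sandiegoplumber.com/contact",
]

# Matches US-style numbers such as (619) 555-0123 or 619.555.0123.
PHONE_PATTERN = re.compile(r"\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

def digits_only(text: str) -> str:
    """Strip everything except digits so formatting differences don't matter."""
    return re.sub(r"\D", "", text)

for url in PAGES:
    html = requests.get(url, timeout=10).text
    has_name = CANONICAL_NAME.lower() in html.lower()
    phones_found = {digits_only(match) for match in PHONE_PATTERN.findall(html)}
    has_phone = CANONICAL_PHONE in phones_found
    print(f"{url}: name found={has_name}, canonical phone found={has_phone}")
```

Per the advice above, the full NAP should turn up on only one of the two websites, and it should match exactly (after normalization) everywhere else it appears.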
Hope this helps!
-
Hi Amber,
For the new site we use the same company but a different location (a virtual address), plus a local tracking number for the citations.
Do you think Google can tell whether it's a virtual address or not, and if it can, is that going to have a negative effect on our rankings?
Also, what I meant by "local pack" is the local map listings that show up together with the organic search results.
As for Google Maps, I searched my client's company name and we aren't showing up there either.
-
Our company also has two websites. We launched the second website for different reasons than your client: we sell power tools and power tool parts, and with the second website we want to focus more on parts.
The phone number and address are the same on both sites. We also have many products that are available on both sites.
When we decided to launch the second website, we thought that having the same address and phone number could be a problem for our rankings. Now, 4 months later, I don't think it has caused any problems. The first website, which we have had for years, is still ranking well and improving, and our new website is also doing well and progressing fast.
We tried to avoid duplicate content, so the product descriptions, About Us page, and blog entries are different for each website. We also didn't include the new site in the local directories.
In my humble opinion, as long as you have different content on each website, the phone number and address won't be a problem.
-
Hi Adam,
When you say you have listed the new site in the citation directories, are you using the same business name, address, and phone number? What is the difference in the business information on these listings?
From my understanding, if these citations have mixed information for the same business, you have set yourself up for duplicate listings. Google uses the information from these sources to validate the business name, address, phone number, and business details. If Google is seeing two different listings for the same company on these citation sources, how will it know which one to trust enough to show in its local/map results?
In the end, I think your ranking power for Local is going to be (or is already being) greatly divided.
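To illustrate how mixed data fragments trust, here is a small Python sketch (the listing data is made up) of the kind of normalization an aggregator might apply before deciding whether two directory listings describe the same business:

```python
import re

def normalize(listing: dict) -> tuple:
    """Reduce a listing to a comparable key: collapsed lowercase name,
    alphanumeric-only address, and digits-only phone."""
    name = re.sub(r"\s+", " ", listing["name"].lower().strip())
    address = re.sub(r"[^a-z0-9]", "", listing["address"].lower())
    phone = re.sub(r"\D", "", listing["phone"])
    return (name, address, phone)

# Hypothetical listings for the same company on two directories.
old_site_listing = {
    "name": "Acme Plumbing",
    "address": "123 First St, San Diego, CA",
    "phone": "(619) 555-0123",
}
new_site_listing = {
    "name": "Acme Plumbing",
    "address": "456 Virtual Ave, Suite 9, San Diego, CA",
    "phone": "619-555-0199",  # tracking number
}

if normalize(old_site_listing) == normalize(new_site_listing):
    print("Listings agree: one business, one identity.")
else:
    print("Mismatch: the directories now hold two conflicting identities.")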
Also, when you say the G+ Local listing that you created for the new site is not showing up in the local pack: what are you considering the "local pack"? Or are you saying that the listing is not even appearing when you search for it in Maps?
Best,
Amber
-
Hi Adam,
This is my first time hearing about this approach. In my opinion, it seems like your client is trying to game the search engines by creating 2 sites and hoping one of them will rank better. I don't think search engines will favor anyone gaming the system.
Why doesn't your client spend the time and money he put into creating and ranking the new site on the old site instead?
He could use that time and money to build more quality links and produce more content; then he will definitely rank higher, instead of starting a new website and doing everything from scratch.
My 2 cents. Hope this helps.
Related Questions
-
Best SEO Strategy
Hi fellow Mozers: I have a question about strategy. I have a client who is a major real estate developer in our region. They build and sell condominiums and also built and manage several major rental apartment properties. All rental properties have their own websites, and there is also a corporate website, which has been around for many years and has decent domain authority (+/- 40). The original intent of the corporate website was to communicate central brand positioning points, attract investors, and offer individual profiles of all major properties.

My client is interested in developing an organic search strategy which will reach consumers looking to rent apartments. Typical search strings would include the family whose core string would be 'apartments in Baltimore.' (Currently, the client runs PPC for each one of their properties. This is expensive and highly competitive.) In doing research, we've found that there are two local competitors who are able to break onto Page 1 and appear beside the national 'apartment search guides' who dominate the Page 1 SERPs (like apartments.com). The two local competitors have websites of either the same or lower authority than our client's; one has a better link profile, the other is comparable.

Here's our problem: our local competitors only build and manage apartments, so their home pages and all the content of their sites ONLY talk about apartment rental related information. Our client's apartment business is actually larger in scope than either local competitor, but it is only one of their major real estate verticals. So my question is this: if we want to build out a bunch of content which will rank competitively with our local competition, are we better off creating a new area of the corporate site, with targeted content and resources appropriate for apartment seekers, OR would we be better off creating an entirely new site devoted to the same? I'm wondering if a new section will ever rank well against competitors whose root domains only feature rental-related content. Likewise, I'm wondering whether we'd be giving up too much, in terms of authority, by creating an entirely new site. I've also only found examples in the industry where an entirely new site was created, which makes me question the strategy of building out a rental-specific section of a site which also contains information about their condo business. For instance, the Related Companies are a huge builder in the East; they have a corporate site and a site called https://relatedrentals.com. Any feedback would be greatly appreciated!
Intermediate & Advanced SEO | Daaveey
-
How much SEO damage would it do to have a subdomain site rather than a directory site?
Hi all! A colleague and I were arguing about what is better: having a subdomain or a directory. Let me explain the cases some more: a multi-language site, where en.domain.com or es.domain.com rather than domain.com/en/ or domain.com/es/; a mobile and desktop version, m.domain.com or domain.com rather than domain.com/m or just domain.com; and multiple-location websites, as you might figure. The discussion started with me saying: it's better to have a directory site. And my colleague said: it's better to have a subdomain site. Some of his reasons were that big companies (such as WordPress) are doing that, and that it's better for the business. My reasons are fully based on this post from Rand Fishkin: Subdomains vs. Subfolders, Rel Canonical vs. 301, and How to Structure Links for SEO - Whiteboard Friday. So, what does the community have to say about this? Who should win this argument? GR.
Intermediate & Advanced SEO | Gaston Riera
-
New site. How important is traffic for a new site? And what about domain age?
Hi guys. I've been building a new site because I've seen a real SEO opportunity out there. I'm a mixing professional by trade, so I wanted to take advantage of SEO to help gain more work. Here's the site: www.signalchainstudios.co.uk. I'm curious about domain age. This site is fairly well optimised for my keywords, and it has pretty good content (I think so, anyway). But it's nowhere to be seen in the SERPs at all. Is this just a domain age issue? I'd have thought it might be in the top 50, because my site's services are not hard to rank for at all! Also, what about traffic? Does Google want to see an 'active' site before it considers 'promoting' it up the ranks? Or are backlinks and good content the main factors in the equation? Thanks in advance. I love this community to bits 🙂 Isaac.
Intermediate & Advanced SEO | isaac663
-
3 WordPress sites and 1 Tumblr site coming under 1 domain (4 subdomains) on WPMU: proper redirects?
Hey guys, witnessSF.org (WP), witnessLA.org (Tumblr), witnessTO.com (WP), witnessHK.com (WP), and witnessSEOUL.com (new site, no redirects needed) are being moved over to sf.ourwitness.com, la.ourwitness.com, and so forth, all under one large WordPress MU instance. Some have hundreds of articles/links, others a bit less. What is the best method to take? I understand there are easy blanket redirects, and the complete, fully manual, one-link-at-a-time approach. Even from WP to WP, the permalinks are changing from domain.com/date/post-name to domain.com/post-name. Here are some options: 1) Just redirect all previous witnessla.org/* URLs to la.ourwitness.com/ (automatically pointing all pages to the home page): easiest, but not the best. 2) Pull the top URLs from Google Analytics (about 50 URLs have significant rankings and traffic, in LA's sample) and just redirect those to custom links: the most bang for the buck, with the articles that rank manually set up to point to the correct place. 3) The best of both worlds may be possible? Automated, perhaps? I prefer working with .htaccess vs a redirect plugin for speed reasons. Please advise. Thanks guys!
Intermediate & Advanced SEO | vmialik
-
Following Penguin 2.0 hit in May, my site experienced another big drop on August 13th
Hi everyone, my website experienced a 30% drop in organic traffic following the Penguin 2.0 update in May. This was the first significant drop that the site had experienced since 2007, and I was initially concerned that the new website design I released in March was partly to blame. On further investigation, many spammy sites were found to be linking to my website, and I immediately contacted the sites and asked for the removal of the links, before submitting a disavow file to Google. At the same time, I've had some great content written for my website over the last few months, which has attracted over 100 backlinks from some great websites, as well as lots of social media interaction. So, while I realise my site still needs a lot of work, I do believe I'm trying my best to do things in the correct manner.

However, on August 11th, I received a message in Google WMTs: "Googlebot found an extremely high number of URLs on your site". I studied the table of internal links in WMTs and found that Google has been crawling many URLs throughout my site that I didn't necessarily intend it to find, i.e. lots of URLs with filtering and sorting parameters added. As a result, many of my pages are showing in WMTs as having over 300,000 internal links!! I immediately tried to rectify this issue, updating the parameters section in WMTs to tell Google to ignore many of the URLs it comes across that have these filtering parameters attached. In addition, since my access logs were showing that Googlebot was frequently crawling all the URLs with parameters, I also added some Disallow entries to robots.txt to tell Google and the other spiders to ignore many of these URLs. So, I now feel that if Google crawls my site, it will not get bogged down in hundreds of thousands of identical pages and will just see those URLs that are important to my business.

However, two days later, on August 13th, my site experienced a further huge drop, so it has now dropped by about 60-70% of what I would expect at this time of the year! (There is no sign of any manual webspam actions.) My question is: do you think the solutions I've put in place over the last week could be to blame for the sudden drop, or do you think I'm taking the correct approach, and that the recent drop is probably due to Google getting bogged down in the crawling process? I'm not aware of any subsequent Penguin updates in recent days, so I'm guessing that this issue is somehow due to the internal structure of my new design. I don't know whether to roll back my recent changes or just sit tight and hope that it sorts itself out over the next few weeks when Google has more time to do a full crawl and observe the changes I've made. Any suggestions would be greatly appreciated. My website is ConcertHotels.com. Many thanks, Mike
Intermediate & Advanced SEO | mjk26
-
Best way to SEO crowdsourcing site
What is the best way to SEO a crowdsourcing site? The website's content is entirely propagated by its users.
Intermediate & Advanced SEO | StreetwiseReports
-
Sites banned from Google?
How do you find out which sites are banned from Google? I know how to find out which sites are no longer cached, but is that the same thing as being deindexed? As always, I appreciate your advice, everyone.
Intermediate & Advanced SEO | pauledwards
-
Targeting a site at 3 countries
I have read the SEOmoz post at http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday before asking this question. We received a query from one of our clients regarding targeting his site at 3 different countries, namely the US, the UK, and Australia. Specifically, he has asked us: 1. whether he should buy ccTLDs like www.example.co.uk, www.example.com.au, and www.example.com, and write unique content for each of the above, or 2. go for the subfolder approach: www.example.com/UK and www.example.com/AU. Will it affect SEO if the subfolders are in CAPS? We would like to have the advice of the Moz community on which approach will be the best. Thanks
Intermediate & Advanced SEO | seoug_2005