Can one business operate under more than one website?
-
Is it possible for a business to rank organically for the same keyword multiple times with different web addresses? Say I sell car keys and want to rank for "buy new car keys", so I set up two different websites, say ibuycarkeys.com and carkeycity.com, and operate under both of them. Would Google frown upon this?
-
My pleasure, Carla! So glad to help.
-
Hi Miriam,
I really appreciate all the references and info. This is the kind of stuff that makes debates end quickly.
As for the EMD, we are actually thinking about rebranding, since when we started our company an EMD still had all the benefits.
Thanks for all the tips.
Carla
-
Hi Carla,
Here's a shortie-but-goodie from Barry Schwartz on this topic:
http://www.seroundtable.com/google-one-site-locations-15454.html
Note the quote from Googler John Mu on that one.
And here is Google and Your Business forum Top Contributor Linda Buquet's educated opinion on this:
What the client needs to understand is that:
-
Their local business can have only 1 Google+ Local listing, linking to a single domain. If Google finds the business name attached to multiple websites, Google will be confused and lack 'trust' in the data cluster they create for the business. Similarly, if any other element of the business' core NAP (name-address-phone) is found on more than one website, this will cloud Google's understanding of the business. This can lead to accidental duplicate listing creation and ranking problems.
-
Your client will be splitting up their authority across multiple domains instead of building great authority on a single domain, where every action taken goes toward strengthening the brand.
-
Let's not forget Google's big recent targeting of EMDs. Though we didn't see drastic effects from this in Local, we all have received fair warning from the EMD penalty that Google is down on thin content, exact match domain sites. What I see in Local is a single business owner publishing thin and duplicate content on a set of domains like sanfranciscoplumber.com, sanjoseplumber.com, sanrafaelplumber.com, etc., and I believe Google has made it pretty clear that this type of activity is under scrutiny. I think there are definite risks associated with a multi-site approach.
-
And let's consider how this looks to the most important audience - potential customers. All local businesses must work to develop an authoritative, memorable brand that comes to mind instantly when a service is needed. If my hot water heater stops working, what is that brand, that domain name? Is it sanjoseplumber.com, sanrafaelplumber.com? I can't remember. But if it's StanislovPlumbing.com - an honest representation of the business name that matches branding - and I've used their services before, my chances of remembering/recognizing them are much higher. To me, this is a very strong argument against splitting up brand/authority across multiple sites.
These are just a few reasons. I could likely come up with more, but honestly, I can't think of a single instance in which I would recommend that a small local business owner try to operate multiple websites. It is completely possible to rank well for a variety of service/geo terms on a single website, given the right approach. Good luck in educating your client about this, Carla. Feel free to share this post with him, as well as the links I've provided.
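To make that single-site point a little more concrete, here is a tiny sketch of the kind of single-domain URL structure I mean. The domain, services, and cities are just illustrative placeholders borrowed from the examples above, and each service/city page would of course need genuinely unique, useful content to earn its rankings.

```python
# Sketch: one way to lay out service/geo landing pages on a single domain
# instead of spinning up sanjoseplumber.com, sanrafaelplumber.com, etc.
# The domain, services, and cities below are illustrative placeholders.

DOMAIN = "https://stanislovplumbing.com"  # the single branded domain from the example above
SERVICES = ["water-heater-repair", "drain-cleaning"]
CITIES = ["san-jose", "san-rafael", "san-francisco"]

def landing_pages(domain, services, cities):
    """Build one landing page URL per service/city combination."""
    return [f"{domain}/{service}/{city}/" for service in services for city in cities]

if __name__ == "__main__":
    for url in landing_pages(DOMAIN, SERVICES, CITIES):
        print(url)
```

Every one of those pages lives under the same brand and keeps strengthening the same domain, which is the opposite of the thin multi-domain setup described above.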
-
-
Hi Miriam,
I have a client who has a local business and really wants me to create multiple websites to go after different keywords. I have advised him not to do this, but he keeps insisting. Can you recommend any great articles from a well-known source talking about this? Does Matt Cutts have anything on this issue?
Thanks
Carla
-
Good discussion going on here, and I thought I would add that if the business is local in nature, rather than virtual, I strongly recommend against a multi-site approach. I wanted to clarify this in case members who own a local business take a look at this thread.
-
Hi Greg,
Yes, you are right, and we actually put this forward to the client; as we saw the situation, there were actually more benefits to doing it that way, if nothing more than the fact that it's all under the one "domain".
But after we established with the client that we would need two sites, be it two separate domains or sub-domains, the client actually insisted that they wanted the two-domain option.
We found it difficult to put a concrete case in front of the customer to justify going down the sub-domain route, considering they were so adamant about using two separate domains.
The customer is king, and they pay the bills, so we rolled with them on the two domains ...
To be fair, though, I would be interested in how the sub-domain option would have worked. We had planned to use the main domain as more of an information portal and utilize it for targeting their key phrases; it would have left us a lot more room to be versatile with the actual content, as with all e-Commerce stores.
It's definitely the more conventional approach people take ... but there is nothing conventional about this client.
John
-
Great insight, John, thanks for the words of wisdom.
What are your thoughts regarding sub-domains? You could also have created a sub-domain for each country and geo-targeted it accordingly.
Example:
www.website.com (Global)
uk.website.com (UK Only)
au.website.com (Australia Only)
I'm not sure whether sub-domains or new domains carry more weight, but it makes sense to keep the site and brand intact as a whole rather than creating new domains and just geo-targeting each for its respective country/audience.
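Just to make that concrete, here's a minimal sketch of the hreflang annotations a sub-domain split like that would typically carry. The sub-domains are the hypothetical ones from the example above, and the exact language/region codes would depend on the actual markets.

```python
# Sketch: generate hreflang <link> tags for a hypothetical sub-domain setup.
# The sub-domains are the example ones from this thread, not a real site.

SUBDOMAIN_TARGETS = {
    "https://www.website.com/": "x-default",  # global fallback
    "https://uk.website.com/": "en-gb",       # UK only
    "https://au.website.com/": "en-au",       # Australia only
}

def hreflang_tags(targets):
    """Return the alternate link tags each page version should carry."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for url, lang in targets.items()
    ]

if __name__ == "__main__":
    for tag in hreflang_tags(SUBDOMAIN_TARGETS):
        print(tag)
```

Each country version of a page would carry the full set of tags pointing at its equivalents on the other sub-domains, plus the x-default fallback.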
Greg
-
Well, on the contrary, I am not so sure the term passé comes to mind ... Different situations call for different approaches; you have to be versatile in this game, and there is no sure-fire remedy for each and every client or project.
We have an e-Commerce client who wants to target both the UK and Irish markets with immediate effect, and after a lot of consideration, and due to the nature of their business, we and they decided to go with two e-Commerce stores on two separate domains (.ie & .co.uk).
To all intents and purposes, they are the same website, selling the exact same products: same site structure, same CSS, same JS, etc.
The approach we took was to vary the product descriptions, use different URLs for the same product on each site, and use different image names and alt tags on each site ... create different topics around the product categories on the two websites, and lastly, for the on-site optimisation, we created two separate blogs with totally unique content (no crossover between the blogs).
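For anyone wanting to sanity-check that kind of rewrite, here is a rough sketch of the sort of spot check that could be run to confirm two product descriptions really are different enough. The URLs and the CSS selector are placeholders, not the client's actual sites.

```python
# Rough sketch: spot-check how similar a product description is across the
# two stores. The URLs and the CSS selector are hypothetical placeholders.
from difflib import SequenceMatcher

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

DESCRIPTION_SELECTOR = ".product-description"  # placeholder selector

def fetch_description(url):
    """Download a page and pull out the product description text."""
    html = requests.get(url, timeout=10).text
    node = BeautifulSoup(html, "html.parser").select_one(DESCRIPTION_SELECTOR)
    return node.get_text(" ", strip=True) if node else ""

def description_similarity(url_a, url_b):
    """Return a 0-1 ratio; anything close to 1.0 is near-duplicate copy."""
    return SequenceMatcher(
        None, fetch_description(url_a), fetch_description(url_b)
    ).ratio()

if __name__ == "__main__":
    ratio = description_similarity(
        "https://www.example.co.uk/product/widget",
        "https://www.example.ie/product/widget",
    )
    print(f"Description similarity: {ratio:.2f}")
```

Anything scoring close to 1.0 would be near-duplicate copy and worth rewriting again before it goes live.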
Off-page, we started two separate social campaigns along with two separate online marketing campaigns, supported by two separate AdWords campaigns.
Granted, there was more work involved than creating one website and targeting two countries, but with regard to the client's budget, it was relatively similar doing it the way we did.
So, the results: well, it's rare that it happens, but we were pretty happy with the results so far, 3 months down the line. Both sites are fully indexed and ranking well in the SERPs for their respective countries ...
The major key term we need both sites to rank for broke into the top 50 in the UK last week and is at 11 in Ireland (a really highly competitive term), which is to be expected regardless of how we ran the campaign, but the long-tail and less competitive terms are already ranking on the first page ...
We have been able to fully optimize each website for its respective country, from link building through to dynamic content.
So, does it work? Yes. Is it worth it? Well, that depends on your business and your client's situation. Is it more expensive? Well, that depends on how you look at your budget, or your client's.
But one thing I can say for sure: would we have achieved the success we have for the client so far had we run one website targeting two separate countries? Not on your Nelly, no way, not by 3 months into the campaign.
Just thought a bit of perspective from someone going through the situation live might help.
Regards
John
-
Hi Steve,
Egol is completely right. This approach is very expensive to maintain, and Google frowns upon duplicate content.
This approach is also very passé.
-
Ten years ago lots of people had lots of sites competing for the same keywords. The shortcut that they usually took was to use the same content on all of those sites. That approach is dead now... and the approach of using slightly modified text is dead too.
Google doesn't care if you have two websites in the SERPs for the same keywords as long as those two websites are absolutely unique and offer value to visitors. I have info sites and retail sites in the same SERPs. No problem -- at least none yet.