As an agency, what is the best way to handle being the webmaster and hosting provider for several sites (some of which are in the same industry and have natural links to each other)?
-
We are an agency that builds and hosts websites for several companies, some of which happen to be in the same industry and therefore naturally link to each other (we do not dictate those links). With regard to handling their domain registrations, Webmaster Tools accounts, Google Analytics accounts, and servers, what is the best practice to avoid Google thinking these companies are affiliated? Even though they aren't affiliated, we are afraid that our being the "webmaster" of these sites, and hosting them on shared servers, may be affecting them.
-
Grayloon, were these responses enough of an answer for you, or are you still looking for more information?
-
I think there may actually be two issues here: one with regard to the sites all linking, and one with regard to GA accounts. After Keri's answer (thanks, Keri), I did a little research into best practice. (Somewhere along the way I had learned that you set everything up under one master account and move forward from there.) It appears I was wrong.
Almost everyone in SEO, SEM, etc. that I could find with a blog or other venue stated that the best practice is to set each client up with their own Analytics account, since that allows portability. With regard to AdWords accounts, ownership should be settled in advance in the contract: state whether the marketing company or the customer will possess the AdWords account on termination of services. For us, if the client is to retain the account, I require a credit card on file that is used to pay all invoices as they occur.
Hope this helps.
-
There are sites out there that will show you which sites share the same Google Analytics code; http://spyonweb.com/ is one. Google knows, trust me: I've had them shut off AdWords for multiple clients when one client refused about $36 in charges (five years ago, no MCC, but all in the same GWT account).
I don't have an educated opinion on the ranking impact this may or may not cause, and will defer to others on that.
I personally create a separate Google Analytics account for each client, with only that client's profiles in the account. It's less about Google knowing all, and more about being able to hand off the GA account to another business and give them admin access without handing over the rest of my clients.
-
We are an SEO, SEM, web-dev, etc. firm. We handle several verticals, with clients who do not directly compete; frankly, we don't like having clients who do (unless they are in a different region or highly niched). On the web-dev side we are the webmaster, host, and so on. Typically, we won't take on a client unless we are the ones handling the hosting, as we have been blocked or stalled by "Danny the Developer" too many times, and it is not worth the pain. (That is why we got into dev.)
Most of our clients have multiple sites, and we do cross-link carefully where it is logical. By virtue of this, we actually have sites hosted with various hosts nationwide (GoDaddy, BlueHost, Network Solutions, etc.). We develop a matrix of the sites and groups that will link, and then ensure we are not pulling the IPs or C-blocks in a way that shows all sites on the same one. Furthermore, we use the matrix so that if there are, say, 4 groups with 10 sites each, Group A can only link to Group B or C, Group B only to C or D, D can link to A, and so on. I am not endeavoring to fill in the entire picture, but I believe you can figure it out.
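To make that C-block check concrete, here is a minimal sketch in Python (the domain names are hypothetical placeholders, and treating a /24 network as the "C-block" is an assumption; this is not the poster's actual tooling):

```python
import socket
import ipaddress
from itertools import combinations

# Hypothetical example domains; substitute your own client sites.
sites = ["site-a.example.com", "site-b.example.com", "site-c.example.com"]

def c_block(hostname):
    """Resolve a hostname to an IPv4 address and return its /24 network
    (the classic 'C-block')."""
    ip = ipaddress.ip_address(socket.gethostbyname(hostname))
    return ipaddress.ip_network(f"{ip}/24", strict=False)

# Flag any pair of sites that would be cross-linking from within the same C-block.
for a, b in combinations(sites, 2):
    if c_block(a) == c_block(b):
        print(f"WARNING: {a} and {b} share a C-block -- avoid cross-linking them")
```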
Early on, we were link monkeys, going everywhere and connecting everything, and we could not figure out what the issues were. When we learned from some Mozzers about the IP and C-block issues, it changed our clients' outcomes. I suggest that anyone who is "cross-pollinating" multiple sources set up their sites in a fashion that keeps the IPs and C-blocks from showing up as all the same.
-
I have a lot of sites on only a few IP addresses, and I have them all in the same Bing and Google accounts, as I host and build websites. I also have a lot of linking between them. I recently had a site drop in rankings, but I doubt it is because of this, as other sites that have not dropped have more same-IP links. DiscountASP, one of the biggest hosts, has almost all of its sites on one IP address. I think the concern is overblown. I would just make sure that same-IP links do not make up the majority of your links. Bing and Google understand that people are in my position and yours, where you are the webmaster for many sites and host them on a few IP addresses; it is natural. What is not natural is a link farm.
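If you want to put a number on "majority", a minimal sketch along these lines would do it (the backlink URLs and the site hostname are hypothetical placeholders you would swap for your own exported link data):

```python
import socket
from urllib.parse import urlparse

# Hypothetical backlink URLs exported from your link data.
backlinks = [
    "http://blog.example.com/post-1",
    "http://partner.example.net/resources",
    "http://news.example.org/story",
]
my_ip = socket.gethostbyname("www.example.com")  # your own site's IP

# Count the links whose host resolves to the same IP as the target site.
same_ip = sum(
    1 for url in backlinks
    if socket.gethostbyname(urlparse(url).hostname) == my_ip
)
share = same_ip / len(backlinks)
print(f"{share:.0%} of sampled links resolve to the same IP as the site")
```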
-
I'm curious to see what answers you get, as I'm somewhat in the same boat. I do have multiple clients hosted with the same hosts, but there's not much crossover in industries. I do, however, always set up completely individual GWT, Analytics, etc. accounts, just because of a hunch that Google may consider two sites sharing an account somewhat connected. Whether anyone has any proof is another thing.
Related Questions
-
Site address change: new site isn't showing up in Google, old site is gone.
We just transitioned mccacompanies.com to confluentstrategies.com. The problem is that when I search for the old name, the old website doesn't come up anymore to redirect people to the new site. On the local card, Google has even taken the website off altogether. (I'm currently still trying to gain access to manage the business listing.) When I search for "confluent strategies", the website doesn't come up at all, but if I use the site: operator, it is in the index. Basically, my client has effectively disappeared off the face of Google. (In doing other name changes, this has never happened to me before.) What can I do?
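One quick diagnostic is to confirm the old URLs actually return 301s pointing at the new domain. A minimal sketch using the third-party requests library (the second URL is a hypothetical placeholder):

```python
import requests  # third-party: pip install requests

old_urls = [
    "http://mccacompanies.com/",
    "http://mccacompanies.com/about/",  # hypothetical old page
]

for url in old_urls:
    r = requests.head(url, allow_redirects=False, timeout=10)
    # A clean move should answer 301 with a Location on confluentstrategies.com.
    print(url, r.status_code, r.headers.get("Location"))
```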
Technical SEO | MichaelGregory
-
Maintaining link value during site downtime
We are nearly finished rebuilding a client website, but they want to have a "dark launch" period for 4 days prior to the public site launch. During that 4-day period, we will be converting their server, so they want to take down the old site and instead send users a "coming soon" message. Although we have the old site pages set up to 301 for the public launch, I'm concerned that this dark period is going to hurt the link value on the old site pages. During this 4-day period, should we be setting a 503 status code on the old site that automatically serves the "coming soon" message? Or, should all old site pages be temporarily redirected to the "coming soon" landing page? Any other recommendations are appreciated as well.
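A 503 with a Retry-After header is the standard way to signal a temporary outage to crawlers. A minimal sketch of serving it (Flask is used here purely as a stand-in for whatever the real server stack is):

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def coming_soon(path):
    # Serve the "coming soon" message on every URL with a 503,
    # so crawlers treat the downtime as temporary and keep the old URLs indexed.
    resp = make_response("<h1>Coming soon</h1>", 503)
    resp.headers["Retry-After"] = str(4 * 24 * 3600)  # seconds until relaunch
    return resp
```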
Technical SEO | AHartman
-
What's the best way to pass link juice to a page on another domain?
I'm working with a non-profit, and their donation form software forces them to host their donation pages on a different domain. I want to attempt to get their donation page to appear in their sitelinks in Google (under the main website's entry), but it seems like the organization's donation forms are at a disadvantage because they're not actually hosted on that site. I know that no matter what I do, there's no way to "force" a sitelink to appear the way I want it, but I was trying to think of a way to work around this. Do you think 1) creating a URL like orgname.org/donate and having that be a 301 redirect to the donation form, and 2) using the /donate redirect all over the site (instead of linking directly to the form) would help? Are there alternatives other folks recommend?
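Wiring up the /donate idea itself is simple. A minimal sketch (Flask as a stand-in for the site's real stack; the donation-form URL is a hypothetical placeholder):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical external donation-form URL.
DONATION_FORM = "https://forms.example-donations.com/orgname"

@app.route("/donate")
def donate():
    # Permanent redirect so any equity earned by /donate is passed along.
    return redirect(DONATION_FORM, code=301)
```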
Technical SEO | clefevre
-
Unnatural links from your site
Hi, on 24 February I got this penalty message in Google Webmaster Tools: "Google detected a pattern of unnatural, artificial, deceptive, or manipulative outbound links on pages on this site. This may be the result of selling links that pass PageRank or participating in link schemes." I have already removed all the links on the blog and sent a reconsideration request to Google's spam team, but the request was rejected. Please help me with this, or share a link to a similar case. Thanks,
Technical SEO | KLLC
-
How can I best handle parameters?
Thank you for your help in advance! I've read a ton of posts on this forum on this subject, and while they've been super helpful, I still don't feel entirely confident about the right approach to take. Forgive my very obvious noob questions - I'm still learning!

The problem: I am launching a site (coursereport.com) which will feature a directory of schools. The URL for the schools directory will be coursereport.com/schools. The directory can be filtered by a handful of fields: Focus (ex: "Data Science"), Cost (ex: "$<5000"), City (ex: "Chicago"), State/Province (ex: "Illinois"), and Country (ex: "Canada"). When a filter is applied to the directory page, the CMS produces a new page with URLs like these: coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago or coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork.

My questions: 1) Is the above parameter-based approach appropriate? I've seen other directory sites take a different approach that would transform my examples into more "normal" URLs, e.g. coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all). 2) Assuming I use either approach, isn't it likely that I will have duplicate-content issues? Each filter does change on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and call those the canonical versions within any filtered page that contained additional parameters like cost or city? Should I be changing page titles for the unique filtered URLs? 3) I read through a few Google resources to try to better understand how to configure URL params via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about rel=canonical? https://support.google.com/webmasters/answer/1235687

An assortment of the other stuff I've read, for reference: http://www.wordtracker.com/academy/seo-clean-urls http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/ http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
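One way to make a canonical rule concrete is a helper that strips a filtered URL down to the parameters treated as canonical. A minimal sketch, assuming "focus" is the only parameter that defines the preferred version (that choice is an assumption, not a recommendation):

```python
from urllib.parse import urlencode, urlparse, parse_qs

CANONICAL_PARAMS = ("focus",)  # assumption: only "focus" defines the canonical page

def canonical_url(url):
    """Rebuild a filtered directory URL, keeping only the canonical parameters."""
    parts = urlparse(url)
    params = parse_qs(parts.query)
    kept = {k: v[0] for k, v in sorted(params.items()) if k in CANONICAL_PARAMS}
    query = urlencode(kept)
    return f"{parts.scheme}://{parts.netloc}{parts.path}" + (f"?{query}" if query else "")

# Query string simplified from the poster's example.
print(canonical_url("http://coursereport.com/schools?focus=datascience&cost=5000&city=chicago"))
# -> http://coursereport.com/schools?focus=datascience
```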
Technical SEO | alovallo
-
Best way to present a single image from a gallery?
Hi. I want to make a page for each image in my client's gallery, so that each image page will be indexed and rank. (It's a tattoo portal, and there's a lot of traffic on specific tattoos.) But as the setup stands now, the only way each image page will differ from the others is the H1 title. Can you give me some examples of galleries that are "spot on" SEO-wise that I could draw ideas from?
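Whichever gallery you model it on, the usual fix is to give each image page more unique signals than the H1 alone: title, meta description, and alt text generated from per-image data. A minimal sketch (all field names and values are hypothetical):

```python
# Hypothetical per-image records from the client's gallery database.
images = [
    {"slug": "koi-sleeve", "style": "Japanese", "artist": "Jane Doe", "part": "sleeve"},
    {"slug": "rose-forearm", "style": "traditional", "artist": "John Roe", "part": "forearm"},
]

for img in images:
    # Build a unique title, meta description, and alt text per image page.
    title = f"{img['style'].title()} {img['part']} tattoo by {img['artist']}"
    meta = (f"View this {img['style']} {img['part']} tattoo by {img['artist']}, "
            f"plus similar designs in our gallery.")
    alt = f"{img['style']} {img['part']} tattoo"
    print(img["slug"], "|", title, "|", meta, "|", alt)
```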
Technical SEO | MichaelRoscoe
-
Best way to fix a whole bunch of 500 server errors that Google has indexed?
I got a notification from Google Webmaster Tools saying that they've found a whole bunch of server errors. It looks like it is because an earlier version of the site I'm doing some work for had those URLs, but the new site does not. In any case, there are now thousands of these pages in their index that error out. If I wanted to simply remove them all from the index, which is my best option?

1. Disallow all 1,000 or so pages in robots.txt?
2. Put the meta noindex in the headers of each of those pages?
3. Rel canonical to a relevant page?
4. Redirect to a relevant page?
5. Wait for Google to just figure it out and remove them naturally?
6. Submit each URL to the GWT removal tool?
7. Something else?

Thanks a lot for the help...
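Whichever option you choose, it helps to first confirm which of the indexed URLs still error out. A minimal sketch using the requests library (the URL list is a hypothetical placeholder for the export from Webmaster Tools):

```python
import requests  # third-party: pip install requests

# Hypothetical list exported from the Webmaster Tools crawl-errors report.
urls = [
    "http://example.com/old-page-1",
    "http://example.com/old-page-2",
]

still_erroring = []
for url in urls:
    try:
        code = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        code = None
    # Treat unreachable URLs and 5xx responses as still erroring.
    if code is None or code >= 500:
        still_erroring.append(url)

print(f"{len(still_erroring)} of {len(urls)} URLs still return server errors")
```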
Technical SEO | jim_shook
-
Are lots of links from an external site to non-existent pages on my site harmful?
Google Webmaster Tools is reporting a heck of a lot of 404s, which are due to an external site linking incorrectly to my site. The site itself has scraped content from elsewhere and has created hundreds of malformed URLs. Since it is unlikely I will have any joy getting these links removed by the creator of the site, I'd like to know how much damage this could be doing, and whether there is anything I can do to minimise the impact. Thanks!
Technical SEO | Nobody1556905035114