Offer from a channel partner to host a microsite - should I be doing this?
-
Hi All.
I work for a regional IT services business supplying various IT solutions. One of our channel distributors is running a campaign with a large global vendor (you would know them): they have built a solution microsite at https://vendorname-solution.com for themselves and then created a copy of that site for ten end partners, including my business.
They appear to have done this by copying the entire site to a subdomain for each of the ten partners, e.g. http://partner1.vendorname-solution.com, http://partner2.vendorname-solution.com, and so on.
If we agree to this approach, I am worried about the following, and about whether I should build out our own version instead:
1 - There is no HTTPS on offer, so will we be penalised by Google?
2 - We can't add any tracking to the subsite, as it isn't under our control.
3 - Will Google see all these subdomain copies as duplicate content and penalise me (and the other partners)?
4 - I am worried that anyone removing the subdomain from the URL will land on the distributor's microsite rather than ours. The only way I can see to prevent this is to embed the site in an iframe, which doesn't sound like a good idea to me.

I don't get the feeling the channel partner knows much about SEO, so I could do with some help assessing whether I should be concerned and politely turn down their offer to run this microsite for us.
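On concern 3, one way to gauge before deciding is to check how literal the copies actually are. A rough, stdlib-only Python sketch of a text-similarity check is below; it's illustrative only (a real check would fetch each subdomain's homepage and use a proper HTML parser rather than regexes):

```python
import difflib
import re

def page_text(html):
    """Strip tags and collapse whitespace to get a page's visible text (crude)."""
    return " ".join(re.sub(r"<[^>]+>", " ", html).split())

def similarity(html_a, html_b):
    """Rough text-level similarity between two pages, 0.0 (unrelated) to 1.0 (identical)."""
    return difflib.SequenceMatcher(None, page_text(html_a), page_text(html_b)).ratio()
```

Running this against the distributor's master site and a couple of the partner subdomains would show whether the copies are near-identical (a likely duplicate-content situation) or meaningfully customised.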
Thanks in advance for your comments
-
Hi Nigel
Thanks for your advice and confirming my concerns.
Regards
Gavin
-
Hi Gavsta
What they are offering is essentially a partner-branded copy of their campaign microsite, hosted within their own framework.
If you are happy for them to do this because you feel it will help generate more business for you then, by all means, go ahead. They must, however, noindex the entire micro-site so that it does not appear in Google. If they allow it to rank, it will completely undermine your SEO efforts and your rankings will plummet. Google does not handle duplicate content well: it would simply not know which version to rank, so it would end up ranking neither.
I suffered a huge Penguin penalty in 2012 for trying to rank micro-sites, which pre-2012 worked very well - don't fall into that trap. If they refuse to noindex it, don't allow them to set it up at all.
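If the distributor does agree to noindex the microsite, it's worth verifying rather than taking it on trust. Here is a minimal sketch of such a check (plain Python with regexes for illustration; a real audit would use a proper HTML parser, and note that the directive can arrive either in the page markup or in the X-Robots-Tag response header):

```python
import re

def audit_indexability(html, headers=None):
    """Report the indexability signals found in a page's HTML and response headers."""
    headers = headers or {}
    report = {"noindex": False, "canonical": None}
    # A meta robots tag such as <meta name="robots" content="noindex, nofollow">
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
                 html, re.I):
        report["noindex"] = True
    # The noindex directive can also be sent as an X-Robots-Tag HTTP header
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        report["noindex"] = True
    # rel=canonical declares which URL is the preferred version of the page
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)',
                      html, re.I)
    if match:
        report["canonical"] = match.group(1)
    return report
```

Fetching each partner subdomain and running its HTML and headers through a check like this would confirm whether the noindex is actually in place before the campaign goes live.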
Regards
Nigel