Campaign landing pages
-
Hi
At our company we decided we wanted to reach a more global audience, so we bought a bank of domains for different countries, e.g. ".asia". Some are our company name; others are things like "barcelonaprivatejets.com".
We then put up single page websites for each of these domains, which link to our main .com site.
However, I don't know whether this is good or bad for our SEO. I've seen so many different things written, but I cannot find a definitive answer.
The text will be different on each page, but with each site being only a single page and the "design" being the same, will we get penalized in some way or another?
I've also added links to 2/3 of them in the footer of our main site but now I'm reading that this is bad too - so should I remove these?
If anyone has ideas on how we could make better use of these country-specific domains, I would welcome those suggestions too! I am not really an SEO person, I'm a web developer, so this is all completely new to me.
P.S. My name is Michael, not Andy.
-
if we just duplicated our homepage, would we not get penalized?
You would not receive a penalty as long as the SEO was handled properly.
If you have a page at mysite.com designed for the US and another at mysite.com/uk designed for England, then search engines should clearly understand that each page is designed for a particular country, even though both are in English. A few added points:
-
a ".com" site is not automatically treated as a US site. Google makes that determination based largely on where your site is hosted. You can also set your site's target country in Google Webmaster Tools to avoid any confusion.
-
there is a language meta tag whose value is something like "en-US" for the United States and "en-GB" for England. Setting that tag helps search engines understand your target audience.
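As an illustration, the tagging described above might look like this in the head of the UK page (a sketch only; the URLs are hypothetical, and the hreflang link elements are a related mechanism for declaring country/language alternates that the answer above doesn't mention by name):

```html
<!-- Language/region meta tag for the UK version of the page -->
<meta http-equiv="content-language" content="en-GB">

<!-- hreflang alternates: tell search engines which URL serves which audience -->
<link rel="alternate" hreflang="en-US" href="http://mysite.com/">
<link rel="alternate" hreflang="en-GB" href="http://mysite.com/uk/">
```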
-
be certain to localize your page. For example, the US spells "center" while the UK spells "centre". Each culture also has its own currency, system of measurement, and phrasing.
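To illustrate the localization point, here is a minimal sketch in Python, assuming one page template with per-country string substitutions (the region codes, strings, and `localize` helper are all hypothetical, just to show the idea):

```python
# Hypothetical per-country string tables; one page template, many locales.
LOCALIZATIONS = {
    "en-US": {"spelling": "center", "currency": "USD", "distance": "miles"},
    "en-GB": {"spelling": "centre", "currency": "GBP", "distance": "miles"},
    "es-MX": {"spelling": "centro", "currency": "MXN", "distance": "kilómetros"},
}

def localize(region, key):
    """Look up a localized string, falling back to en-US for unknown regions."""
    return LOCALIZATIONS.get(region, LOCALIZATIONS["en-US"])[key]

print(localize("en-GB", "spelling"))  # centre
```

The fallback matters in practice: a visitor from an untargeted region still gets a coherent page rather than an error.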
if our host was able to simply put the pages on different servers, would that be sufficient?
No. A different server with the same hosting company would not be sufficient. You need a different C-block (the third octet of the IP address), and a host will usually keep the same C-block across all of its servers.
having a page targeted at say, Mexico, loading from a UK server may not be great for page load times.
True, which is where cloud hosting is very helpful. The prices are fairly reasonable and it is an option you may want to explore.
-
-
Hi Michael, Ryan has covered many of the relevant points, but there are a couple of things I wanted to pick up on.
You mention having different text for each site or region but the same design, and wonder if you will get penalised. You won't get penalised for a duplicate design, but if you had duplicate content you might find that the duplicate pages do not rank as you would expect, or are not indexed at all.
Many companies use the same design on different regional sites to maintain consistent branding. If the sites sell different products or services, e.g. one sells cars and the other cheese, then it may be appropriate to vary the design to fit each product better.
Hope this helps.
-
Hi Ryan
That's some really great stuff, thank you. However, if we just duplicated our homepage, would we not get penalized? Some of the countries we'll be targeting are also English-speaking, and all the site content (news items etc.) is written in English, so I'm not sure how we would handle that.
Or could we just have a few headers/descriptions in the target language, with the rest in English? We're also an English-speaking-only company, so we want to make that clear too, so we don't get any awkward emails/calls.
Additionally, thanks for the pointer regarding the host. However, if our host were able to simply put the pages on different servers, would that be sufficient? Or should each page absolutely be with a different hosting company? I'm also thinking that a page targeted at, say, Mexico loading from a UK server may not be great for page load times.
Thanks again for a fantastic response.
-
Hi Michael.
The practice of purchasing a number of domains and then pointing them back to your main site is much more difficult to pull off than most people realize. In most cases, companies spend their time and resources but receive minimal or no benefit.
The best practice would be to develop your main website with landing pages for each country. If your main site is privatejets.com, you can have a privatejets.com/es/ page for Spain, privatejets.com/uk/ for England, and so forth. Each landing page should be targeted by country, not language. For example, both Spain and Mexico speak Spanish, but the dialects differ, along with their currencies and cultures.
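The country-folder structure described above can also be declared at the sitemap level. Here is a hedged sketch using the hypothetical privatejets.com URLs from the example (hreflang annotations in XML sitemaps are a Google-supported mechanism, though not one the answer itself names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://privatejets.com/uk/</loc>
    <!-- Declare every country/language alternate for this page cluster -->
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://privatejets.com/uk/"/>
    <xhtml:link rel="alternate" hreflang="es-ES" href="http://privatejets.com/es/"/>
    <xhtml:link rel="alternate" hreflang="es-MX" href="http://privatejets.com/mx/"/>
  </url>
</urlset>
```

Each alternate URL would carry the same set of annotations, so the declarations are reciprocal.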
In short, if you want to use your existing domains, you need to develop each of them into a solid landing page and then provide one link back to your main site. You also need to work on these pages so they rank well in search engines, and finally they need to be hosted with a different hosting company than your main site. Links between sites on the same C-block are not valued.