What can we do to increase organic traffic for both sites?
-
We have two English-language bookselling websites, one targeting a US audience and one a UK audience. There are some content variations, but 60-70% of the content is the same across both sites.
Both sites are hosted in the USA. One runs on brandname.co.uk (for the United Kingdom) and one on brandname.com (for the United States).
The precautions we have already taken to avoid being flagged for duplicate content are:
- Used the rel="alternate" (hreflang) element on all product detail pages
- Used different currencies on the two sites (GBP and USD)
- Set geotargeting for the UK and US respectively through Webmaster Tools
- Launched backlink campaigns to gain links from local sites and directories respectively
- Tried to differentiate the products by using different product-specific terms, e.g.:
  - "Publishing Year" vs. "Published In"
  - "Author" vs. "Written By"
  - "Format" vs. "Binding Type"
  - "Add to Cart" vs. "Add to Basket", and so on
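For reference, a typical reciprocal hreflang setup for one product page across the two domains might look like this (the "brandname" domains and the URL path are placeholders):

```html
<!-- On the US page, e.g. https://www.brandname.com/book/example-title -->
<link rel="alternate" hreflang="en-us" href="https://www.brandname.com/book/example-title" />
<link rel="alternate" hreflang="en-gb" href="https://www.brandname.co.uk/book/example-title" />
<link rel="alternate" hreflang="x-default" href="https://www.brandname.com/book/example-title" />

<!-- The .co.uk version of the same page must carry the identical set of tags,
     so each URL references both itself and its alternate. -->
```

Note that the annotations must be reciprocal: if the .co.uk page doesn't point back at the .com page, Google ignores the hreflang pair.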
What remains the same:
1. Product detail page title
2. Product detail page meta description
3. Product name, i.e. the title of the book, which cannot be changed anyway
4. The "About the Book" text that accompanies each book and is identical across every bookseller's site. This is impossible to differentiate, as the database contains more than 3 million books.
brandname.com, the older of the two (2 years), gets around 2,500-3,000 daily visits, while brandname.co.uk, which is 5 months old, gets 150-200 daily visitors. Both sites have around 4 million pages.
We'd appreciate suggestions on what we are doing wrong and what else can be done to boost organic traffic.
-
Thanks, Irving, for understanding it. Here are the links for both sites:
-
- .co.uk sites still have a better chance of ranking in Google UK.
- He's doing all the right things to send the right signals to Google.
- Google doesn't care where the UK site is hosted.
- He's geotargeting, so even if it's duplicate content, it's being filtered properly.
- There's no real problem with how this is implemented if he has no problem maintaining two sites.
What can be done to boost traffic is a different question: better on-page SEO, for a start. If we had the URL, we could offer suggestions there.
-
Thanks for the answer, but it is really difficult to undo what has been done. We are trying to reach as many countries as we can with multiple TLDs. We have launched two, and next month we are going to launch .de and .jp sites with content translated into the local languages.
The idea behind having .co.uk, .com, .jp, and .de was purely brand management, currency localization, the language barrier, some operational convenience, accounting, etc.
Is there anything that can be done with the existing strategy to boost traffic?
-
I totally agree with you, although people don't like to receive answers that might break their strategy. .co.uk is pointless.
-
Firstly, I have to ask why you are using both the .co.uk and .com domains? You really only need the .com, as you should be able to rank in the UK relatively easily with it. Of course, it is nice to have the .co.uk for brand/reputation management, but it isn't necessary in order to rank in the UK. Also, when you say the site is hosted in the US, I assume your hosting for the .co.uk domain is also in the US? If that's the case, it is somewhat pointless having the .co.uk domain in the first place.
My advice would be to consolidate your sites into one, using the .com domain as the primary, since it is older and has more authority. After that, you will need to make sure it contains both US- and UK-specific content (prices in £/$, shipping, etc.). Then you will need to change your link building to incorporate a percentage of UK-based links.
This way, you should be able to rank well in both the US and UK without any duplicate content worries.
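If you do consolidate, the .co.uk pages should 301-redirect to their .com equivalents so existing link equity is preserved rather than lost. A minimal sketch using Apache mod_rewrite, assuming the URL paths are identical on both domains (adjust the hostname pattern and target domain to your setup):

```apache
# In the .co.uk site's server config or .htaccess:
RewriteEngine On
# Match brandname.co.uk with or without the www prefix
RewriteCond %{HTTP_HOST} ^(www\.)?brandname\.co\.uk$ [NC]
# Permanently redirect every path to the same path on the .com domain
RewriteRule ^(.*)$ https://www.brandname.com/$1 [R=301,L]
```

If any paths differ between the two sites, those URLs would need individual redirect rules mapping old paths to new ones.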
Hope that helps,
Adam.