Is there a danger in linking to and from one website too many times?
-
Basically, my web developer has suggested that instead of using a subfolder to create English and Korean versions of the site, I should create two different websites and then link them together to provide each page in English or in Korean, whichever the case may be.
My immediate reaction is that search engines may perceive this kind of linking as manipulative; as you can imagine, there will be a lot of links (one for every page).
Do you think it is OK to create two websites and link them together page by page?
Or do you think the site will get penalized by search engines for link farming or link exchanging?
Regards,
Tom
-
Unfortunately the site is not WordPress, so I don't think I'll be able to use WPMU DEV for my site.
Thanks
-
Hi Thomas,
I would ask the question from another angle:
Would a typical user who searched in English be interested in the same page in Korean? Would that page bring extra value for them? Do they really need it?
Would a typical user who searched in Korean be interested in the same page in English? Would that page bring extra value for them? Do they really need it?
If yes, then you should certainly do it: link each page to its counterpart.
There are some cases in which this is useful:
- sometimes people need or want to read official documents in other languages to be sure they were informed correctly
- for translation purposes, when a person must verify the text against the original
- when people want to learn the other language
- and there are other cases involving legal documents and so on...
If the answer to the above questions is no, then it's probably not a good idea to link every page. Maybe just use a general link in the footer/header, like typical websites do.
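As a side note: if you do link the versions together page by page, you can also declare the relationship explicitly with hreflang annotations, so search engines treat the pages as translations rather than an ordinary link exchange. A minimal sketch, assuming hypothetical English and Korean URLs for the same page:

```html
<!-- In the <head> of BOTH language versions (URLs are hypothetical examples): -->
<link rel="alternate" hreflang="en" href="https://www.example.com/about/" />
<link rel="alternate" hreflang="ko" href="https://www.example.co.kr/about/" />
```

Each page lists itself and its counterpart, and both pages carry the same pair of annotations.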
-
-
Hi Thomas,
If the reason for the second version of the site is purely for the users, then you can nofollow all of these page-to-page links and just leave one followed link on a few pages of each site pointing to the other language's home page. If you nofollow the links, I can't imagine that you will run into any trouble, and by having two different domains you may have the opportunity to perform well in South Korea specifically, with a www.yoursite.kr domain for example.
You're not going to be penalized for link selling/buying, especially if you nofollow most of them, because there is a clear and logical reason for those links to be there.
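To illustrate (with a made-up domain), the only difference between the two kinds of link is the rel attribute:

```html
<!-- page-to-page language link, nofollowed: -->
<a href="https://www.example.co.kr/products/" rel="nofollow">한국어</a>

<!-- the one followed link, pointing at the other language's home page: -->
<a href="https://www.example.co.kr/">Korean site</a>
```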
Hope this helps,
Chris
-
Hi Tom,
There is so much more to this discussion, including local-language SEO in Korean. But here's my contribution:
1. You won't get penalised if you use rel=canonical extensively.
2. Your developer is almost certainly wrong (though without knowing more, I wouldn't want to judge).
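For what it's worth, a canonical tag is just one line in the page's head (the URL here is a hypothetical example):

```html
<!-- tells search engines which URL is the preferred version of this page: -->
<link rel="canonical" href="https://www.example.com/about/" />
```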
-
Why do you have to create two different sites? You could use a plugin instead; WPMU DEV, for example, offers a premium plugin to translate your pages.
As far as I know, Google may treat it as duplicate content.
Please don't take my word as definitive; I'm just sharing my thoughts and hoping for a more accurate reply.
But I would never create two sites with the same theme in different languages!
Related Questions
-
Help finding website content scraping
Hi, I need a tool to help me review sites that are plagiarising / directly copying content from my site. The tools I'm aware of, such as Copyscape, appear to work with individual URLs rather than a root domain. That's great if you have a particular post or page you want to check, but in this case some sites are scraping thousands of product pages, so I need to submit the root domain rather than an individual URL. In some cases, other sites are being listed in SERPs above, or even instead of, our site for product search terms. So far I have stumbled across this rather than proactively researched offending sites. So I want to enter my root domain and have the tool review all my internal site pages, then provide information on other domains where an individual page contains a certain amount of duplicated copy. Much as Moz crawls the site for internal duplicate pages, I need a list of duplicate content by external domain & URL, so that I can contact the offending sites to request they remove the content, and send it to Google as evidence if they don't. Any help would be gratefully appreciated. Terry
White Hat / Black Hat SEO | MFCommunications
-
Suggest the best link building plan for a small static website.
Hi Everyone, Can anyone suggest a clear plan for an off-page link building schedule? Ours is a 24-page website, and we'd like to plan off-page activity like bookmarking, classifieds, directories, and so on. How many links should we post, and with how much time between them? For example: 15 bookmarking links, 10 classified links, one article submission weekly, and after one week the same cycle repeats...
White Hat / Black Hat SEO | dineshmap
-
Should I Disavow Links if there is No Manual Action
Hello, I just recently took on a client that had hired a very black hat SEO and used their service for roughly two years. He outsourced link building, and the link profile is full of spun articles and blog comments on Chinese websites, etc. The anchor texts/pages used for all this spamming no longer rank, but there is no penalty under Webmaster Tools' manual actions. I was thinking about disavowing some of the obviously spammy backlinks, but would that raise a red flag that could lead to a manual action and even more negative movement? Have you ever heard of anything like the situation I'm dealing with, where it's obvious the pages have been hit but there is no manual action? What do you all think/suggest? Should I disavow some terrible links and potentially open a can of worms?
White Hat / Black Hat SEO | Prime85
-
What is your SEO agency doing in terms of link building for clients?
What are you or your SEO agency doing for your clients' link building efforts? What are you (or the agency) doing yourselves, outsourcing, or having the client do for link building? If a new client needs some serious link building done, what do you prescribe and implement straight off the bat? What are your go-to link building tactics for clients? What are the link building challenges faced by your agency in 2013/2014? What's working for your agency and what's not? Does your agency work closely with the client's marketing department to gain link traction? If so, what are you collaborating on? What else might you be willing to share about your agency's link building practices? Thanks
White Hat / Black Hat SEO | Martin_S
-
Will Google perceive these as paid links? Thoughts?
Here's the challenge. I am doing some SEO triage work for a site which offers a legitimate business-for-sale listing service and which has a number of FOLLOWED link placements on news / newspaper sites - like this: http://www.spencercountyjournal.com/business-for-sale. (The "Business Broker" links & business search box are theirs.) The site has already been penalized heavily by Google, and just got pushed down again on May 8th, significantly (from what we see so far). Here's the question - is this the type of link that Google would perceive as paid / passing PageRank, since it's followed vs. nofollowed? What would you advise if it were your site / client? From everything I've read, these backlinks, although perfectly legit, would likely be classified as paid / passing PageRank. But please tell me if I'm missing something. My advice has been to request that these links be nofollowed, but I am getting pretty strong resistance / lack of belief that these links in their current state (followed) could be harming them in any way. Would appreciate the input of the Moz community - if they won't believe me, and the majority here agrees about nofollowing, maybe they'll believe you. Thanks! BMT
White Hat / Black Hat SEO | CliXelerate
-
Do sitewide links from other sites hurt SEO?
A friend of mine has a PageRank 3 website that links to all the pages on my site from every page of his site. The anchor text of each link is the title of the page it links to. Does this hurt SEO? I can have him change the links to whatever I want, so if it does hurt, what should I change the anchor text to? Thanks mozzers! Ron
White Hat / Black Hat SEO | Ron10
-
Best way to build links?
I want to build high-priority links and some high-PR ones. What tool should I use? I was thinking of using ScrapeBox. Any insights? I already have two high ones, from YouTube and Google +1.
White Hat / Black Hat SEO | Radomski
-
Why is Google not punishing paid links as it says it will?
I've recently started working with a travel company and am finding the general link building side of the business quite difficult. I had a call from an SEO firm the other day offering their services, stating that they had worked with a competitor of ours and delivered some very good results. I checked the competitor's rankings, PR, and link profile, and indeed the results were quite impressive. However, the link profile pointed to one thing that was incredibly obvious: they had purchased a large number of sidebar text links from powerful blogs in the travel sector. It's painfully obvious what has happened, yet they still rank very highly for a lot of key terms. Why doesn't Google do something about this? They aren't the only company in this sector doing it, and it just seems pointless for white hats trying to do things properly when those with the dollar in their pockets simply buy success in the SERPs. Thanks
White Hat / Black Hat SEO | neilpage123