Is bad English detected by Google
-
Hi,
I am based in the UK and in a very competitive market - van leasing - and I am thinking about using an Indian SEO company for my ongoing SEO.
They have sent me some sample articles that they have written for link building, and the English is not good.
Do you think that Google can tell the difference between a well-written article and a poorly written article? Will the fact that the articles are poorly written mean we will lose potential value from the link?
Any input would be much appreciated.
Regards
John J
-
Thanks for the responses. I think I will stay away from the Indian SEO companies.
It really was for link building and not onsite stuff but it still does not seem like the best way forward.
Regards
John
-
Matt Cutts has stated in the past that poorly translating pages into another language (i.e. dumping out a raw translation) could get you devalued. He's talking primarily about duplicate content, but he seems to be hinting that poor grammar could also play a role in evaluations. At a minimum, it could affect your bounce rate, a known SEO factor.
Let's put aside the SEO angle for a second. I'm a customer who just found your site, written by your Indian firm. The grammar looks worse than my daughter's (she's in first grade) and is a chore to read, let alone understand. Am I going to stay and listen to/buy anything else on your site? Nope. I'll go to your competitor or I'll just give up. And you can forget any tertiary SEO benefit of my linking to your article, except to ridicule it. From a business standpoint it doesn't make sense. It's sloppy, and people hate sloppy (unless you're selling really good hamburgers, which you're not).
If you still don't think it's important, check out Engrish. I hope you don't wind up there!
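As a practical aside: readability formulas are crude proxies and certainly not Google's actual signals, but they can serve as a quick sanity gate on outsourced copy before it goes live. A minimal sketch in Python using only the standard library (the syllable counter is a rough vowel-group heuristic, not dictionary-accurate):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels; good enough for a rough score.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    # Flesch formula: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # Higher scores mean easier reading; well-edited web copy tends to land 60-80.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) \
                   - 84.6 * (syllables / len(words))

sample = "The van leasing is best price. Very good deal you take now."
print(round(flesch_reading_ease(sample), 1))  # → 87.9
```

Note the limitation: broken grammar like the sample above can still score as "easy to read", so a formula like this only flags turgid or over-complex prose. Spotting wrong idiom and bad grammar still takes a human editor, which is exactly the point of the thread.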
-
I agree with @kevinb. Google and Bing track signals like high user engagement, low bounce rates, etc.
If these articles aren't useful to users, Google will notice.
-
Awkward syntax and poor or incorrect use of idiom erect roadblocks to the flow of a narrative, degrading the user experience.
It's been my experience that when a writer attempts to replicate a particular cultural context that is not natural to him or her, the user will recognize its artificiality—even if only on a subconscious level. An analogy would be a motion picture with dubbed—rather than subtitled—dialog: There's something that's just off.
According to Google, user experience trumps everything, doesn't it? (See, I used an idiom right there!) So, for what it's worth, my advice would be to stay away.
-
Even if Google can't detect poor English now, it will be working towards it.
Surely your money is better spent elsewhere. Invest in the long term.
If the articles they are writing for you are low quality, you can bet the sites they are able to get them onto are low quality too.
Keep away from them and work on quality. Nothing is quick and easy and that's how it should be. If people could so easily buy their way to the top, the search results wouldn't be worth using.
-
Do yourself a favour and stay away from this outdated and damaging technique!
Create some amazing content on your own site/blog... examples could be how to reduce insurance costs when leasing a van, or the best vans to hire for home removals, etc.
Make your content the go-to source for that particular problem, then start contacting webmasters of similar (non-competitor) sites to share/link so their readers benefit!
The game has changed a lot from when you could buy 50 articles from Indian SEO firms for less than £20 and churn them out for links from low-quality sites!
-
Wesley and Jesse hit the nail on the head. Don't do it. Even if Google can't detect it directly, it can spot it indirectly through user experience signals.
Is price the only reason you are using this team?
-
I'm not sure if Google is able to tell the difference between good and bad English at this moment.
But I do know that well-written content is one of the criteria Google wants a website to meet in order to rank, as described in this document about Google Panda: http://googlewebmastercentral.blogspot.nl/2011/05/more-guidance-on-building-high-quality.html
This method is not permitted, though, and while you may benefit from it in the short term, I can tell you that it won't be long before you get a penalty for this technique. Link building is not about buying links in any form. It's about creating awesome content that people want to share just because they think it is awesome.
Of course, reaching out to people is also part of the process. But the key is always to create a site that people **want** to link to, either because it is awesome or because your website offers great value to their visitors and so makes their site better.
Always keep this in mind.
-
What Google definitely does recognize is the exact services you are considering. Google's webspam team developed Penguin specifically to target sites that have subbed out SEO to blackhat organizations. What you are describing is exactly what they are targeting.
Don't do it! You WILL be sorry.