Can Google detect bad English?
-
Hi,
I am based in the UK and in a very competitive market - van leasing - and I am thinking about using an Indian SEO company for my ongoing SEO.
They have sent me some sample articles that they have written for link building, and the English is not good.
Do you think that Google can tell the difference between a well-written article and a poorly written article? Will the fact that the articles are poorly written mean we lose potential value from the links?
Any input would be much appreciated.
Regards
John J
-
Thanks for the responses. I think I will stay away from the Indian SEO companies.
It really was for link building and not onsite stuff but it still does not seem like the best way forward.
Regards
John
-
Matt Cutts has stated in the past that poorly translating pages into another language (i.e. dumping out a raw translation) could get you devalued. Now, he's talking primarily about duplicate content but it seems that he's hinting that poor grammar could also play a role in evaluations. At the bare minimum, it could affect your bounce rate, a known SEO factor.
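Just to illustrate how cheap this kind of signal is to compute, here's a toy Python sketch (my own example, not anything Google actually runs) that flags text containing out-of-vocabulary words. A real system would use a full lexicon and a language model; this tiny hard-coded vocabulary is purely for illustration.

```python
import re

# Tiny illustrative vocabulary -- a real spelling/grammar signal would use
# a full lexicon plus a language model; this is only a sketch of the idea.
VOCAB = {
    "the", "articles", "are", "poorly", "written", "well", "and",
    "google", "can", "tell", "difference", "between", "a", "an",
}

def oov_rate(text: str) -> float:
    """Fraction of words not found in the vocabulary (0.0 = all known)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    unknown = sum(1 for w in words if w not in VOCAB)
    return unknown / len(words)

print(oov_rate("The articles are poorly writtem"))  # misspelling raises the rate
print(oov_rate("The articles are poorly written"))  # clean text scores lower
```

If a few lines of code can catch an obvious typo, it's a safe bet that a search engine with vastly better models can score text quality far more subtly than this.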
Let's put aside the SEO role for a second. I'm a customer who just found your site, written by your Indian firm. The grammar looks worse than my daughter's (she's in first grade) and is a chore to read, let alone understand. Am I going to stay and listen to/buy anything else on your site? Nope. I'll go to your competitor or I'll just give up. And you can forget any tertiary SEO benefit of my linking your article except to ridicule it. From a business standpoint it doesn't make sense. It's sloppy and people hate sloppy (unless you're selling really good hamburgers, which you're not).
If you still don't think it's important, check out Engrish. I hope you don't wind up there!
-
I agree w/ @kevinb. Google & Bing track signals like high user engagement, low bounce rates, etc. Check out the infographic below.
If these articles aren't useful to users, Google will notice.
-
Awkward syntax and poor or incorrect use of idiom erect roadblocks to the flow of a narrative, degrading the user experience.
It's been my experience that when a writer attempts to replicate a particular cultural context that is not natural to him or her, the user will recognize its artificiality—even if only on a subconscious level. An analogy would be a motion picture with dubbed—rather than subtitled—dialog: There's something that's just off.
According to Google, user experience trumps all, doesn't it? (See, I used an idiom right there!) So, for what it's worth, my advice would be to stay away.
-
Even if Google can't detect poor English now, it will be working towards it.
Surely your money is better spent elsewhere. Invest in the long term.
If the articles they are writing for you are low quality, you can bet the sites they are able to get them onto are low quality too.
Keep away from them and work on quality. Nothing is quick and easy and that's how it should be. If people could so easily buy their way to the top, the search results wouldn't be worth using.
-
Do yourself a favour and stay away from this outdated and damaging technique!
Create some amazing content on your own site/blog. Examples could be how to reduce insurance costs when leasing a van, or the best vans to hire for home removals, etc.
Make your content the go-to source for that particular problem, then start contacting webmasters of similar (non-competitor) sites to share/link so their readers benefit!
The game has changed a lot from when you could buy 50 articles from Indian SEO firms for less than £20 and churn them out for links from low-quality sites!
-
Wesley & Jesse hit the nail on the head. Don't do it. Even if Google can't detect it directly, it can spot it indirectly by means of user experience signals.
Is price the only reason you are using this team?
-
I'm not sure if Google is able to tell the difference between good and bad English at this moment.
But I do know that this is one of the criteria Google wants a website to meet in order to rank, as described in this document about Google Panda: http://googlewebmastercentral.blogspot.nl/2011/05/more-guidance-on-building-high-quality.html
This method is not permitted, though. You may benefit from it in the short term, but I can tell you that it won't be long before you get a penalty for this technique. Link building is not about buying links in any form. It's about creating awesome content that people want to share just because they think it is awesome.
Of course, reaching out to people is also part of the process. But the key is always to create a site that people **want** to link to, either because it is awesome or because your site offers great value to their visitors, so linking to it makes their website better.
Always keep this in mind.
-
What Google definitely does recognize is the exact services you are considering. Google's webspam team developed Penguin specifically to target sites that have subbed out SEO to blackhat organizations. What you are describing is exactly what they are targeting.
Don't do it! You WILL be sorry.