Is bad English detected by Google?
-
Hi,
I am based in the UK and in a very competitive market - van leasing - and I am thinking about using an Indian SEO company for my ongoing SEO.
They have sent me some sample articles that they have written for link building, and the English is not good.
Do you think that Google can tell the difference between a well-written article and a poorly written article? Will the fact that the articles are poorly written mean we will lose potential value from the links?
Any input would be much appreciated.
Regards
John J
-
Thanks for the responses. I think I will stay away from the Indian SEO companies.
It really was for link building and not onsite stuff but it still does not seem like the best way forward.
Regards
John
-
Matt Cutts has stated in the past that poorly translating pages into another language (i.e. dumping out a raw machine translation) could get you devalued. He's talking primarily about duplicate content, but he seems to be hinting that poor grammar could also play a role in evaluations. At the bare minimum, it could affect your bounce rate, a known SEO factor.
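Google doesn't publish how it scores writing quality, but it's easy to see how "bad English" can be machine-detectable. A crude sketch, using nothing but the classic Flesch reading-ease formula with a heuristic syllable counter (this is an illustration of automated text scoring in general, not Google's actual method):

```python
import re

def flesch_reading_ease(text):
    """Crude Flesch reading-ease score; higher = easier to read."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0

    def syllables(word):
        # Heuristic: count vowel groups, drop a trailing silent 'e'.
        groups = re.findall(r"[aeiouy]+", word.lower())
        count = len(groups)
        if word.lower().endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    total_syllables = sum(syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (total_syllables / len(words)))

print(flesch_reading_ease("The cat sat on the mat."))  # plain sentence: high score
print(flesch_reading_ease(
    "Utilization of quadruped domestication paradigms "
    "necessitates multifaceted considerations."))      # dense sentence: much lower
```

If a 30-line heuristic can separate readable copy from word soup, it's a safe bet a search engine's classifiers can too.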
Let's put aside the SEO angle for a second. I'm a customer who just found your site, written by your Indian firm. The grammar looks worse than my daughter's (she's in first grade) and is a chore to read, let alone understand. Am I going to stay and listen to/buy anything else on your site? Nope. I'll go to your competitor or I'll just give up. And you can forget any tertiary SEO benefit of my linking to your article, except to ridicule it. From a business standpoint it doesn't make sense. It's sloppy, and people hate sloppy (unless you're selling really good hamburgers, which you're not).
If you still don't think it's important, check out Engrish. I hope you don't wind up there!
-
I agree with @kevinb. Google and Bing track engagement signals like high user engagement, low bounce rates, etc.
If these articles aren't useful to users, Google will notice.
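To make "bounce rate" concrete: it's just the share of sessions that viewed exactly one page and left. A minimal sketch, assuming a hypothetical analytics export of `(session_id, url)` hits (field names are made up for illustration):

```python
from collections import defaultdict

def bounce_rate(pageviews):
    """Share of sessions that viewed exactly one page.

    `pageviews` is a list of (session_id, url) tuples -- a stand-in
    for whatever your analytics export actually provides.
    """
    pages_per_session = defaultdict(int)
    for session_id, _url in pageviews:
        pages_per_session[session_id] += 1
    if not pages_per_session:
        return 0.0
    bounces = sum(1 for n in pages_per_session.values() if n == 1)
    return bounces / len(pages_per_session)

hits = [
    ("s1", "/van-leasing"),         # s1 bounced: one page only
    ("s2", "/van-leasing"),
    ("s2", "/van-leasing/quotes"),  # s2 engaged: two pages
    ("s3", "/blog/bad-article"),    # s3 bounced
]
print(bounce_rate(hits))  # 2 of 3 sessions bounced
```

Articles that readers abandon immediately push this number up across the pages they land on.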
-
Awkward syntax and poor or incorrect use of idiom erect roadblocks in the flow of a narrative, degrading the user experience.
It's been my experience that when a writer attempts to replicate a particular cultural context that is not natural to him or her, the user will recognize its artificiality—even if only on a subconscious level. An analogy would be a motion picture with dubbed—rather than subtitled—dialog: There's something that's just off.
According to Google, user experience trumps all, doesn't it? (See, I used an idiom right there!) So, for what it's worth, my advice would be to stay away.
-
Even if Google can't detect poor English now, it will be working towards it.
Surely your money is better spent elsewhere. Invest in the long term.
If the articles they are writing for you are low quality, you can bet the sites they are able to get them on are low quality too.
Keep away from them and work on quality. Nothing is quick and easy and that's how it should be. If people could so easily buy their way to the top, the search results wouldn't be worth using.
-
Do yourself a favour: stay away from this outdated and damaging technique!
Create some amazing content on your own site/blog. Examples could be how to reduce insurance costs when leasing a van, or the best vans to hire for home removals, etc.
Make your content the go-to source for that particular problem, then start contacting webmasters of similar (non-competitor) sites to share/link so their readers benefit!
The game has changed a lot from the days when you could buy 50 articles from Indian SEO firms for less than £20 and churn them out for links from low-quality sites!
-
Wesley and Jesse hit the nail on the head. Don't do it. Even if Google can't detect it directly, it can spot it indirectly by way of user experience.
Is price the only reason you are using this team?
-
I'm not sure if Google is able to tell the difference between good and bad English at this moment.
But I do know that well-written content is one of the criteria Google wants a website to meet in order to rank, as described in this document about Google Panda: http://googlewebmastercentral.blogspot.nl/2011/05/more-guidance-on-building-high-quality.html
This method is not permitted, and while you may benefit from it in the short term, I can tell you that it won't be long before you get a penalty for this technique. Link building is not about buying links in any form. It's about creating awesome content that people want to share just because they think it is awesome.
Of course, reaching out to people is also part of the process. But the key is always to create a site that people **want** to link to, either because it is awesome or because your content offers great value to their visitors.
Always keep this in mind.
-
What Google definitely does recognize is exactly the kind of service you are considering. Google's webspam team developed Penguin specifically to target sites that have subbed out SEO to black-hat organizations. What you are describing is exactly what they are targeting.
Don't do it! You WILL be sorry.
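One of the footprints this kind of bought link building leaves is an unnaturally skewed anchor-text profile: hundreds of links all using the same money keyword. A minimal sketch of how anyone (not just Google, whose actual signals are not public) could flag that from a backlink export of `(source_url, anchor_text)` pairs — the data and threshold here are made up for illustration:

```python
from collections import Counter

def anchor_concentration(backlinks):
    """Fraction of backlinks using the single most common anchor text.

    `backlinks` is a list of (source_url, anchor_text) pairs -- a
    stand-in for a backlink export from any link-research tool.
    """
    if not backlinks:
        return 0.0
    counts = Counter(anchor.lower().strip() for _src, anchor in backlinks)
    _top_anchor, top_count = counts.most_common(1)[0]
    return top_count / len(backlinks)

links = [
    ("http://blog-a.example", "van leasing"),
    ("http://blog-b.example", "van leasing"),
    ("http://blog-c.example", "van leasing"),
    ("http://news.example", "Acme Vans homepage"),
]
print(anchor_concentration(links))  # 0.75: heavily skewed toward one keyword
```

Natural link profiles are dominated by brand names and URLs as anchors; a profile where one commercial phrase accounts for most links is trivially detectable.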