Google Bombing For A Specific URL
-
The term "Beruk", which means "ape" or "monkey" in English, brings up this Wikipedia page among the first-page results: http://en.wikipedia.org/wiki/Khairy_Jamaluddin
The page does not contain the word "Beruk".
External links to the page do not contain the anchor text "Beruk".
Given the above, how is the page still ranking on the first page for this keyword?
-
Hi Dunamis,
I was wondering about the same thing. If Google sees historical search queries combining "Term A" and "Term B", would it associate the two words so strongly that even when only one of the terms is used, the other's relevance still gets factored in? Hard to believe Google could be doing something like this.
Still an open question for now. Let's see if we get any more explanations.
-
Thanks, Ryan, for the historical wiki dig. But I doubt Google would let something that old still influence today's results (especially since the page was edited long ago to remove all traces of the word "Beruk").
However, this could be one possible explanation.
-
This could also have something to do with how Google determines relevance. If a user types in "black cat", sees their search results, and then immediately goes back and types in "black kitten", Google can infer that "cat" and "kitten" are related. If enough people do this, Google will figure out that when someone types "cat", they could mean "kitten". The actual algorithm is more complicated than that, of course, and Google is always learning.
So I would guess that Google has figured out that when someone searches for "Beruk", that word is really relevant to the word "monkey". And the Wikipedia page is very relevant to monkeys, especially the type of monkey people are looking for when they type in "Beruk".
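The query-refinement idea above can be sketched as a toy script: count how often two terms swap places between consecutive queries in the same search session. To be clear, the session data and the one-word-swap heuristic here are invented purely for illustration; this bears no relation to Google's actual systems.

```python
from collections import Counter

def term_associations(session_queries):
    """Count how often exactly one word is swapped between consecutive
    queries in a session -- a crude signal that the two words are related."""
    pairs = Counter()
    for session in session_queries:
        for q1, q2 in zip(session, session[1:]):
            a, b = set(q1.split()), set(q2.split())
            swapped_out, swapped_in = a - b, b - a
            # one word replaced while the rest of the query stays the same
            if len(swapped_out) == 1 and len(swapped_in) == 1:
                pair = tuple(sorted(swapped_out | swapped_in))
                pairs[pair] += 1
    return pairs

# hypothetical session logs
sessions = [
    ["black cat", "black kitten"],
    ["cat adoption", "kitten adoption"],
    ["beruk", "monkey"],
]
print(term_associations(sessions))
# Counter({('cat', 'kitten'): 2, ('beruk', 'monkey'): 1})
```

Scaled up across billions of sessions, this kind of co-occurrence signal is one plausible way "Beruk" and "monkey" could end up strongly associated even though neither word appears on the page.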
-
If you google phrases with "N.I." in them, Google shows results containing "Northern Ireland" in the SERPs (bolded and all). Maybe Google is doing something similar here?
-
Wikipedia is such an incredibly strong site that Google clearly places it on a pedestal. This is partly a case of domain authority. To learn why the term "Beruk" is associated with that page, you need to look at the page's history. In July 2008, about 150 page edits ago, a wiki reader decided to edit the page and use the term "beruk" as an insult. That is how the term became associated with the page: http://en.wikipedia.org/w/index.php?title=Khairy_Jamaluddin&oldid=226010042 This page would be a good example for the Google team to examine and then adjust their metrics.
-
Apparently, in 2007 Jamaluddin was involved in some kind of controversy concerning an HIV-positive monkey (http://ms.wikipedia.org/wiki/Khairy_Jamaluddin#Isu_beruk; I used Google Translate, but it's not very clear).
Possibly a lot of pages link to his wiki article using the word "Beruk" as part of the anchor text, or maybe even just as words surrounding the anchor text.