Google Bombing For A Specific URL
-
The term "Beruk", which means "ape" or "monkey" in English, brings up this Wikipedia page among the first-page results: http://en.wikipedia.org/wiki/Khairy_Jamaluddin
The page does not contain the word "Beruk".
External links to the page do not contain the anchor text "Beruk".
Given the above scenario, how is the page still ranking on first page for this keyword?
-
Hi Dunamis,
I was wondering about the same thing. If Google sees historical search queries using "Term A" + "Term B", would it associate the two words so strongly that even when only one of the terms is searched, the other's relevance still carries over? Hard to believe Google could be doing something like this.
Still an open question for now. Let's see if we get any more explanations.
-
Thanks Ryan for the historical wiki dig. But I doubt Google would let something so old still influence today's results (especially when the page was edited a long time ago, removing all traces of the word "Beruk").
However this could be one of the possible explanations.
-
This could also have something to do with how Google determines relevance. If a user types in "black cat", sees their search results, and then immediately goes back and types in "black kitten", Google can infer that cat and kitten are related. If enough people do this, Google will figure out that when someone types cat, they could mean kitten. The algorithm is more complicated than that, of course, and Google is always learning.
So, I would guess that Google has figured out that when someone searches for Beruk, that word is strongly related to the word monkey. And the Wikipedia page is very relevant to monkeys, especially the type of monkey people are looking for when they type in Beruk.
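The query-refinement idea above can be sketched with a toy example. This is purely illustrative (Google's actual system is far more sophisticated and its details aren't public); the session data and scoring are hypothetical, showing only how counting which terms users type in their *next* query could surface candidate synonyms:

```python
from collections import Counter, defaultdict

# Hypothetical session logs: each inner list is one user's consecutive queries.
sessions = [
    ["black cat", "black kitten"],
    ["beruk", "monkey"],
    ["beruk", "monkey photos"],
    ["black cat", "cat toys"],
]

# Count how often a term from one query co-occurs with terms from the
# immediately following (refined) query in the same session.
assoc = defaultdict(Counter)
for queries in sessions:
    for first, refined in zip(queries, queries[1:]):
        for a in first.split():
            for b in refined.split():
                if a != b:
                    assoc[a][b] += 1

# Terms most often typed right after "beruk" become candidate related terms.
print(assoc["beruk"].most_common(2))  # [('monkey', 2), ('photos', 1)]
```

With enough real sessions, "monkey" would dominate the counts for "beruk", and a page that ranks well for "monkey" could then pick up relevance for "beruk" even without containing the word.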
-
If you google phrases with "N.I." in them, Google shows results with "Northern Ireland" in the SERPs (bolded and all); maybe Google is doing something similar here?
-
Wikipedia is such an incredibly strong site that Google clearly places it on a pedestal. This is purely a case of domain rank. To learn why the term Beruk is associated with that page, you need to look at the page's history. In July 2008, about 150 page edits ago, a wiki reader decided to edit the page and use the term "beruk" as an insult. That is how the term became associated with the page. http://en.wikipedia.org/w/index.php?title=Khairy_Jamaluddin&oldid=226010042 This page would be a good example for the Google team to examine and then adjust their metrics.
-
Apparently, in 2007 Jamaluddin was involved in some kind of controversy concerning an HIV-positive monkey (http://ms.wikipedia.org/wiki/Khairy_Jamaluddin#Isu_beruk, I used Google Translate but it's not very clear).
Possibly a lot of pages just link to his wiki article using the word Beruk as part of the anchor text, or maybe even just as words surrounding the anchor text.