Google Bombing For A Specific URL
-
The term "Beruk", which means "ape" or "monkey" in English, brings up this Wikipedia page among the first-page results: http://en.wikipedia.org/wiki/Khairy_Jamaluddin
The page does not contain the word "Beruk".
External links to the page do not contain the anchor text "Beruk".
Given the above scenario, how is the page still ranking on first page for this keyword?
-
Hi Dunamis,
I was wondering the same thing. If Google sees historical search queries using "Term A" + "Term B", would it associate the two words so strongly that even when only one of the terms is used, the other's relevance still gets factored in? It's hard to believe Google could be doing something like this.
Still an open question for now. Let's see if we get any more explanations.
-
Thanks Ryan for the historical wiki dig. But I doubt Google would let something so old still influence today's results (especially when the page was edited a long time ago, removing all traces of the word "Beruk").
However this could be one of the possible explanations.
-
This could also have something to do with how Google determines relevance. If a user types in "black cat", sees their search results, and then immediately goes back and types in "black kitten", Google can infer that cat and kitten are related. If enough people do it, Google will figure out that when someone types cat, they could mean kitten. The algorithm is more complicated than that, of course, and Google is always learning.
So, I would guess that Google has figured out that when someone searches for Beruk, that word is really relevant to the word monkey. And then, the Wikipedia page is very relevant to monkeys, especially the kind of monkey people are looking for when they type in Beruk.
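To make that idea concrete, here's a toy sketch of mining consecutive query reformulations for word associations. This is purely illustrative, not how Google actually works: the session data and the single-word-swap heuristic are invented assumptions.

```python
from collections import Counter

def term_associations(sessions):
    """Count word substitutions between consecutive queries in a session.

    sessions: list of lists of query strings, in chronological order.
    Returns a Counter mapping (word_a, word_b) pairs to how often a user
    replaced word_a with word_b in an immediate reformulation.
    """
    pairs = Counter()
    for queries in sessions:
        for prev, curr in zip(queries, queries[1:]):
            a, b = set(prev.split()), set(curr.split())
            removed, added = a - b, b - a
            # A single-word swap ("black cat" -> "black kitten") is the
            # clearest signal that the two words might be related.
            if len(removed) == 1 and len(added) == 1:
                pairs[(removed.pop(), added.pop())] += 1
    return pairs

sessions = [
    ["black cat", "black kitten"],
    ["black cat", "black kitten"],
    ["beruk", "monkey"],
]
assoc = term_associations(sessions)
print(assoc[("cat", "kitten")])  # -> 2
```

At scale, a strong ("beruk", "monkey") association like this could explain the ranking without the page ever containing the word.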
-
If you google phrases with N.I. in them, Google shows results with Northern Ireland in the SERPs (bolded and all). Maybe Google is doing something similar here?
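Just to illustrate, here's a trivial sketch of that kind of abbreviation expansion, with a made-up lookup table (no idea what Google's real rewrite rules look like):

```python
# Hypothetical abbreviation table; Google's actual mappings are unknown.
ABBREVIATIONS = {"n.i.": "northern ireland", "nyc": "new york city"}

def expand_query(query):
    """Replace known abbreviations with their expansions before retrieval."""
    tokens = query.lower().split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

print(expand_query("hotels in N.I. belfast"))  # -> "hotels in northern ireland belfast"
```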
-
Wikipedia is such an incredibly strong site that Google clearly places it on a pedestal. This is purely a case of domain rank. To learn why the term Beruk is associated with that page, you need to look at the page's history. In July 2008, about 150 page edits ago, a wiki reader decided to edit the page and use the term "beruk" as an insult. That is how the term became associated with the page. http://en.wikipedia.org/w/index.php?title=Khairy_Jamaluddin&oldid=226010042 This page would be a good example for the Google team to examine and then adjust their metrics.
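For anyone who wants to repeat that kind of dig programmatically, here's a minimal sketch that scans a page's revision history for the first appearance of a term. The revision data below is invented for illustration; in practice you'd pull real revisions from the MediaWiki API (`action=query&prop=revisions&rvprop=timestamp|content`).

```python
def first_revision_with_term(revisions, term):
    """Return the timestamp of the earliest revision whose wikitext
    contains `term` (case-insensitive), or None if it never appears.

    revisions: list of (timestamp, wikitext) tuples, oldest first.
    """
    term = term.lower()
    for timestamp, text in revisions:
        if term in text.lower():
            return timestamp
    return None

# Invented sample history standing in for real API output.
history = [
    ("2008-06-01", "Khairy Jamaluddin is a Malaysian politician."),
    ("2008-07-15", "Khairy Jamaluddin ... beruk ..."),  # vandalised revision
    ("2008-08-01", "Khairy Jamaluddin is a Malaysian politician."),
]
print(first_revision_with_term(history, "Beruk"))  # -> 2008-07-15
```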
-
Apparently, in 2007 Jamaluddin was involved in some kind of controversy concerning an HIV-positive monkey (http://ms.wikipedia.org/wiki/Khairy_Jamaluddin#Isu_beruk, I used Google Translate but it's not very clear).
Possibly a lot of pages just link to his wiki article using the word "Beruk" as part of the anchor text, or maybe even just as words surrounding the anchor text.
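If you wanted to test that theory, here's a rough sketch of tallying anchor-text words pointing at the article. The link data below is made up; you'd need a real link index (e.g. an Open Site Explorer export) to do this for real.

```python
from collections import Counter

def anchor_term_counts(links, target):
    """Tally the words appearing in anchor text of links to `target`.

    links: list of (anchor_text, target_url) tuples, e.g. from a crawl.
    """
    counts = Counter()
    for anchor, url in links:
        if url == target:
            counts.update(anchor.lower().split())
    return counts

# Hypothetical link data for illustration only.
links = [
    ("khairy jamaluddin beruk", "http://en.wikipedia.org/wiki/Khairy_Jamaluddin"),
    ("beruk scandal", "http://en.wikipedia.org/wiki/Khairy_Jamaluddin"),
    ("some other page", "http://example.com/"),
]
counts = anchor_term_counts(links, "http://en.wikipedia.org/wiki/Khairy_Jamaluddin")
print(counts["beruk"])  # -> 2
```

A heavy skew toward "beruk" in the real anchor-text profile would support this explanation over the query-association one.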