Google Bombing For A Specific URL
-
The term "Beruk", which means "ape" or "monkey" in English, brings up this Wikipedia page among the first-page results: http://en.wikipedia.org/wiki/Khairy_Jamaluddin
The page does not contain the word "Beruk".
External links to the page do not contain the anchor text "Beruk".
Given the above scenario, how is the page still ranking on the first page for this keyword?
-
Hi Dunamis,
I was wondering the same thing. If Google sees historical search queries pairing "Term A" with "Term B", would it associate the two words so strongly that, even when only one of the terms is searched, the other's relevance still carries over? It's hard to believe Google could be doing something like this.
Still an open question for now. Let's see if we get any more explanations.
-
Thanks Ryan for the historical wiki dig. But I doubt Google would let something that old still influence today's results, especially since the page was edited long ago to remove all traces of the word "Beruk".
However, this could be one possible explanation.
-
This could also have something to do with how Google determines relevance. If a user types in "black cat", sees their search results, and then immediately goes back and types in "black kitten", Google can infer that "cat" and "kitten" are related. If enough people do this, Google figures out that when someone types "cat", they could mean "kitten". The real algorithm is more complicated than that, of course, and Google is always learning.
So, I would guess that Google has figured out that when someone searches for "Beruk", that word is really relevant to the word "monkey". And the Wikipedia page is very relevant to monkeys, especially the type of monkey people are looking for when they type in "Beruk".
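Loosely speaking, the query-refinement idea described above could be sketched like this. This is a toy illustration, not Google's actual algorithm, and the session data is invented: it just counts which terms users swap between back-to-back searches, on the assumption that frequently swapped terms are candidate synonyms.

```python
from collections import Counter
from itertools import combinations

# Invented search sessions: each list is one user's consecutive queries.
sessions = [
    ["black cat", "black kitten"],
    ["black cat photos", "black kitten photos"],
    ["beruk", "monkey"],
    ["beruk wikipedia", "monkey wikipedia"],
]

pair_counts = Counter()
for queries in sessions:
    for q1, q2 in zip(queries, queries[1:]):
        a, b = set(q1.split()), set(q2.split())
        # Terms that changed between the two consecutive queries
        changed = sorted((a - b) | (b - a))
        for old, new in combinations(changed, 2):
            pair_counts[(old, new)] += 1

# The most frequently swapped pairs are candidate synonyms.
print(pair_counts.most_common(2))
# [(('cat', 'kitten'), 2), (('beruk', 'monkey'), 2)]
```

With enough real sessions, a signal like this could link "beruk" to "monkey" even though no page uses both words.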
-
If you Google phrases with "N.I." in them, Google shows results with "Northern Ireland" in the SERPs (bolded and all). Maybe Google is doing something similar here?
-
Wikipedia is such an incredibly strong site that Google clearly places it on a pedestal, so this is partly a case of domain authority. But to learn why the term "Beruk" is associated with that page, you need to look at the page's history. In July 2008, about 150 edits ago, a Wikipedia reader edited the page and used the term "beruk" as an insult. That is how the term became associated with the page: http://en.wikipedia.org/w/index.php?title=Khairy_Jamaluddin&oldid=226010042 This page would be a good example for the Google team to examine and then adjust their metrics.
-
Apparently, in 2007 Jamaluddin was involved in some kind of controversy concerning an HIV-positive monkey (http://ms.wikipedia.org/wiki/Khairy_Jamaluddin#Isu_beruk, I used Google Translate but it's not very clear).
Possibly a lot of pages just link to his wiki article using the word "Beruk" as part of the anchor text, or maybe even just as words surrounding the anchor text.
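The anchor-text hypothesis above can be sketched as a simple aggregation. Everything here is hypothetical (the inbound links are invented for illustration): the idea is that terms recurring in or around anchors pointing at a URL can become associated with that page even if the page itself never uses them.

```python
from collections import Counter

# Hypothetical inbound links to the Wikipedia article:
# (anchor text, text surrounding the anchor).
inbound_links = [
    ("Khairy Jamaluddin", "the politician nicknamed beruk by critics"),
    ("beruk", "read about the beruk controversy"),
    ("his wiki page", "the beruk incident involving an HIV-positive monkey"),
]

term_counts = Counter()
for anchor, context in inbound_links:
    # Count every term appearing in the anchor or around it.
    for term in (anchor + " " + context).lower().split():
        term_counts[term] += 1

# "beruk" recurs across anchors and their surrounding text, which is
# one plausible way the page picks up relevance for a word it never uses.
print(term_counts["beruk"])
```

A real link graph would weight by linking-page authority rather than raw counts, but the aggregation step would look broadly like this.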