Has anyone seen this kind of Google cache spam before?
-
Has anyone seen this kind of 'hack'?
When looking at a site recently, I found the Google cache version (from 28 Oct) strewn with mentions of all sorts of dodgy-looking pharma products, but the site itself looked fine.
The site itself is www.istc.org.uk
Looking at the source of the pages, you can see the home page contains:
Browsing as Googlebot showed me an empty page (though msnbot etc. returned a 'normal', non-pharma page).
As a mildly amusing aside: when I tried to tell the ISTC about this, the person answering the phone clearly didn't believe me and couldn't get me off the line fast enough! Needless to say, they haven't fixed it a week after being told.
-
It's only a guess, but anyone who can cloak a webpage to appear differently for the Googlebot user agent can also restrict the cloaking to the range of IP addresses Google normally crawls from. It would be sloppy not to, because otherwise anyone could easily detect the cloaking, and the bad guys are trying to make that difficult.
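As a rough illustration, the simpler user-agent-based cloaking can be probed by fetching the same URL with two different User-Agent headers and comparing the responses. This is only a sketch (the URL, header strings, and function names are placeholders, not anything from the thread), and as the answer notes, it will not expose cloaking that is keyed to Google's IP ranges rather than the User-Agent string:

```python
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch a URL with a given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def looks_ua_cloaked(url, fetcher=fetch):
    """Return True if the page served to a 'Googlebot' User-Agent
    differs from the page served to a normal browser User-Agent.
    Only catches UA-based cloaking; cloaking restricted to Google's
    IP ranges will look identical from any test machine."""
    return fetcher(url, GOOGLEBOT_UA) != fetcher(url, BROWSER_UA)
```

The `fetcher` parameter is there only so the comparison logic can be exercised without live HTTP; in practice you would just call `looks_ua_cloaked("http://example.com/")`.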
-
Hi Ryan, I tried the MozBar and seobrowser.
-
What tool did you use?
The problem is that the tool likely spoofed the User-Agent header but not the IP address, so if the cloaking was well designed, it would present your test tool with the normal webpage while still presenting Googlebot with the hacked page.
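One way to see why the IP matters: a genuine Googlebot visit can be verified with the reverse-then-forward DNS check that Google documents, and a well-built cloak restricts itself to IPs that pass exactly this kind of test. Below is a minimal sketch of that check (the lookup functions are injectable here purely so the logic can be exercised without live DNS; nothing in it comes from the thread itself):

```python
import socket

def is_real_googlebot(ip,
                      ptr_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                      fwd_lookup=lambda host: socket.gethostbyname_ex(host)[2]):
    """Reverse-then-forward DNS check: the IP's PTR record must end in
    googlebot.com or google.com, and the forward lookup of that
    hostname must map back to the same IP."""
    try:
        host = ptr_lookup(ip)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return ip in fwd_lookup(host)
    except OSError:
        return False
```

A cloaking script running this server-side (or an equivalent IP-range check) would serve the spam page only to visitors that pass, which is why spoofing just the User-Agent from your own machine shows you the clean page.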
Related Questions
-
URL suddenly disappeared from Google search results
Hi, I am facing a big problem where Google stopped showing a basic URL of my site. It was ranked well for more than 35 keywords, in positions 1 to 8, and suddenly I can't find it indexed in Google. This is the URL: http://tv1.alarab.com/view-8/مسلسلات-عربية Thanks
White Hat / Black Hat SEO | alarab.net
-
Why is our homepage not getting cached by Google?
I've only just noticed that, for more than 2-3 months now, our website homepage has not been getting cached by Google. I don't know why. Help me please, thanks in advance. Regards,
White Hat / Black Hat SEO | spellblaster
-
Reducing Spam Flags
I have a personal site that is pretty strong (35 DA), likely from the fact that I have had it for so damn long (like... 15 years). So, since it's my own site, of course I used it to link to my business site. Sadly, it looks like that site has "five spam flags"... which is ironic since it's totally legit. Anyway, what can I do to reduce spam flags? It mentions "low trust", a small proportion of branded links, and "large site with few links". I'm pretty sure it's not the latter, since the site is just a WordPress site I use to share some of my music (www.damonsongs.net in case anyone wants to hear it). So: 1) can I do much of anything to legitimize the site, and 2) am I better off removing the site link I am trying to promote? Insights welcome!
White Hat / Black Hat SEO | damon1212
-
How to ignore spam links to page?
Hey Moz pals, for some reason someone is building thousands of links to my website (all spam), likely someone doing negative SEO on my site. Anyway, all these links are pointing to one sub-URL on my domain. That URL didn't have anything on it, so I deleted the page and now it returns a 404. Is there a way to reject any link that ever gets built to that old page? I don't want all this spam to hurt my website. What do you suggest?
White Hat / Black Hat SEO | WongNs
-
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this: http://www.eteach.com/Employer.aspx?EmpNo=26626 http://www.eteach.com/Employer.aspx?EmpNo=36986 and Google is classing all of these pages as similar content, which may result in a bunch of them being de-indexed. Now, although they all look rubbish, some of them are ranking on search engines, and looking at the traffic on a couple of these, it's clear that people who find them want more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of all these pages; I want to add content to them. But my question is... if I were to make up, say, 5 templates of generic content with different fields being replaced with the school's name, location, and headteacher's name so that they vary from other pages, would this be enough for Google to realise that they are not similar pages and no longer class them as duplicates? e.g. [School name] is a busy and dynamic school led by [headteacher's name], achieving excellence every year from Ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities, and we encourage all of our pupils to "Aim Higher". We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards. Something like that... Anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
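For what it's worth, the template-plus-fields idea described in that question can be sketched like this (the field names and template text are made up for illustration; whether a handful of templates is enough variation to avoid duplicate-content filtering is exactly the open question):

```python
import random

# A few hypothetical templates; real ones would need to differ
# more substantially than just reordered clauses.
TEMPLATES = [
    ("{name} is a busy and dynamic school led by {head}, located in "
     "{location}, offering a wide range of classroom and "
     "extra-curricular experiences."),
    ("Located in {location} and led by {head}, {name} encourages all "
     "of its pupils to aim higher."),
]

def render_employer_page(school, templates=TEMPLATES):
    """Pick one of the boilerplate templates for this school and
    fill in its fields."""
    template = random.choice(templates)
    return template.format(**school)

school = {"name": "Example Primary", "head": "J. Smith",
          "location": "Leeds"}
print(render_employer_page(school))
```

Each employer record gets one of the template variants with its own fields substituted in, which is the mechanism the question proposes applying across the 18,000 pages.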
White Hat / Black Hat SEO | Eteach_Marketing
-
Can someone help me write a Google reconsideration request?
Can someone help me (I will pay) with writing a Google reconsideration letter in native English? Over the last year I did everything to respect Google's guidelines, and as the site owner I always worked on generating good content and usability for my site, although an SEO company had generated unnatural links. I managed to remove almost all of them, doing everything I could, including contacting webmasters. But I am not a native English speaker, just a content generator with a webmaster account. This is my last try to save my domain. Can someone help me write a good final letter to Google? Thanks
White Hat / Black Hat SEO | maestrosonrisas
-
Google Preferred Agency???
I just stumbled upon an SEO company's website that says they are a 'Google Preferred Agency'. This isn't just a line of copy on the site; it's featured prominently, and they use the Google logo as well. I've never heard of a 'Google Preferred Agency'. One would think that even if there were such a thing, it would involve a link back to a profile page on Google, as with AdWords/Analytics partners... Am I missing something, or is this company doing something a little shady? I don't want to toss the name of the company out there because I don't want to publicly bash them.
White Hat / Black Hat SEO | stevefidelity
-
Interesting case of an IP-wide Google penalty: what is the most likely cause?
Dear SEOmoz Community, our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try. As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains; all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered at Google Webmaster?
2. What is the most likely cause of our penalty given the background information? Since the drops started already in November 2010, we doubt that the Panda updates had any correlation to this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far were reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we be moving on to two new domains and forwarding all content as 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro