Help! Is this what is called "cloaking"?
-
A friend asked me to look at her website. I ran it through Screaming Frog and BAM: instead of the 4 pages I was expecting, it returned HUNDREDS, and 99.9% of them are for cheap Viagra and other pharmaceuticals. I asked her if she was selling Viagra, which is fine, I don't judge. But she swears she isn't.
I ran a Google search for site:janeflahertyesq.com and sure enough, if you click on some of those results, they take you to Canadian pharmacies selling half-priced blue pills.
a) Is this cloaking? If not, what is going on?
b) More importantly, how do we get rid of those hundreds of pages / get them de-indexed?
She's stumped and scared. Any help would be greatly appreciated. Thank you all in advance, and for the work you do.
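If you want to confirm the cloaking angle yourself, one hedged way is to request a suspect URL once with a normal browser user agent and once while claiming to be Googlebot, then diff the responses. Cloaked pharma hacks often serve the spam only to crawler user agents. This is just a sketch; the URL in the usage comment is a placeholder, not one of the real spam URLs.

```shell
# Fetch a URL while spoofing a given User-Agent string.
fetch_as() {
  ua="$1"
  url="$2"
  curl -sL -A "$ua" "$url"
}

# Usage (placeholder URL -- substitute one of the spam URLs):
# diff <(fetch_as "Mozilla/5.0" "https://example.com/suspect-page") \
#      <(fetch_as "Googlebot/2.1 (+http://www.google.com/bot.html)" \
#                 "https://example.com/suspect-page")
```

If the two responses differ substantially, content is being varied by user agent, which is the classic cloaking signature.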
-
Check with your hosting company as well. Asking Google to de-index the pages will only be a temporary fix if the files still physically reside on your server. A reputable hosting company will secure its servers so content like this is less likely to be added.
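To find where the injected files actually live before asking the host for help, a rough sketch like this can narrow things down. The docroot path and the keyword list are assumptions; adjust both to the actual site.

```shell
# Scan a WordPress docroot for signs of a pharma-spam injection.
scan_for_spam() {
  docroot="$1"
  # PHP files modified in the last 14 days -- injected files are often fresh
  find "$docroot" -name '*.php' -mtime -14 2>/dev/null
  # Files containing common pharma keywords, case-insensitive
  grep -rilE 'viagra|cialis|pharmacy' "$docroot" 2>/dev/null || true
}

# Usage: scan_for_spam /var/www/html   (replace with the real docroot)
```

Anything this flags that isn't part of core, the theme, or a known plugin is a good candidate to show the host or delete during the clean reinstall.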
-
You are welcome
-
Thanks, we'll try this.
-
The main reason is probably that your site has been hacked and is being used as a zombie website. I'm pretty sure the problem is the theme or a nulled plugin; this kind of problem is very common when someone gets a theme from an untrusted website. So make a backup of your site, then delete everything and install fresh copies of your theme and plugins.
Once your site is clean, follow these tips to secure your website:
https://wordpress.org/plugins/all-in-one-wp-security-and-firewall/
More security tips:
https://yoast.com/wordpress-security/
-
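Beyond installing a security plugin, one small concrete hardening step (a sketch, assuming a standard wp-config.php layout) is disabling WordPress's built-in theme/plugin file editor, which hijacked admin accounts commonly abuse. The constant has to sit above the "stop editing" marker so it loads before wp-settings.php, so insert it there rather than appending to the end of the file:

```shell
# Insert DISALLOW_FILE_EDIT above the "stop editing" marker in
# wp-config.php so it takes effect before wp-settings.php is loaded.
# Uses GNU sed's in-place insert syntax.
harden_wp_config() {
  sed -i "/stop editing/i define('DISALLOW_FILE_EDIT', true);" "$1"
}

# Usage: harden_wp_config /var/www/html/wp-config.php
```

This won't stop a compromised server, but it closes one of the easiest doors an attacker uses to re-inject spam after a cleanup.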
Sorry, my mistake. I misunderstood your question.
What is happening / how did those URLs get added to my friend's domain?
The site was probably hacked. I had similar issues in my case, caused by malicious scripts in non-official themes or plugins.
How do we get rid of them?
Go to Search Console > Google Index > Remove URLs.
-
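One addition to this: the Remove URLs tool only hides pages temporarily. To make the removal stick once the injected files are deleted, the server can answer "410 Gone" for those URLs so crawlers drop them for good. A hedged sketch for an Apache site; the URL pattern is an invented example, not the real spam paths:

```shell
# Append a 410 rule to .htaccess (Apache mod_alias). The pattern is a
# placeholder -- replace it with the actual injected URL paths.
cat >> .htaccess <<'EOF'
# Injected pharma pages: tell crawlers they are gone permanently
RedirectMatch gone ^/(cheap-viagra|canadian-pharmacy)/.*$
EOF
```

A 410 is a stronger signal than a 404 that the URLs are gone intentionally, so Google tends to drop them from the index faster.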
Thank you for that response, and I totally understand and agree with it. However, it didn't seem to answer my questions, namely:
-
What is happening, and how did those URLs get added to my friend's domain?
-
How do we get rid of them?
-
Cloaking refers to the practice of presenting different content or URLs to human users and to search engines. Cloaking is considered a violation of Google's Webmaster Guidelines because it gives users different results than they expected.
Google has also released the Disavow Tool, but note that it deals with spammy inbound links rather than hacked pages on your own site, and it should be used with caution and only as a last resort.
To report spam websites here
A rich snippet may be considered spam if it harms the user experience by highlighting falsified or misleading information. For example, a rich snippet promoting a travel package as an Event or displaying fabricated Reviews would be considered spam.
Report spam in rich snippets here
Check out Matt Cutts's video answer on this. It usually takes 2-4 weeks for the tool to work, and few sites can afford to be penalized for a month, so it's worth putting prevention in place to keep these attacks out in the first place.