Open Site Explorer - Top Pages that don't exist / result of a hack(?)
-
Hi all,
Last year, a website I monitor was either hacked or infected with malware; I'm not sure which.
The result is hundreds of 'not found' entries in Google Search Console / Crawl Errors for non-existent pages relating to variations of 'Canada Goose'. A couple of these links also show up in SERPs. Here are example page URLs:
ourdomain.com/canadagoose.php
ourdomain.com/replicacanadagoose.php
I looked for advice on the webmaster forums and was told to just keep marking them as 'fixed' in the console; sooner or later they would disappear. A year later, they still appear.
I've just signed up for a Moz trial and, in Open Site Explorer -> Top Pages, the top 2-5 pages are these non-existent URLs: the result of the 'canada goose' spam attack. Each non-existent page has around 10 Linking Root Domains and around 50 Inbound Links.
My question is: Is there a more direct action I should take here? For example, informing Google of the offending domains with these backlinks.
Any thoughts appreciated! Many thanks
-
Hi Mª Verónica B
That's great, Many thanks for the confirmation.
All the best,
Colin
-
Hi Colin,
If the backlinks/inbound links are spam, then yes: upload a disavow file covering only those.
If you have multiple ghost pages in WordPress left over from erased hacked pages, then yes: create the new hidden page following the instructions above, applied only to the spam pages.
All the best,
Mª Verónica B.
-
Thanks again Mª Veronica for taking the time to respond.
OK, if I understand correctly: since those spam 'canadagoose'-related backlinks do indeed exist*, a disavow file for Google would be the thing to do here?
There was indeed a hack, which happened before I came along and is reported in Google Search Console. And there are hundreds of 'canadagoose'-related crawl errors with a 404 response code that just keep coming back. It looks like those pages did once exist and must have been deleted by the website developers. So the 'empty page' technique would apply here?
*It seems to me that the 'canadagoose' pages that have apparently since been deleted, and the backlinks pointing to those 'ghost' pages, are all part of the hack:
- hack website, create 'canadagoose' pages
- link to 'canadagoose' pages from other websites
Many thanks,
Colin
-
Hi Colin,
Not exactly!
We are not talking about backlinks. Backlinks come from other websites, so we cannot control them; the most we can do is upload a disavow file for Google.
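For reference, the disavow file Google's tool accepts is plain text, one entry per line: either a whole domain or a single URL, with `#` lines as comments. A minimal sketch (the domain names here are placeholders, not the actual spam sources):

```text
# disavow.txt - links pointing at the hacked 'canadagoose' pages
# Disavow every link from an offending domain:
domain:spammy-linker-1.example
domain:spammy-linker-2.example
# Or disavow links from a single page only:
http://spammy-linker-3.example/canada-goose-links.html
```

The file is then uploaded once per property via Search Console's disavow links tool.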
That is quite different. We are talking about the hundreds of "ghosts" of deleted pages - pages we deleted because the website was hacked.
Deleting them all was not enough.
Crawlers will still "see" 330 or more pages with 404 status!
That is awful for SEO, because Google and the other crawlers "understand" it as not caring about user experience: you have so many erased pages that anyone landing on them finds nothing.
1.- Use Moz Top Pages to find all the spam pages, and then Google too, of course, to be sure.
2.- Create a new page, not completely empty. It should say something like
"We are truly sorry... Thanks."
This is for the crawlers; presumably no human knows about those spam pages, except whoever hacked your website.
3.- Redirect all the spam pages in the list with a 301 - a permanent redirect - to the noindex/nofollow page you just created.
4.- Verify: copy and paste each URL from the list into the browser and check that it goes to the new page; also verify the page status with the MozBar.
Thanks. Good luck,
Mª Verónica
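If the site runs on Apache, the 301s in step 3 can be handled in `.htaccess` with a single pattern rather than hundreds of individual rules. This is only a sketch: the regex and the `/sorry-page/` target are assumptions based on the example URLs earlier in the thread.

```apache
# Assumption: all hacked URLs are .php files whose names contain
# "canadagoose", e.g. /canadagoose.php, /replicacanadagoose.php.
# /sorry-page/ stands in for the noindex/nofollow holding page (step 2).
RedirectMatch 301 ^/[a-z-]*canadagoose[a-z-]*\.php$ /sorry-page/
```

`RedirectMatch` comes from Apache's mod_alias module; on nginx or a managed WordPress host the equivalent would be configured differently.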
-
Many thanks for your response Mª Verónica B, very helpful.
I've never used the disavow backlinks tool in Google Search Console. I would have assumed this is the ideal scenario for using it to disavow specific backlinks (not all backlinks). But instead what you're suggesting is:
- Create an empty/hidden (WordPress) page, and make it noindex/nofollow
- Get a list of all the spam URLs from Google Search Console
- Redirect all the spam URLs in the list to the empty noindex/nofollow page
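For the first step, the "hidden" page only needs a robots meta tag; a minimal sketch of the markup (in WordPress you would normally set this through an SEO plugin rather than hand-written HTML, and the wording is just the example from the thread):

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Tell crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <title>Page removed</title>
</head>
<body>
  <p>We are truly sorry. This page has been removed. Thanks.</p>
</body>
</html>
```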
This would never have occurred to me, I'm going to do this right now.
Again, many thanks!
Colin
-
Hi,
It seems that this website has a similar situation to the one I shared before.
However, I had to take immediate action because it was creating a very serious problem, sending malicious signals to all the crawlers.
I also discovered the issue using the same Moz feature. Thanks Moz!
https://moz.com/community/q/more-than-450-pages-created-by-a-hacker
In my experience, sending all those pages to a new hidden page, using a 301 plus the noindex and nofollow directives, sends the right signals to the crawlers of Google and the other search engines.
Let's say it strongly informs all the crawlers that those spam/404 pages are neither relevant nor interesting for your website.
Andy's response agrees that this is the best solution. He also recommends the Wordfence plugin for WordPress as a preventive measure against further issues.
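With hundreds of affected URLs, writing the redirect rules by hand is tedious, so generating them from a Search Console export can help. This is a hypothetical sketch, not anyone's actual tooling: the `/sorry/` target path and the sample URLs are made up for illustration.

```python
# Sketch: turn a list of spam URLs (e.g. exported from Search Console's
# crawl-error report) into exact-match Apache "Redirect 301" rules that
# all point at the hidden noindex/nofollow holding page.
from urllib.parse import urlparse

HOLDING_PAGE = "/sorry/"  # placeholder path for the holding page

def redirect_rules(urls):
    """Build one 'Redirect 301' line per unique spam URL path."""
    rules = []
    seen = set()
    for url in urls:
        path = urlparse(url.strip()).path
        if not path or path in seen:
            continue  # skip blanks and duplicates
        seen.add(path)
        rules.append(f"Redirect 301 {path} {HOLDING_PAGE}")
    return rules

spam_urls = [
    "https://ourdomain.com/canadagoose.php",
    "https://ourdomain.com/replicacanadagoose.php",
]
for line in redirect_rules(spam_urls):
    print(line)
```

The printed lines can be pasted into `.htaccess` and then spot-checked in the browser, as step 4 above suggests.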
Good luck!