Open Site Explorer - Top Pages that don't exist / result of a hack(?)
-
Hi all,
Last year, a website I monitor was hacked or infected with malware; I'm not sure which.
The visible result is hundreds of 'not found' entries in Google Search Console / Crawl Errors for non-existent pages relating to variations of 'Canada Goose'. A couple of these links are also showing up in the SERPs. Here are example page URLs:
ourdomain.com/canadagoose.php
ourdomain.com/replicacanadagoose.php
I looked for advice on the webmaster forums and was advised to just keep marking them as 'fixed' in the console, on the basis that sooner or later they would disappear. A year later, they still appear.
I've just signed up for a Moz trial and, in Open Site Explorer -> Top Pages, the top 2-5 pages relate to these non-existent pages: URLs that are the result of this 'canada goose' spam attack. Each of the non-existent pages has around 10 linking root domains and around 50 inbound links.
My question is: is there a more direct action I should take here? For example, reporting to Google the offending domains hosting these backlinks.
Any thoughts appreciated! Many thanks
-
Hi Mª Verónica B
That's great, many thanks for the confirmation.
All the best,
Colin
-
Hi Colin,
If the backlinks/inbound links are spam, then yes: upload a disavow file covering only those links.
If you have multiple 'ghost' pages in WordPress left over from erased hacked pages, then yes: create the new hidden page following all the instructions above, applied only to the spam pages.
All the best,
Mª Verónica B.
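For anyone following along: Google's disavow file is just a plain-text list, one entry per line, uploaded through Search Console's Disavow Links tool. A minimal sketch, with placeholder domains rather than the actual spam sources from this thread:

```text
# Disavow file for ourdomain.com
# Lines beginning with # are comments and are ignored.

# Disavow every link from an entire referring domain:
domain:spam-example-1.com
domain:spam-example-2.net

# Or disavow one specific page that links to you:
http://spam-example-3.org/canadagoose-links.html
```

The file must be plain UTF-8 or 7-bit ASCII text, and once uploaded it can take weeks for Google to recrawl and process the disavowed links.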
-
Thanks again Mª Veronica for taking the time to respond.
OK, if I understand correctly: as those spam/'canadagoose'-related backlinks do indeed exist*, a disavow file for Google would be the thing to do here?
There was indeed a hack, which happened before I came along and which is reported in Google Search Console. And there are hundreds of 'canadagoose'-related crawl errors with a 404 response code that just keep coming back. It looks like those pages did once exist and must have been deleted by the website developers. So the 'empty page' technique would apply here?
*It seems to me that the 'canadagoose' pages that have apparently since been deleted, and the backlinks pointing to those 'ghost' pages, are all part of the hack:
- hack website, create 'canadagoose' pages
- link to 'canadagoose' pages from other websites
Many thanks,
Colin
-
Hi Colin,
Not exactly!
We are not talking about backlinks here. Backlinks come from other websites, so we cannot control them; all we can do is upload a disavow file for Google. That is quite different.
We are talking about the hundreds of 'ghosts' of deleted pages: pages we deleted because the website was hacked.
Deleting them all is not enough.
Crawlers will still "see" 330 or more pages with 404 status!
That is awful for SEO, because the crawlers/Google "understand" it as a sign that you do not care about user experience: you have so many erased pages that anyone who lands on one finds nothing.
1. Use Moz Top Pages to find all the spam pages, and then of course Google, to be sure.
2. Create a new page, not completely empty. It should say something like "We are truly sorry... Thanks."
This is for the crawlers; no human is supposed to know about those spam pages, except the one who hacked your website.
3. Redirect all spam pages in the list with a 301 (a permanent redirect) to the noindex/nofollow page you just created.
4. Verify: copy and paste each URL from the list into the browser and check that it goes to the new page; also verify the page status with the MozBar.
Thanks. Good luck,
Mª Verónica
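As an illustration only (not from the thread): on an Apache-hosted WordPress site, steps 2-3 could be sketched in .htaccess as below, where /spam-cleanup.php is a hypothetical name for the new apology page and the spam URLs come from your own list. The apology page itself would carry <meta name="robots" content="noindex, nofollow"> in its <head>.

```apache
# .htaccess sketch: 301 (permanent) redirects from the known
# spam URLs to the hidden apology page created in step 2.
Redirect 301 /canadagoose.php /spam-cleanup.php
Redirect 301 /replicacanadagoose.php /spam-cleanup.php

# With hundreds of similar URLs, one mod_rewrite pattern can
# cover them all instead of listing each page individually
# (requires mod_rewrite to be enabled):
RewriteEngine On
RewriteRule ^.*canadagoose.*\.php$ /spam-cleanup.php [R=301,L]
```

After uploading, step 4 amounts to visiting a few of the old URLs and confirming the 301 status, for example with the MozBar or your browser's network tab.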
-
Many thanks for your response Mª Veronica B, very helpful.
I've never used the disavow backlinks tool in Google Search Console. I would have assumed this is the ideal scenario to use it to disavow *specific* backlinks (not *all* backlinks). But instead what you're suggesting is:
- Create an empty/hidden (WordPress) page, and make it noindex/nofollow
- Get a list of all the spam URLs from Google Search Console
- Redirect all the spam URLs in the list to the empty noindex/nofollow page
This would never have occurred to me, I'm going to do this right now.
Again, many thanks!
Colin
-
Hi,
It seems that the website has a similar situation to the one I shared before.
In my case, though, I had to take immediate action because it was creating a very serious problem, sending malicious signals to all the crawlers.
I also discovered the issue by using the same Moz feature. Thanks, Moz!
https://moz.com/community/q/more-than-450-pages-created-by-a-hacker
In my experience, sending all those pages to a new hidden page, using a 301 and the noindex and nofollow directives, sends the right signals to the crawlers of Google and the other search engines.
Let's say it strongly informs all the crawlers that those spam/404 pages are neither relevant nor interesting for your website.
Andy's response agrees that this is the best solution. He also recommends the Wordfence plugin for WordPress as a preventive measure to avoid further issues.
Good luck!