Victim of Negative SEO - Can I Redirect the Attacked Page to an External Site?
-
My site has been a victim of negative SEO. Over the course of 3 weeks, I have received over 3,000 new backlinks from 200 referring domains (based on the Ahrefs report). All of the links point to just one page (all other pages on the site are unaffected). I have already disavowed as many links as possible from the Ahrefs report, but is that all I can do? What if I continue to receive bad backlinks?
I'm thinking of permanently redirecting the affected page to an external website (a dummy site), hoping that all the juice from the bad backlinks will be transferred to that site. Do you think this would be good practice? I don't care much about keeping the affected page on my site, but I want to make sure the bad backlinks don't affect the entire site.
The bad backlinks started coming in around 3 weeks ago and the rankings haven't been affected yet. The backlinks target a single keyword and are mostly comment backlinks and trackbacks.
I would appreciate any suggestions.
Howard
-
First, don't freak out. What does the anchor text look like? Is it for a term you're trying to rank for on that page? Chances are actually pretty low that it's going to hurt you. Google has a few intent- and source-detection mechanisms built in that work relatively well.
If this is a high-value page that you're making a lot of money on or that is ranking well, don't move it and don't 410 or 404 it. It's Google's job to filter through spam and spam attacks, and they do an OK job. I don't think it's totally wrong to disavow the links, but my experience is that people generally over-react.
http://www.seroundtable.com/google-bad-links-disavow-17195.html
TL;DR: this is all good advice, but don't drop or redirect a high-value page.
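If you want a quick read on those anchors, here is a rough sketch that tallies anchor text from an exported backlink report (the file name and "Anchor" column header are assumptions, not the exact Ahrefs export format):

```python
# Rough sketch: count anchor texts in a CSV export of the new backlinks.
# The file name and "Anchor" column header are hypothetical.
import csv
from collections import Counter

anchor_counts = Counter()
with open("backlinks-export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor_counts[row.get("Anchor", "").strip().lower()] += 1

# A single money keyword dominating the anchors is the pattern worth watching;
# a natural-looking mix is far less likely to cause problems.
for anchor, count in anchor_counts.most_common(10):
    print(f"{count:5d}  {anchor}")
```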
-
Thanks for all the responses!
-
410 / GONE
“Indicates that the resource requested is no longer available and will not be available again. This should be used when a resource has been intentionally removed and the resource should be purged. Upon receiving a 410 status code, the client should not request the resource again in the future. Clients such as search engines should remove the resource from their indices. Most use cases do not require clients and search engines to purge the resource, and a "404 Not Found" may be used instead.” — Wikipedia
“The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.
The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.” — IETF
410 / CODE REFERENCE(S)
Rails HTTP Status Symbol: :gone
http://httpstatus.es/410
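For illustration, a minimal sketch of serving a 410 for the attacked URL from a Python/Flask app (the /attacked-page path is hypothetical; in Rails the equivalent is the :gone status symbol referenced above):

```python
# Minimal sketch, assuming a Flask app; the /attacked-page path is hypothetical.
from flask import Flask, abort

app = Flask(__name__)

@app.route("/attacked-page")
def attacked_page():
    # 410 tells crawlers the page is intentionally and permanently gone and
    # should be dropped from the index; 404 only says it wasn't found this time.
    abort(410)

if __name__ == "__main__":
    app.run()
```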
-
A few options:
1. As David said, serve a 410 (Gone) status on the page.
2. Try to remove the links at scale and review why they are coming in (e.g. same IP address, same WHOIS record). Request that the sites remove them; if they don't, add them to the disavow file (an example of the file format is below).
I wouldn't 301 the page; that just transfers the problem to a new website. I've seen numerous cases where domains have been hit because of cross-site 301s.
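For reference, the disavow file Google's Disavow Links tool expects is plain text with one entry per line; the domains and URL below are made up:

```
# Spammy comment/trackback sources found in the backlink report (examples only)
domain:spam-comments.example
domain:auto-trackbacks.example
# Individual URLs can also be listed
http://spam-blog.example/some-post#comment-123
```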
-
Return a 410 HTTP status (page permanently gone; disregard links) on that URL and move the content to a new URL.
-
Are you positive it wasn't something you bought as a service?
Although Google's Matt Cutts claims that negative SEO exists but would take a lot of work to achieve, and that you could actually end up benefiting the target instead, it has been shown over and over that it isn't that hard; see here: http://www.fulltraffic.net/blog/85062/is-negative-seo-becoming-a-mainstream-tactic-infographic/
Since it's something you can't really control, I would start by contacting the owners of the pages where the links are and politely asking them to remove the link; it helps them too (usually the side that suffers most is the one selling the links, since there's no way to know who is buying them). Don't rely on email alone: try social networks, contact forms, etc.
But considering that your rankings aren't affected, after contacting those webmasters you shouldn't need to go as far as disavowing the links: you are not being penalized, and you have done the work of trying to remove the links (document your efforts!). If, and only if, you notice a ranking drop or an actual penalty, go ahead and disavow those links, and in the case of a penalty, send a reconsideration request explaining everything and showing the efforts you made to get rid of those links.
As Cutts said: it may actually benefit you...
Hope that helps!
-
Take the page the bad links are pointing to, copy its content, get rid of the old page, and put the old page's content on a new URL. A 301 will hurt you.
If you want to try to find the person sending you the links, use removeem.com.
I hope I was of help,
Thomas