Cutting off the bad link juice
-
Hello,
I have noticed that there are plenty of old, low-quality links pointing to many of the landing pages. I would like to cut them off and start again. Would it be OK to do the following?
1. create new URLs (the domain is quite strong, and new pages are ranking well, better than the affected old landing pages) and add the old content there
2. 302 redirect old landing pages to the new ones
3. put a "noindex" tag on the old URLs (maybe even "noindex, nofollow"?), or wouldn't that work?
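For what it's worth, steps 2 and 3 could be sketched in an Apache .htaccess file like this (the paths and domain are made up, and this assumes mod_alias and mod_headers are available):

```apache
# Step 2 (sketch): 302 = temporary redirect, which should not pass
# authority the way a 301 does. Paths and domain are hypothetical.
Redirect 302 "/old-landing-page" "https://www.example.com/new-landing-page"

# Step 3 (sketch): noindex without editing the page's HTML, via the
# X-Robots-Tag response header. Note that while the 302 above is in
# place, Google follows the redirect rather than indexing the old URL's
# content, so the noindex mainly matters if the redirect is removed later.
<Files "old-landing-page.html">
    Header set X-Robots-Tag "noindex, nofollow"
</Files>
```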
Thanks in advance
-
Hello all,
Thank you for your answers,
Oleg, I am not that keen on meta refresh, as it is a poor user experience. Apparently it needs to be about 10 seconds, as Google may treat a shorter delay as a 301. I wonder what the shortest time is that would still lose the link juice but wouldn't disturb my visitors.
Gagan, in regards to 301 redirecting the bad page to a 404 page: isn't it easier just to make it a 404 without the redirect?
Mike, what do you think is the best solution to keep the traffic but cut off the bad links to specific landing pages?
I will be testing a 302 soon from the old URL to the new one. I wonder if I should ALSO put a 404 on the old one... or maybe noindex... or doesn't it matter? What are your thoughts?
-
Does it seem okay to make a site page (linked to by spam links) 301 redirect to a 404 error page?
It's a CMS, so many other pages are linked through other subcategories of the component too; hence the option for cutting off the bad page, which is hurt by low-quality links, is a 301 redirect landing on a 404 error page. Will it diminish, or rather completely remove, the value of all the spam links pointing to it, so that they no longer affect the site at all?
-
Upon further research, you are correct. A noindexed page is still crawled, just not shown in the SERPs. So any links will still be followed and the page is still a part of the website. With this in mind, I think you should 404 the page and redirect via meta refresh after some time. Reach out to the webmasters of the good links and ask them to change their links to the new URL.
I still don't think a 302 is the way to go in this scenario. Ideally, you'd experiment with different options and see which produces the best results.
-
Personally I would go with Oleg's original suggestion: "If your rankings are being hurt by these links, I would move them to a new URL and 404 the old page. I would then go through the link profile for the old URLs. Find all the high quality links and contact the webmasters asking to change it to the new URLs."
-
Sure, but Oleg said, "If you noindex the page, G won't be able to access it and it will lose all its authority".
If the page loses all its authority, will it still pass on negative value to the domain or to other pages because of the low-authority or spam backlinks pointing to it?
If so, then maybe cutting the page off from the site by making it a 404 is the better way!
-
NoIndex won't cut the links. It will just remove the page from the SERPs. So you'll still be hit with the bad links to your site, and the page's organic traffic will be cut off.
-
Sure, thanks
Does that mean that if we noindex it, we can safely presume all the low-quality links pointing to that URL will be nullified and will not have any negative effect on the site? I mean, there won't be any need to make the page a 404 if we still use that page as a regular part of the site, like for filling in forms etc.
Many thanks, once again for your detailed reply
-
So his goal is to have users redirected to the new page without having Google pass the link authority to the new URL.
If you noindex the page, G won't be able to access it and it will lose all its authority. But any user that visits the page will still be redirected to the new url. There is no such thing as a 404 redirect.
Meta refresh is another way to redirect users to a new page without passing authority. As long as the time is greater than 0 (a meta refresh with time=0 is treated much like a 301), it shouldn't pass authority. So same deal: noindex the page and set up a redirect for users, not bots.
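What Oleg describes, noindex for bots plus a timed meta refresh for users, can be sketched in the old page's HTML head like this (the target URL and the 5-second delay are just placeholder values):

```html
<head>
  <!-- Keep the old page out of the index -->
  <meta name="robots" content="noindex">
  <!-- Forward human visitors after a delay. Format is "seconds; url=target".
       A delay of 0 is treated much like a 301, so keep it greater than 0
       if the goal is NOT to pass authority. -->
  <meta http-equiv="refresh" content="5; url=https://www.example.com/new-landing-page">
</head>
```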
-
Hello Oleg,
I am also interested in knowing more about this.
Is marking that page noindex, follow or noindex, nofollow a better way than a 404 redirect?
Also, I didn't follow you on the meta refresh redirect. What does that mean exactly?
-
A 302 is by definition a "temporary redirect", which is not applicable here. According to this 302 experiment, 302s did actually pass some authority down (which may or may not hurt you). I do see the UX advantage of having the old URL redirect to the new page, though.
Another alternative is to block the page via robots.txt and set up a redirect, or to noindex the page and set a timed meta refresh redirect to the new page.
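The robots.txt variant mentioned above might look like this (the path is a placeholder); note that blocking crawling is not the same as noindex, since a robots-blocked URL can still appear in the index based on links alone:

```
# robots.txt at the site root: stop crawlers fetching the old page,
# while a redirect (server-side or meta refresh) still works for users.
User-agent: *
Disallow: /old-landing-page
```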
-
Thank you Oleg,
I have checked, and a few .gov.uk links going to some of those pages generate some traffic, so I'm not sure a 404 on them is suitable in this situation.
On the other hand, why is a 404 better than a 302? They both stop link juice from passing, but a 302 also passes the traffic.
-
If your rankings are being hurt by these links, I would move them to a new URL and 404 the old page. I would then go through the link profile for the old URLs. Find all the high quality links and contact the webmasters asking to change it to the new URLs.
-
Related Questions
-
Is this campaign of spammy links to non-existent pages damaging my site?
My site is built in WordPress. Somebody has built spammy pharma links to hundreds of non-existent pages. I don't know whether this was inspired by malice or an attempt to inject spammy content. Many of the non-existent pages have the suffix .pptx; these now all return 403s. Example: https://www.101holidays.co.uk/tazalis-10mg.pptx A smaller number of spammy links point to regular non-existent URLs (not ending in .pptx); these are given 302s by WordPress to my homepage. I've disavowed all domains linking to these URLs. I have not had a manual action or seen a dramatic fall in Google rankings or traffic. The campaign of spammy links appears to be historical and not ongoing. Questions:
1. Do you think these links could be damaging search performance? If so, what can be done? Disavowing each linking domain would be a huge task.
2. Is 403 the best response? Would 404 be better?
3. Any other thoughts or suggestions?
Thank you for taking the time to read and consider this question. Mark
White Hat / Black Hat SEO | MarkHodson
-
Good vs Bad Web directories
Hi, in this blog post Rand mentions a list of bad web directories. I asked a couple of years ago whether there is an updated list, as some of these (Alive Directory, for example) do not seem to be blacklisted anymore and are coming up in Google searches etc. It seems that, due to the age of the blog post (7 years), the comments are not being responded to. Would anyone be able to advise which of these directories are good to use? https://moz.com/blog/what-makes-a-good-web-directory-and-why-google-penalized-dozens-of-bad-ones
White Hat / Black Hat SEO | IsaCleanse
-
Dealing with links to your domain that the previous owner set up
Hey everyone, I rebranded my company at the end of last year from a name that was fairly unique but sounded like I cleaned headstones instead of building websites. I opted for a name that I liked and that reflected my heritage; however, it also seems to be quite common. Anyway, I registered the domain name as it was available, the previous owner's company having been wound up. It's only in the last week or two that I've managed to get a website on that domain, and I've been tracking its progress through Moz, Google & Bing Webmaster Tools. Both webmaster tools are reporting that my site triggers 404 errors for some specific links. However, I don't have, and have never used, those links. I think the previous owner might have created them before he went bust. My question is in two parts: first, how do I find out which websites are linking to me with these broken URLs, and second, will these 404ing links affect my SEO? Thanks!
White Hat / Black Hat SEO | mickburkesnr
-
Do dead/inactive links matter?
In cleaning up the backlink profile for my parent's website, I've come across quite a few dead links. For instance, the links in the comments here: http://www.islanddefjam.com/artist/news_single.aspx?nid=4726&artistID=7290 Do I need to worry about these links? I assume that if the links are no longer active, and hence not showing up in webmaster or Moz reports, I can probably ignore them, but I'm wondering if I should try and get them removed regardless. I've read that Google is increasingly taking into account references (i.e. website mentions that are not links), and I don't know if inactive spam links might leave a bad impression of a website. Am I being overly paranoid? I imagine disavowing them would be pointless, as you can't attach a nofollow tag to an inactive link.
White Hat / Black Hat SEO | mgane
-
Inbound Links Inquiry for a New Site
For a site that is only one to two months old, what is considered a natural amount of inbound links if your site offers very valuable information and you have done a marketing push to get the word out about your blog? Even if you are receiving backlinks from authority websites with high DA, does Google get suspicious if there are too many inbound links during the first few months of a site's existence? I know there are some sites that blow up very fast and receive thousands of backlinks very quickly, so I'm curious to know if Google puts these kinds of sites on a watchlist or something of that nature. Or is this simply a good problem to have?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
What do you say in your emails to horrible sites to remove your links?
Morning guys, I have the unenviable task of rectifying poor link building (a previous company's work, not mine), which inevitably means emailing tons and tons of horrible directories with links to the client from as far back as 5/6 years ago. I'm sure many of you are in the same boat, so it begs the question: what have you said to these types of sites that is effective in getting them to remove the links? This could even be a two- or three-parter: if you've had little joy in requesting removals, have you disavowed the links, and what (if any) effect did it have? Thanks, M.
White Hat / Black Hat SEO | Martin_S
-
Link worth?
These are not my links, but does anyone know what the value of one link (bio or body) from something like these is:
http://designwebkit.com/web-and-trends/how-many-fonts-designer-really-need/
www.thebuildingblox.com/termite-turmoil-how-to-identify-and-remedy-the-problem/
http://creativeoverflow.net/the-10-best-alternatives-to-dropbox/
in comparison with links from these:
www.01fangchan.com
www.1.inerdentos.ru
www.1000empregos.com
www.1stdirectory.co.uk
www.2halsi.com
www.3dir.co.uk
www.514friends.com
www.57billion.com
We disavowed around 1000 links of the above (crap) quality and need to rebuild decent-quality links, and I would just like a guess at how many links of the first kind would need to be built to compensate for the loss. http://designwebkit.com/web-and-trends/how-many-fonts-designer-really-need/ vs www.01fangchan.com: would one need to replace 1000?
White Hat / Black Hat SEO | BobAnderson
-
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation:
1. Would this work? Google would still see the link on the offending domain, but by blocking that domain, are you preventing any strength or penalty associated with it from impacting your site?
2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? It would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time, and allow us to focus on what Google wants in creating high-quality sites. Thoughts?
White Hat / Black Hat SEO | highlyrelevant
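For reference, the .htaccess technique described in the question above is usually written with mod_rewrite conditions on the Referer header, something like this (the domains are hypothetical). One caveat: Googlebot does not send a Referer header when it crawls, so this blocks human visitors who click the links but does not stop the links themselves from being seen and counted:

```apache
# Deny visitors arriving from specific referring domains (hypothetical names).
RewriteEngine On
RewriteCond %{HTTP_REFERER} spammy-directory\.example [NC,OR]
RewriteCond %{HTTP_REFERER} bad-links\.example [NC]
RewriteRule .* - [F]
```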