Cutting off the bad link juice
-
Hello,
I have noticed that there are plenty of old, low-quality links pointing to many of our landing pages. I would like to cut them off and start again. Would it be OK to do the following?
1. Create new URLs (the domain is quite strong, and new pages rank well, better than the affected old landing pages) and move the old content there.
2. 302 redirect the old landing pages to the new ones.
3. Put a "noindex" tag on the old URLs (maybe even "noindex, nofollow"?), or wouldn't that work?
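To be concrete, step 2 would be something like this (assuming an Apache server; the paths here are placeholders):

```apache
# .htaccess sketch: 302 (temporary) redirect from the old landing page
# to the new one. Both paths are hypothetical placeholders.
Redirect 302 /old-landing-page/ https://www.example.com/new-landing-page/
```

One thing I'm unsure about for step 3: once a URL answers with a 302, crawlers never fetch that page's HTML, so a noindex meta tag placed on it would not be seen. The tag and the redirect seem to work against each other.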
Thanks in advance
-
Hello all,
Thank you for your answers,
Oleg, I am not that keen on a meta refresh, as it makes for a poor user experience. Apparently the delay needs to be about 10 seconds, since Google may treat a shorter one like a 301. I wonder what the shortest delay is that would still drop the link juice without disturbing my visitors.
Gagan, regarding 301 redirecting the bad page to a 404 page: wouldn't it be easier just to return a 404 directly, without the redirect?
Mike, what do you think is the best way to keep the traffic but cut off the bad links to specific landing pages?
I will be testing a 302 soon from the old URL to the new one. I wonder whether I should ALSO put a 404 on the old one... or maybe a noindex... or does it not matter? What are your thoughts?
-
Does it seem okay to have a page that is targeted by spam links 301 redirect to a 404 error page?
It is a CMS where many other pages also link through the component's other subcategories, so the option for cutting off the bad page (the one hurt by low-quality links) is a 301 redirect that lands on a 404 error page. Will that diminish, or completely remove, the value of all the spam links pointing to it, so that they no longer affect the site at all?
-
Upon further research, you are correct. A noindexed page is still crawled; it just isn't shown in the SERPs. So any links will still be followed and the page is still a part of the website. With this in mind, I think you should 404 the page and redirect via meta refresh after some time. Reach out to the webmasters behind the good links and ask them to point to the new URL.
I still don't think a 302 is the way to go in this scenario. Ideally, you'd experiment with different options and see which produces the best results.
-
Personally I would go with Oleg's original suggestion: "If your rankings are being hurt by these links, I would move them to a new URL and 404 the old page. I would then go through the link profile for the old URLs. Find all the high quality links and contact the webmasters asking to change it to the new URLs."
-
Sure, but Oleg said: "If you noindex the page, G won't be able to access it and it will lose all its authority".
If the page loses all its authority, will it still pass negative value on to the domain or to other pages because of the low-authority or spam backlinks pointing to it?
If so, then maybe cutting the page off from the site by making it a 404 is the better way!
-
Noindex won't cut the links. It will just remove the page from the SERPs. So you'll still be hit by the bad links to your site, and the page's organic traffic will be cut off.
-
Sure, thanks
Does that mean that if we noindex it, we can safely presume all the low-quality links pointing to that URL will be nullified and will not have any negative effect on the site? In other words, there would be no need to make the page a 404, and we could still use it as a regular part of the site, e.g. for filling in forms.
Many thanks once again for your detailed reply.
-
So his goal is to have users redirected to the new page without having Google pass the link authority to the new URL.
If you noindex the page, G won't be able to access it and it will lose all its authority. But any user who visits the page will still be redirected to the new URL. There is no such thing as a "404 redirect".
A meta refresh is another way to redirect users to a new page without passing authority. As long as the delay is greater than 0 (a meta refresh with time=0 is treated much like a 301), it shouldn't pass authority. So same deal: noindex the page and set up a redirect for users, not bots.
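To make that concrete, the noindex-plus-meta-refresh combination sits in the old page's head, something like this (the 10-second delay and target URL are placeholders to adjust):

```html
<head>
  <!-- keep the page out of the index so it sheds its authority -->
  <meta name="robots" content="noindex">
  <!-- send human visitors on after 10 seconds; a delay of 0 risks
       being treated like a 301 and passing authority -->
  <meta http-equiv="refresh" content="10; url=https://www.example.com/new-landing-page/">
</head>
```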
-
Hello Oleg,
I am also interested in knowing more about this.
Is marking that page "noindex, follow" or "noindex, nofollow" a better approach than a 404 redirect?
Also, I didn't get what you meant by a meta refresh redirect. How would that work?
-
A 302 is by definition a "temporary redirect", which is not applicable here. According to this 302 experiment, 302s did actually pass some authority down (which may or may not hurt you). I do see the UX advantage of having the old URL redirect to the new page, though.
Another alternative is to block the page via robots.txt and set up a redirect, or to noindex the page and set a timed meta refresh redirect to the new page.
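For the robots route, the fragment in robots.txt would be as simple as this (the path is a hypothetical placeholder):

```
User-agent: *
Disallow: /old-landing-page/
```

Worth noting: a Disallow blocks crawling, not indexing, so Google can keep the URL in its index based on the links alone. It behaves quite differently from a noindex tag.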
-
Thank you Oleg,
I have checked, and a few .gov.uk links going to some of those pages generate some traffic, so I am not sure a 404 on them is suitable in this situation.
On the other hand, why is a 404 better than a 302? They both stop link juice from passing, but the 302 keeps the traffic.
-
If your rankings are being hurt by these links, I would move them to a new URL and 404 the old page. I would then go through the link profile for the old URLs. Find all the high quality links and contact the webmasters asking to change it to the new URLs.