Do I need to undo a 301 redirect to disavow links from the source domain?
-
A client came to me after being hit by Penguin and had already performed a 301 redirect from Site A to Site B. Site B was subsequently hit by the penalty a number of weeks later, and we are planning on performing link removal for Site A.
Only the Webmaster Tools account for Site B exists; none is available for Site A. I assume that I cannot disavow links to Site A from Site B's Webmaster Tools account (even though Site A's links show up in the GWT account).
So do I need to undo the 301 and then create a new GWT account for Site A in order to disavow the links pointing to Site A, or can I submit from Site B's GWT account since they are 301'd to Site B?
Thanks!
Chris
-
Yes, that should work.
-
Thinking about undoing the 301, claiming a new GWT account for Site A, submitting the disavow, then reapplying the 301 back to the new site to avoid disrupting rankings. Any thoughts? I haven't heard of this being done this way before.
Thanks for your help, hard to find resources on things this specific.
-
Yes, that's fine. The disavow won't take effect until the next time the Penguin penalty factors are recalculated, which seems to happen every 2 to 4 months.
-
Thanks!
Is it safe to go through the link removal process first, then remove the 301 and disavow? I would prefer to wait, as undoing the 301 will most likely cause a drop for other pages on Site B not affected by the penalty. So we would make all the link removal attempts, remove the 301, claim the new GWT account, and then lastly disavow.
I'm hearing reports of anywhere from a couple of weeks to a couple of months for a disavow to be processed once submitted; any personal experience with this?
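For context, undoing the 301 usually means removing (or commenting out) a domain-level redirect rule on Site A's server. On Apache it would typically look something like this; this is a hypothetical sketch, and `site-a.com` / `site-b.com` stand in for the real domains:

```apache
# Hypothetical domain-level 301 from Site A to Site B in Site A's .htaccess.
# "Undoing the 301" means removing or commenting out these lines so that
# Site A resolves on its own again and can be verified in GWT.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site-a\.com$ [NC]
RewriteRule ^(.*)$ http://www.site-b.com/$1 [R=301,L]
```

Since 301s are cached by browsers and remembered by crawlers, expect some lag before Google re-crawls Site A directly.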
-
That's correct, Chris. When you go through the disavow process, you'll be prompted to select the domain to disavow links for from the list of domains managed by the Google account you're currently logged into GWT with.
Undo the 301; claim Site A in GWT by verifying it with the meta tag, Google Analytics, or the small HTML verification file you upload to the site root. Then you can do a disavow upload for Site A.
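Once Site A is verified, the upload itself is just a plain UTF-8 text file in Google's disavow format: one `domain:` entry or full URL per line, with `#` marking comments. A minimal example (all domains here are placeholders, not real link sources):

```text
# Disavow file for site-a.com
# Lines beginning with # are comments and are ignored.

# Disavow every link from an entire domain:
domain:spammy-directory.example
domain:link-network.example

# Or disavow an individual page's links:
http://blog.example/post-with-paid-link.html
```

Domain-level entries are generally safer for Penguin cleanups than listing individual URLs, since they catch links you haven't discovered yet on the same domain.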