Can I disavow links on a 301'd website?
-
So we are performing link removal for a client on his old website (A), which is being 301 redirected to his new website (B). We have identified toxic links pointing to site A and are removing them; once that is complete, we will undo the current 301, verify a new GWT account for website A, and then submit the disavow report.
We would then like to reapply the 301 redirect to site B while we wait for Google to process the disavow report, the logic being that we can retain some of site B's current rankings while the disavow is processed on site A.
Has anyone had experience with this method? I foresee some potential issues here but am interested to hear from others on this. Thanks!
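For reference, a domain-level 301 of the kind being toggled on and off here is often implemented with a rule like the one below. This is only a sketch, assuming an Apache .htaccess setup (the thread never says what server is in use), and site-a.example / site-b.example are placeholder domains:

```apache
# Hypothetical site A .htaccess: the "current 301" that would be removed
# while the disavow is filed, and then reinstated afterwards.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site-a\.example$ [NC]
RewriteRule ^(.*)$ http://www.site-b.example/$1 [R=301,L]
```

Undoing the redirect would then just be a matter of commenting out or deleting that rule so site A can be re-verified in Webmaster Tools.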
-
I tend to agree with Federico's concerns. If the 301 transfers a penalty, the impact could be long-term, and it could be harder to rescue site B. The short-term ranking gains may not be worth it.
Google hasn't been clear on how the disavow interacts with 301 redirects. John's suggestion to disavow on both sites seems safe. Worst case, it's wasted effort, but it's not much effort (once you've built one file, building two is easy). Still, you've got to wait for that to process, and if the algorithmic penalty is something like Penguin, then you'd have to wait for a data refresh. This could take months, so I'd be really hesitant to risk site B until you've cleaned up the mess.
Once you disavow to site A, the 301-redirect should be fairly safe, but it does depend on the extent of the penalty. The risk/reward trade-off is definitely a "devil is in the details" sort of situation.
-
Well, you are right, manual penalties are easier to fix, although sites with manual penalties usually fall under an algorithmic penalty too.
Steps I'd suggest:
- Don't reinstate the redirect.
- Do some cleaning, extensive cleaning.
- Use site A purely as a redirect for users, not for crawlers or rankings (disallow site A in robots.txt and 302 redirect the domain to site B); see the sketch below.
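A minimal sketch of that last step, assuming an Apache .htaccess on site A (the server software is an assumption, and site-b.example is a placeholder domain):

```apache
# Site A .htaccess: temporary (302) redirect so visitors still reach site B,
# while the robots.txt below keeps crawlers from following site A's pages.
RewriteEngine On
# Leave robots.txt reachable so the Disallow rule can actually be read.
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteRule ^(.*)$ http://www.site-b.example/$1 [R=302,L]
```

```text
# Site A robots.txt: block crawling of the whole domain
User-agent: *
Disallow: /
```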
Hope that helps!
-
No, this is an algorithmic penalty. Wish it were manual; it would be easier to figure out.
-
But did you get any MANUAL penalty on A or B?
-
The problem is that despite the algorithmic penalty, site A appears to be pushing heavy authority to site B and keeping decent rankings for some very competitive terms that we otherwise would not rank for with site B. If I remove the 301, I fully expect all current rankings to drop, which is what I am trying to avoid.
We're doing link removal now, but plan on having to use the disavow tool once we have a few removal requests out to webmasters. I actually got an answer on this from John Mueller at Google in the technical SEO community on G+.
John Mueller
"I would think about the final state you want to be in and just do that. If you want to do a domain move, then 301 and keep them. If you do a domain move + disavow links, then submit the file for both domains. This process will take quite some time (maybe even a year), so you don't want to play with it incrementally: just find out what you want in the end and set that up." -
Hey Chris,
Did site A or B receive a manual penalty?
Any penalty on A, which is 301'd to B, will ultimately pass to B, so I would suggest removing the 301 ASAP. Then clean up the A domain until it is clean (if it's a manual action, until it's revoked), and then you can think about putting the 301 back.
Removing a manual penalty can be a long process; it took us 1 year and 4 reconsideration requests to get the penalty revoked. We had to use the disavow file as a machete and disavowed almost our entire link profile, leaving aside only the domains we knew carried good links; everything else was disavowed using the "domain:" syntax to avoid missing any link.
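For anyone unfamiliar with the format: the disavow file is a plain text file uploaded through the disavow tool, with one entry per line and "#" for comments. A minimal sketch with made-up domains (none of these are from the thread):

```text
# Hypothetical disavow.txt -- all domains below are placeholders
# Disavow everything from these domains (the "domain:" form mentioned above)
domain:spammy-directory.example
domain:paid-links.example
# Disavow one specific URL only
http://blog.example/low-quality-guest-post.html
```

Per John's comment above, the same file would be submitted for both domain A and domain B if you go the move-plus-disavow route.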
Related Questions
-
Hacked website - Dealing with 301 redirects and a large .htaccess file
One of my client's websites was recently hacked and I've been dealing with the after-effects. The website is now clean of malware and I have already appealed to Google about the malware issue. The current issue is dealing with the 20,000+ crawl errors, which are garbage links that were created by the hack. How does one go about dealing with all the 301 redirects I need to create for all the 404 crawl errors? I'm already noticing an increased load time on the website due to having a rather large .htaccess file with a couple thousand 301 redirects done already, which I fear will mean my client's website performance and SEO performance take a hit as well.
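As an aside on the .htaccess size problem: if the hacked URLs share a recognizable pattern, thousands of one-off lines can often be collapsed into a single pattern rule. This is only a sketch; the /cheap-pills/ path is invented for illustration and example.com stands in for the real domain:

```apache
# Hypothetical consolidation: one pattern rule instead of thousands of
# individual 301 lines, assuming the spam URLs share a common prefix.
RedirectMatch 301 ^/cheap-pills/.* https://www.example.com/
```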
-
Can a large fluctuation of links cause traffic loss?
I've been asked to look at a site that has lost 70-80% of its search traffic. This happened suddenly around the 17th April. Traffic dropped off over a couple of days and then flat-lined over the next couple of weeks. The screenshot attached shows the impressions/clicks reported in GWT. When I investigated I found:
- There had been no changes/updates to the site in question.
- There were no messages in GWT indicating a manual penalty.
- The number of pages indexed shows no significant change.
- There are no particular trends in keywords/queries affected (they all were).
I did discover that ahrefs.com showed a large number of links reported lost on the 17th April (17k links from 1 domain). These links reappeared around the 26th/27th April, but traffic shows no sign of any recovery. The links in question were from a single development server (that shouldn't have been indexed in the first place, but that's another matter). Is it possible that these links were, maybe artificially, boosting the authority of the affected site? Has the sudden fluctuation in such a large number of links caused the site to trip an algorithmic penalty (Penguin)? Without going into too much detail, as I'm bound by client confidentiality: the affected site is really a large database, and the links pointing to it are generated by a half dozen or so article-based sister sites, based on how the articles are tagged. The links point to dynamically generated content based on the URL. The site does provide a useful/valuable service/purpose - it's not trying to "game the system" in order to rank. That doesn't mean to say that it hasn't been performing better in search than it should have been. This means that the affected site has ~900,000 links pointing to it that are the names of different "entities". Any thoughts/insights would be appreciated. I've expressed a pessimistic outlook to the client, but as you can imagine they are confused and concerned.
-
Can an incorrect 301 redirect or .htaccess code cause 500 errors?
Google Webmaster Tools is showing the following message: "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added to .htaccess incorrectly? Here is the 301 redirect code I am using in .htaccess:

    RewriteEngine On
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index.html|default.asp)\ HTTP/
    RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
    RewriteCond %{HTTP_HOST} !^(www.example.com)?$ [NC]
    RewriteRule (.*) http://www.example.com/$1 [R=301,L]

Could adding the following code after that in the .htaccess potentially cause any issues?

    # BEGIN EXPIRES
    <IfModule mod_expires.c>
    ExpiresActive On
    ExpiresDefault "access plus 10 days"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType text/plain "access plus 1 month"
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType application/x-javascript "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 week"
    ExpiresByType application/x-icon "access plus 1 year"
    </IfModule>
    # END EXPIRES

(Edit) I'd like to add that there is a WordPress blog on the site too at www.example.com/blog, with the following code in its .htaccess:

    # BEGIN WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /blog/
    RewriteRule ^index.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /blog/index.php [L]
    </IfModule>
    # END WordPress

Thanks
-
How to find affiliate sites linking to a competitor website?
Hello here, I am trying to understand the best way to find sites that are affiliates of a competitor, through link research. Typically our competitor's affiliates link to our competitor's website via either of the following link formats:
http://www.musicnotes.com/sheetmusic/ard.asp?SID=[aff_id]&LID=[link_id]
http://click.linksynergy.com/link?id=[aff+id]&offerid=[off_id]&type=2&murl=http%3A%2F%2Fwww.musicnotes.com%2Fsheetmusic%2Fmtd.asp%3Fppn%3D[item_id]
The first link looks much easier to find, so I have tried to find the first kind of links with Google by using the "link:" operator as follows: link:http://www.musicnotes.com/sheetmusic/ard.asp Or, similarly, by using Open Site Explorer. But I always get 0 results! It is weird because I know there are thousands of affiliates out there with the same tracking code. How's that possible? Why does it look impossible to find the sites I am looking for? Would you suggest any different approach? Any ideas, suggestions and thoughts are very welcome! Thank you in advance. Fab.
-
Link Research Tools - Detox Links
Hi, I was doing a little research on my link profile and came across a tool called LinkResearchTools.com. I bought a subscription and tried it out. The report rated my overall risk as low but identified 78 "Very High Risk" to "Deadly" (are they venomous?) links, around 5% of the total, and advised removing them. It also flagged many links as suspicious or low risk, but these seem to default to a negative rating simply because the tool has no knowledge of them. So before I do anything rash and start removing my Deadly links, I was wondering if anyone a) has used the tool and would recommend it, b) would recommend removing the Deadly links, or c) can think of cases in which removing so-called Deadly links causes more problems than it solves, such as maintaining a normal-looking profile, since everyone is likely to have some bad links (although my thinking may be out on that one...). What do you think? Adam
-
DNS or 301 Website Redirect
We are running a marketplace site, so we have thousands of vendors selling their products on our site. Each vendor has a profile page, and we are soon to launch a premium storefront that is white label. Many of these vendors will want to point a custom URL to their premium storefront (which is a subdomain of the marketplace), and we are trying to understand how we should instruct them to point their URL in a way that gives the main marketplace site the SEO juice. We also want to understand what will show up in the address bar. Will it be their URL or our subdomain? Will any of the marketplace SEO juice boost their URL's local listing status?
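The two usual ways a vendor could point their domain differ in exactly the address-bar behaviour the question asks about. A rough sketch, with vendorshop.example and vendorname.marketplace.example as made-up names (the thread doesn't name the real domains):

```text
; Option 1: DNS CNAME added at the vendor's registrar.
; The vendor's own URL stays in the address bar; the marketplace
; must also be configured to serve this hostname.
www.vendorshop.example.   3600   IN   CNAME   vendorname.marketplace.example.
```

```apache
# Option 2: HTTP 301 redirect from the vendor's own hosting (Apache sketch).
# The address bar ends up showing the marketplace subdomain, and the
# vendor domain's link equity is consolidated onto that subdomain.
RewriteEngine On
RewriteRule ^(.*)$ https://vendorname.marketplace.example/$1 [R=301,L]
```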
-
Can I Improve Organic Ranking by Restricting Website Access to a Specific IP Address or Geo Location?
I am targeting my website at the US, so I need to get high organic rankings in US web search. One of my competitors is restricting website access to a specific IP address or geo location. I have checked multiple categories to learn more. What's going on with this restriction, and why did they put it in place? One SEO forum also restricts website access by location. I can understand that; it may help them stop thread spamming with unnecessary sign-ups or Q&A. But why has Lamps Plus set this up? Is there any specific reason? Can I improve my organic ranking? The restriction may help me maintain user statistics in terms of bounce rate, average page views per visit, etc...
-
Culling 99% of a website's pages. Will this cause irreparable damage?
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with dupe content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that. The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1,000 travel guides in total. My first question is: would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly? And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages; would you consider redirecting all of these to the generic hotels page of the site? Thanks for your time, I know this is quite a long one, Nick
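On the mechanics of that last point: if the culled hotel pages share a common URL path, the bulk redirect can be a single pattern rule rather than one line per page. A minimal Apache sketch; the /hotels/ path and the choice of target page are assumptions, not details from the question:

```apache
# Hypothetical bulk redirect for culled pages: send everything under
# /hotels/ (except the index itself) to the generic hotels page.
RedirectMatch 301 ^/hotels/.+$ /hotels/
```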