Recovering an old disavow file?
-
Hi guys,
We had an SEO agency submit a disavow request on one of our sites a while back. They have no trace of the disavow .txt file or of the links they disavowed.
Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find out which links were disavowed?
Cheers.
-
Have you performed any more disavow submissions since? If you have not, simply log in to your Search Console and head to the disavow section:
https://www.google.com/webmasters/tools/disavow-links
If you have not added any more, simply click the profile you are working with, and it should open a box with a link to your most recently added file. See the attached image, which shows the pop-up. You can then download the text file that you/they added.
Hope that helps.
If it is an older file, I would suggest talking to Google about seeing previous versions of the .txt file, that is, if they hold onto them.
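For anyone who ends up rebuilding a lost file by hand, this is the plain-text format the disavow tool expects; the URLs and domain below are placeholders, not real entries:

# Lines starting with "#" are comments and are ignored by Google
# Disavow a single page:
http://spam.example.com/bad-link-page.html
# Disavow an entire domain:
domain:spammydomain.example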
-
Related Questions
-
If my website does not have a robots.txt file, does it hurt my website's ranking?
After a site audit, I found out that my website doesn't have a robots.txt file. Does it hurt my website's rankings? One more thing: when I type mywebsite.com/robots.txt, it automatically redirects to the homepage. Please help!
Intermediate & Advanced SEO | binhlai
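On the question above: a missing robots.txt simply means nothing is blocked from crawling, so by itself it doesn't hurt rankings. If one is wanted anyway, a minimal sketch that allows all crawling looks like this:

# Minimal robots.txt: applies to all crawlers and blocks nothing
User-agent: *
Disallow:

-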
How to 301 redirect old WordPress categories?
Hi all, in order to avoid duplication errors we've decided to redirect some old categories (merging a few of them). In the past we have been very generous with the number of categories we assigned each post. One category needs to be redirected back to the blog home (removed completely), while a couple of others should be merged. Afterwards we will re-categorize some of the old posts. What is the proper way to do so? We are not technical; is there a plugin that can assist? Thanks!
Intermediate & Advanced SEO | BeytzNet
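If a plugin is not an option, the same redirects can be written by hand in the site's .htaccess. A minimal sketch, assuming Apache; the category slugs and the /blog/ home path are placeholders, not real URLs from the question:

# Merge one category into another
RedirectMatch 301 ^/category/old-category/?$ /category/merged-category/
# Send a removed category back to the blog home
RedirectMatch 301 ^/category/removed-category/ /blog/

A redirection plugin does the same thing from the WordPress admin, which is usually the safer route for a non-technical team.

-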
Using regex to 301 old categories and query strings in Magento
Hi SEOmoz community! I'm hoping somebody with a little Magento and regex knowledge will be able to help me out here. I need to 301 some old categories along with their old query strings. Below is an example.
Old URL: /bed-linen/pillowcases-html.html
Users can then filter by price or range, which creates a query string such as:
/bed-linen/pillowcases-html.html?price=1%2C10
New URL: /bed-linen/pillowcases.html
So the new query string will be:
/bed-linen/pillowcases.html?price=1%2C10
Does anybody know the regex to 301 this? Can this be done in the Magento rewrite module, or by .htaccess only? Thanks in advance 🙂 Anthony @Anthony_Mac85
Intermediate & Advanced SEO | Tone_Agency
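A sketch of the .htaccess route with mod_rewrite, using the exact URL pair from the question; mod_rewrite passes the original query string through unchanged by default, so no extra flag is needed for ?price=1%2C10 to survive the redirect:

RewriteEngine On
# 301 the old "-html.html" URL to the new ".html" URL; the query string carries over automatically
RewriteRule ^bed-linen/pillowcases-html\.html$ /bed-linen/pillowcases.html [R=301,L]

Magento's built-in rewrite management generally matches exact paths rather than regex patterns, so .htaccess is the usual place for pattern-based rules like this.

-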
Duplicate Content from Indexing of a Non-File-Extension Page
Google somehow has indexed a page of mine without the .html extension, so they indexed www.samplepage.com/page, and I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google, or Bing, or whoever, to only index and see the page including the .html extension? I know people are saying not to use the file extension on pages, but I want to, so please, anybody... HELP!
Intermediate & Advanced SEO | WebbyNabler
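One way to consolidate the two versions is to 301 the extensionless URL to its .html counterpart. A minimal .htaccess sketch, assuming Apache and that each extensionless path has a matching .html file on disk:

RewriteEngine On
# Leave real directories alone
RewriteCond %{REQUEST_FILENAME} !-d
# Only redirect when a matching .html file actually exists
RewriteCond %{REQUEST_FILENAME}.html -f
# 301 /page to /page.html
RewriteRule ^(.+)$ /$1.html [R=301,L]

A canonical tag pointing at the .html version would also tell Google which one to keep.

-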
If I disallow an unfriendly URL via robots.txt, will its friendly counterpart still be indexed?
Our not-so-lovely CMS loves to render pages regardless of the URL structure, just as long as the page name itself is correct. For example, it will render the following as the same page:
example.com/123.html
example.com/dumb/123.html
example.com/really/dumb/duplicative/URL/123.html
To help combat this, we are creating mod_rewrites with friendly URLs, so all of the above would simply render as example.com/123. I understand robots.txt respects the wildcard (*), so I was considering adding this to our robots.txt:
Disallow: */123.html
If I move forward, will this block all of the potential permutations of the directories preceding 123.html, yet not block our friendly example.com/123? Oh, and yes, we do use the canonical tag religiously; we're just mucking with the robots.txt as an added safety net.
Intermediate & Advanced SEO | mrwestern
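Under Google's documented wildcard matching, a sketch of how such a rule behaves (123.html is the page name from the question):

User-agent: *
# "*" matches any sequence of characters, so this blocks
# /123.html, /dumb/123.html, and /really/dumb/duplicative/URL/123.html.
# /123 contains no "123.html", so the friendly URL stays crawlable.
Disallow: /*123.html

Note that rules conventionally start with "/" or "*", and appending "$" (Disallow: /*123.html$) anchors the match to URLs that end in 123.html, which guards against blocking longer URLs that merely contain that string.

-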
Does this make sense as a way to recover from Panda?
Hello guys, our website was pandalized on 9/27/2012 and we haven't been able to recover since then. I've fixed as much as possible when it comes to poor content, and we have been getting high-quality links consistently for the past 3-4 months. Our blog had some duplicate-content issues due to categories, tags, feeds, etc. I solved those problems before the past two refreshes, without success. I'm considering moving the blog to a subdomain and letting it grow on its own; more than PR, I'm interested in recovering from Panda. What do you think about that?
Intermediate & Advanced SEO | DaveMri
-
Old pages still crawled by search engines returning 404s: better to 301 them or block them with robots.txt?
Hello guys, a client of ours has thousands of pages returning 404s, visible in Google Webmaster Tools. These are all old pages which don't exist anymore, but Google keeps on detecting them. They belong to sections of the site which no longer exist, are not linked externally, and didn't provide much value even when they existed. What do you suggest we do: (a) nothing, (b) redirect all these URLs/folders to the homepage through a 301, or (c) block these pages through robots.txt? Are we inappropriately using part of the crawl budget set by search engines by not doing anything? Thanks
Intermediate & Advanced SEO | H-FARM
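If option (b) were the choice, one rule per removed section keeps it manageable. A sketch in .htaccess, where "/old-section/" is a hypothetical folder name standing in for the removed areas of the site:

# 301 everything under a removed section to the homepage
RedirectMatch 301 ^/old-section/ /

-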
Page load increases with video file: SEO effects
We're trying to use a Flash video as a product image, so the size increase will be significant: somewhere around 1.5-2 MB on a page that is about 400 KB before the video. There is an SEO concern with page speed, and we're thinking perhaps having the Flash video inside an iframe might overcome the speed issues. We're trying to provide a better experience with the video, but the increase in page size, and therefore load time, will be significant. The rest of the page will load, including a fallback static image, so we're really trying to understand how to mitigate the page-speed impact of the video. Any thoughts?
Intermediate & Advanced SEO | SEO-Team
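The iframe idea from the question would look something like this; a minimal sketch, where /video/product-video.html is a hypothetical standalone page containing only the Flash embed, so the heavy asset loads in its own document rather than blocking the product page:

<!-- Isolate the ~2 MB Flash asset in its own document -->
<iframe src="/video/product-video.html" width="640" height="360" frameborder="0" title="Product video"></iframe>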