How to get a list of URLs blocked by robots.txt
-
This is my site.
It's in WordPress. I just want to know: is there any way I can get a list of the URLs blocked by robots.txt?
In Google Webmaster Tools it's not showing up; it just gives the number of blocked URLs.
Is there any plugin or software to extract the list of blocked URLs?
-
If you use Bing Webmaster Tools you can see a complete list of all URLs blocked by robots.txt. You can export the file and then filter it.
Just go to Reports & Data > Crawl Information within your Bing Webmaster account. I am not aware of this feature being in Google Webmaster Tools. Hope this helps.
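If you'd rather pull the list yourself, here's a rough sketch of another option (not a webmaster-tools feature, just a small Python script). It assumes your WordPress site publishes a standard sitemap at /sitemap.xml (the exact path depends on your SEO plugin), reads your live robots.txt with Python's built-in urllib.robotparser, and prints every sitemap URL the rules block.

import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "http://muslim-academy.com"  # your site root

# Load and parse the live robots.txt rules
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# Collect URLs from the sitemap (assumed to live at /sitemap.xml)
with urllib.request.urlopen(SITE + "/sitemap.xml") as resp:
    root = ET.fromstring(resp.read())
urls = [el.text.strip() for el in root.iter() if el.tag.endswith("loc") and el.text]

# A URL counts as "blocked" if a generic crawler (*) is not allowed to fetch it
blocked = [u for u in urls if not rp.can_fetch("*", u)]

print(f"{len(blocked)} of {len(urls)} sitemap URLs are blocked by robots.txt")
for u in blocked:
    print(u)

Bear in mind it only sees URLs that are actually listed in the sitemap; anything blocked but not in the sitemap won't show up.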
-
simon_realbuzz buddy, if I use /classifieds/ it means I am blocking every URL that starts with it. I want to get a list of all the blocked URLs on my site.
Example:
http://muslim-academy.com/classifieds/
How many URLs associated with this classifieds section are blocked by my robots.txt?
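Just to show what I mean about the prefix, here is a quick sketch with Python's urllib.robotparser (the ad URL in it is made up, purely for illustration); anything whose path starts with /classifieds/ comes back as blocked:

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://muslim-academy.com/robots.txt")
rp.read()

# The second URL is hypothetical, just to illustrate the prefix match
for url in [
    "http://muslim-academy.com/classifieds/",
    "http://muslim-academy.com/classifieds/ads/example-listing",
    "http://muslim-academy.com/forum/memberlist.php",
]:
    print(url, "->", "blocked" if not rp.can_fetch("*", url) else "allowed")

What I still can't see is how many real URLs fall under that prefix, which is why I'm after the full list.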
-
I'm sorry I don't follow. If you go to that URL you will see the list of blocked URLs as I've pasted below.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /forum/viewtopic.php?p=
Disallow: /forum/viewtopic.php?=&p=
Disallow: /forum/viewtopic.php?t=
Disallow: /forum/viewtopic.php?start=
Disallow: /forum/&view=previous
Disallow: /forum/&view=next
Disallow: /forum/&sid=
Disallow: /forum/&p=
Disallow: /forum/&sd=a
Disallow: /forum/&start=0
Disallow: /forum/memberlist.php
Disallow: /forum/posting.php
Disallow: /classifieds/
Disallow: /forum/index.php
Disallow: /forum/ucp
Disallow: /http://muslim-academy.com/الا�%A..
Disallow: /http://muslim-academy.com/особенн%D
Disallow: /http://muslim-academy.com/ислам-ка%
Disallow: /http://muslim-academy.com/classifieds/ads/
Disallow: /http://muslim-academy.com/значени%D..
Disallow: /.ifieds/
Disallow: /.ifieds/ads/
Disallow: /forum/alternatelogin/al_tw_connect.php?authentication=1
Disallow: /forum/search.php
-
simon_realbuzz, I need a list of the blocked URLs, not the path to the robots.txt file.
-
You can view your robots.txt file simply by appending /robots.txt to your site URL. Just open http://muslim-academy.com/robots.txt in your browser and you'll be able to view it.
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time setting false positives in my reports to ignore. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex and it reports these to me. Theoretically, I can block Rogerbot from these with a robots.txt directive and not have to deal with the false positives.
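Something along these lines is what I'm picturing; just a rough sketch, assuming Rogerbot matches on a user-agent token of rogerbot, and with a placeholder path standing in for whichever page types I'd exclude:

User-agent: rogerbot
# placeholder path - swap in the real sections to keep Rogerbot out of
Disallow: /example-noindexed-pages/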
Reporting & Analytics | awilliams_kingston
-
PDF best practices: to get them indexed or not? Do they pass SEO value to the site?
All PDFs have landing pages, and the pages are already indexed. If we allow the PDFs to get indexed, then they'd be downloadable directly from Google's results page and we would not get GA events. The PDFs' info would somewhat overlap with the landing pages' info. Also, if we ever need to move content, we'd then have to redirect the links to the PDFs. What are best practices in this area? To index or not? What do you / your clients do, and why? Would a PDF indexed by Google and downloaded directly via a link on the SERP pass SEO juice to the domain? What if it's on a subdomain, like when hosted by Pardot? (www1.example.com)
Reporting & Analytics | hlwebdev
-
Blocking Standard pages with Robots.txt (t&c's, shipping policy, pricing & privacy policies etc)
Hi, I've just had a best-practice site migration completed for my old e-commerce store into a Shopify environment, and I see in GSC that it's reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely due to my migrators or a default setting in Shopify, does anyone know? t&c's, shipping policy, pricing policy, privacy policy, etc. So in summary: shall I unblock these? What caused it, Shopify default settings or, more likely, my migration team? All the best, Dan
Reporting & Analytics | Dan-Lawrence
-
Will noindex pages still get link equity?
We think we get link equity from some large travel domains to white label versions of our main website. These pages are noindex because they're the same URLs and content as our main B2C website, and they have canonicals to the pages we want indexed. The question is, is there REALLY link equity to pages on our domain which have "noindex,nofollow" on them? Secondly, we're looking to put all these white label pages on a separate structure, to better protect our main indexed pages from duplicate content risks. The best bet would be to put them in a subfolder rather than on a subdomain, yes? That way, even though the pages are still noindex, we'd get link equity from these big domains to www.ourdomain.com/subfolder, which we wouldn't to subdomain.ourdomain.com? Thank you!
Reporting & Analytics | HTXSEO
-
How to get multiple pages to appear under main url in search - photo attached
How do you get a site to have an organized site map under the main URL when it is searched, as in the example photo?
Reporting & Analytics | marketingmediamanagement
-
Getting google impressions for a site not in the index...
Hi all. Wondering if I could pick the brains of those wiser than myself... my client has an https website with tons of pages indexed and all ranking well; however, somehow they also managed to set their server up so that non-https versions of the pages were getting indexed, and thus we had the same page indexed twice in the engine but on slightly different URLs (it uses a CMS, so all the internal links are relative too). The non-https version is mainly used as a dev testing environment. Upon seeing this we did a Google removal request in WMT and added noindex in the robots, and that saw the indexed pages drop overnight. See image 1. However, the site still appears to be getting returned for a couple of hundred searches a day! The main site gets about 25,000 impressions, so it's way down, but I'm puzzled as to how a site which has been blocked can appear for that many searches, and whether we are still liable for duplicate content issues. Any thoughts are most welcome. Sorry, I am unable to share the site name I'm afraid. Client is very strict on this. Thanks, Carl
Reporting & Analytics | carl_daedricdigital
-
Easiest way to get out of Google local results?
Odd one this, but what's the easiest way to remove a website from the Google local listings? Would removing all the Google Maps listings do the job? A client of ours is suffering massively since the Google update in the middle of last month. Previously they would appear at no.1 or no.2 in the local results and normally 1 or 2 in the organic results. However, since the middle of last month, any time they rank on the first page for a local result, their organic result has dropped massively to at least page 4. If I set my location as somewhere different in Google, say 100 miles away, they then rank well in the organic listings (obviously not appearing for local searches). When I change it back to my current location the organic listing is gone and they are back to ranking for the local result. Since the middle of July the traffic from search engines has dropped about 65%. All the organic rankings remain as strong as ever, just not in the areas where they want to get customers from! The idea is to remove the local listing and get the organics re-ranking, as the CTR on those is much, much higher. On a side note, has anyone else noticed very poor CTR on Google local listings? Thanks
Reporting & Analytics | ccgale
-
Should you get a new Google Analytics account if your site has a new domain after a site redesign/new development?
We recently developed a new site for a client and they have opted to move forward with a domain change. Should we create a new Google Analytics account for the new site?
Reporting & Analytics | TheOceanAgency