Robots.txt allows wp-admin/admin-ajax.php
-
Hello, Mozzers!
I noticed something peculiar in the robots.txt used by one of my clients: Allow: /wp-admin/admin-ajax.php
What would be the purpose of allowing a search engine to crawl this file?
Is it OK? Should I do something about it?
Everything else on /wp-admin/ is disallowed.
Thanks in advance for your help.
-AK
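For context: that Allow line is part of the default virtual robots.txt that WordPress itself has generated since version 4.4, so the client's full file most likely looks something like this sketch:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

The exception exists because themes and plugins make front-end AJAX requests through admin-ajax.php; a more specific Allow overrides the broader Disallow for that one file, so crawlers that render pages aren't blocked from resources the page depends on.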
Related Questions
-
Blocking in Robots.txt and the re-indexing - DA effects?
I have two good, high-DA sites that target the US (.com) and UK (.co.uk). The .com ranks well but is dormant from a commercial aspect; the .co.uk is the commercial focus and gets great traffic. The issue is that the .com ranks for brand in the UK, and I want the .co.uk to rank for brand in the UK. I can't 301 the .com, as it will be used again in the near future. I want to block the .com in robots.txt with a view to unblocking it again when I need it. I don't think the DA would be affected, as the links stay and the site's live (just not indexed), so when I unblock it, it should be fine. HOWEVER, my query is that things like the organic CTR data that Google records, and other factors, won't contribute to its value. Has anyone ever blocked and unblocked a site, and what were the effects, please? All answers greatly received - cheers, GB
Technical SEO | Bush_JSM
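For the scenario described, blocking an entire domain at the crawler level is a two-line robots.txt served from that domain's root (a sketch, assuming a file at the .com's /robots.txt):

```
User-agent: *
Disallow: /
```

One caveat worth noting: robots.txt blocks crawling, not indexing. URLs that are already indexed can stay in the index (shown without a snippet) while blocked, so a full deindex would need a noindex that Google can actually crawl to see.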
"Url blocked by robots.txt." on my Video Sitemap
I'm getting a warning about "Url blocked by robots.txt." on my video sitemap, but only for YouTube videos. Has anyone else encountered this issue, and if so, how did you fix it?! Thanks, J
Technical SEO | Critical_Mass
Sharing/hosting of content questions...
I just wanted to get opinions on some of the fundamentals and semantics of optimisation and content generation/distribution - your thoughts and opinions are welcome. OK, for example, let's assume (for illustration purposes) that I have a site - www.examplegolfer.com - aimed at golfers, with golf-related content. The keywords I would like to optimise for are: golf balls, golf tees, lowering your golf handicap, drive a golf ball further. Now, I'm going to be creating informative, useful content (infographics, articles, how-to guides, video demonstrations etc.) centred around these topics/keywords, which hopefully our audience/prospects will find useful, bookmark and share, mentioning our site/brand on the web and increasing (over time) our position for these terms/keywords in the SERPs.

Now, once I've researched and created my content piece, where should I place it? Let's assume it's an infographic - should this be hosted on an infographic-sharing site (such as Visually), on my site, or both? If it's hosted or embedded on my site, should this be in a blog or on the page I'm optimising (and have generated my keyword around)? For example, if my infographic is about golf balls, should it be embedded on the page www.examplegolfer.com/golf-balls (the page I'm trying to optimise)? And if so, and it's also placed elsewhere around the internet (e.g. on Visually), couldn't this technically be seen as duplicated content, as the infographic is on both my site and Visually? How does everyone else share/distribute/host their created content in various locations whilst avoiding the duplicate-content issue? Or have I missed something?

Also, how important is it to include my keyword (golf balls) in the piece's title or anchor text? Or indeed within the piece itself? One final question - should the content be authored/shared as the brand/company, or as an individual (a spokesperson, if you like) on behalf of the company (i.e. John Smith)?
I'm all for creating great, interesting, useful content for my audience; however, I want to ensure we're getting the most out of it, as researching influencers, researching and creating the piece, and distributing it isn't a quick or easy job (as we all know!). Thoughts and comments welcome. Thanks!
Technical SEO | Carl287
Block Domain in robots.txt
Hi. We had some URLs from a www1 subdomain that were indexed in Google. We have now disabled the URLs (returning a 404 - for other reasons we cannot redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be, and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
Technical SEO | zeepartner
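One detail that often bites in this situation: a robots.txt Disallow stops Google from recrawling the URLs, so it never sees the 404s, and the indexed count can keep growing from external links. Since Search Console isn't available for the subdomain, an alternative (a sketch, assuming an Apache vhost for www1 with mod_headers enabled) is to let Google crawl the 404s and additionally send a noindex header:

```
# Hypothetical Apache config for the www1 vhost.
# "always" is needed so the header is also sent on 404 responses.
<IfModule mod_headers.c>
    Header always set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Note that for Google to see either the 404 or the X-Robots-Tag header, the URL must not be blocked in robots.txt - the Disallow and the noindex work against each other here.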
Best way to create a shareable dynamic infographic - Embed / Iframe / other?
Hi all, After searching around, there doesn't seem to be any clear agreement in the SEO community on the best way to implement a shareable dynamic infographic for other people to put on their sites - i.e. one that will pass credit for the links back to the original site. Consider the following example for the web application we are putting the finishing touches on: the underlying site has a number of content pages that we want to rank for. We have created a number of infographics showing data overlaid on top of a Google map. The data continuously changes, and there are JavaScript files that have to load in order to achieve the interactivity. There is one infographic per page on our site, and there is a link at the bottom of the infographic that deep-links back to each specific page on our site.

What is the ideal way to implement this infographic so that the maximum SEO value is passed back to our site through the links? In our development version we have copied the YouTube approach and implemented this as an iframe, e.g. <iframe height="360" width="640" src="http://www.tbd.com/embed/golf" frameborder="0"></iframe>. The link at the bottom of that then links to http://www.tbd.com/golf. This is the same approach that YouTube uses; however, I'm nervous that the value of the link won't pass from the sites that are using the infographic. Should we do this as an embed object instead, or some other method? Thanks in advance for your help. James
Technical SEO | jtriggs
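A common pattern for exactly this worry is embed code that pairs the iframe with a plain crawlable anchor outside it: the iframe delivers the interactivity, while the ordinary link is what crawlers follow and credit, since links inside an iframed document count toward the embedded page's context rather than the host page. A sketch, using the hypothetical tbd.com URLs from the question:

```
<!-- Embed code offered to other sites (hypothetical URLs) -->
<iframe height="360" width="640" src="http://www.tbd.com/embed/golf"
        frameborder="0"></iframe>
<p>Source: <a href="http://www.tbd.com/golf">Golf infographic by TBD</a></p>
```

Because the attribution link lives in the host page's own HTML rather than inside the frame, it behaves like any other editorial link to the content page being optimised.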
Using Robots.txt
I want to block or prevent pages being accessed or indexed by Googlebot. Please tell me whether Googlebot will NOT access any URL that begins with my domain name, followed by a question mark, followed by any string, if I use the robots.txt below. Sample URL: http://mydomain.com/?example User-agent: Googlebot Disallow: /?
Technical SEO | semer
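Google matches Disallow values as literal path prefixes (with * and $ as the only wildcards), so Disallow: /? blocks any URL whose path-plus-query begins with /? - that is, the root URL with a query string, but not query strings on deeper paths. A minimal Python sketch of that prefix rule (the function name is ours, not part of any library):

```python
def blocked(path_and_query: str, disallow: str = "/?") -> bool:
    """Simplified model of Google's robots.txt matching: a rule with no
    wildcards blocks any URL whose path+query starts with the Disallow value."""
    return path_and_query.startswith(disallow)

print(blocked("/?example"))   # root URL with a query string: blocked
print(blocked("/page"))       # ordinary page: not blocked
print(blocked("/page?x=1"))   # query string on a deeper path: NOT blocked
```

So the sample URL http://mydomain.com/?example would indeed be disallowed, but to block parameterised URLs on every path, the rule would need a wildcard: Disallow: /*?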
How do I use the Robots.txt "disallow" command properly for folders I don't want indexed?
Today's sitemap webinar made me think about the disallow feature; it seems like the opposite of sitemaps, but it also seems both are somewhat ignored, in varying ways, by the engines. I don't need help semantically - I got that part. I just can't seem to find a contemporary answer about what should be blocked using the robots.txt file. For example, I have folders containing site comps for clients that I really don't want showing up in the SERPs. Is it better not to have these folders on the domain at all? There are also security issues I've heard of that make sense: simply look at a site's robots file to see what it is hiding. It makes it easier to hunt for files when you know the directory the files are contained in. Do I need to concern myself with this? Another example is a folder I have for my XML sitemap generator. I imagine Google isn't going to try to index this or count it as content, so do I need to add folders like this to the disallow list?
Technical SEO | SpringMountain
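For directories like these, the baseline is one Disallow per folder, with a trailing slash so only the directory's contents are matched (folder names below are hypothetical, not from the question):

```
User-agent: *
Disallow: /client-comps/
Disallow: /sitemap-generator/
```

On the security point raised above: this is a real trade-off, because robots.txt is publicly readable and only asks polite crawlers not to fetch the URLs - it does not prevent access. Anything genuinely sensitive, such as client comps, is better placed behind authentication (or served with a noindex header) rather than merely listed in robots.txt.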