Robots.txt file issue on WordPress site
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on my blog and added a few pages to the robots.txt file. Now my complete site seems to be blocked. I have checked and updated the file, but I'm still having the issue.
The search results show: "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions to overcome this issue?
-
What the problem is with your site I can't say, because we don't have the URL.
But having said that, why block pages at all? I find it is very rare that you need to block pages. I remember reading that Google sees blocked resources as a spam signal; that by itself is not a problem, but when mixed with other signals it can be harmful.
So do you really need to block the pages?
If you really do, then use meta noindex, follow.
You want the links to be followed so that link juice flows back out of them to indexed pages, or you will be pouring link juice away via any links pointing to noindexed pages.
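For reference, the noindex, follow directive is a standard meta robots tag placed in the page's head (your theme or SEO plugin would normally output it for you):

<meta name="robots" content="noindex, follow">

-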
Hi
You are most likely still seeing "A description for this result..." etc. in Google because they may not have re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or a header checker like http://urivalet.com/, it's also accessible to Google; you just need to wait for them to re-crawl.
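If you prefer a quick scriptable check, Python's standard-library robots.txt parser can tell you whether a given user agent is blocked (a minimal sketch; example.com stands in for your own domain):

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt file (example.com is a placeholder)
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# can_fetch() returns False when the named user agent is disallowed from the URL
print(parser.can_fetch("Googlebot", "https://example.com/"))

If this prints False for your homepage, the robots.txt itself is still blocking crawlers rather than Google simply serving a stale cache.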
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
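Those two lines together tell every crawler to stay out of the entire site. If the intent was only to block a few areas, list those paths individually; a typical WordPress robots.txt looks something like this (an illustrative pattern; adjust the paths to what you actually need to block):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php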
-
Hi there
Check out your robots.txt in WordPress:
Admin Dashboard > Settings > General > Privacy -- what is the setting there?
I would also read up on robots.txt and see if you can find anything that stands out to you. Then, I would take a look at your XML sitemap and submit it to Google Webmaster Tools and Bing Webmaster Tools.
I, like the others, would like to see your robots.txt file as well to see a live example. This shouldn't be anything major!
-
It's so easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as the SEO plugin? With this, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to block your pages from search engines, then you should use noindex and not a disallow in robots.txt; a page that is disallowed can't be crawled at all, so search engines never even see a noindex tag on it. In many cases, a robots.txt file is doing more harm than good.
What is your URL?
Related Questions
-
On-Page Optimization on a service-based site?
Respected members, my question is: if I want to run a website that provides services like Olansi does, how can I plan my on-page optimization to secure my position against service-based competitor sites?
On-Page Optimization | | younus_7831 -
Reducing number of site pages?
Hi, I am looking through my site structure and I have a lot of pages left over from the days of article keywords. Probably 7 or 8 years ago, someone sold my husband on article keyword pages. I have slowly gotten rid of a lot of them as they have fallen out of the ranks. I would like to get rid of the rest, probably 5 or 6 pages. Will it hurt my rankings to delete pages and redirect them? My customers really like the simplicity of our site and I want to keep it that way, plus clean up the flags that Moz is telling me are a problem. I think it's easier to keep fewer pages top notch than to have to worry about a lot of them, especially since my customers aren't viewing them. Thanks in advance!
On-Page Optimization | | CalicoKitty20000 -
Responsive site.com vs m.site.com
Hi All, My client's website has two URLs, like site.com/a.html and m.site.com/a.html. Will it hurt Google rankings for this website because there are two versions of the website? Please help!
On-Page Optimization | | binhlai1 -
Translated the site but traffic is not coming
Hi, We've built a lawyer directory website (www.iranianlawyers.com) which already has good Google rankings for related terms (e.g. Iranian lawyers, Iranian lawyers California, etc.). About 1.5 months ago we translated the site to Farsi and published it online: www.iranianlawyers.com/farsi. However, we have yet to see any new traffic generated from those pages. The website has a decent backlink profile and there are almost no competitors in our space with translated pages. Would someone please take a look at our translated pages and let me know if there are any major on-site issues that you see that we need to address? I've checked for noindex or nofollow tags but they don't seem to be an issue. Not sure if I'm missing something here. Thank you very much
On-Page Optimization | | Heydarian0 -
WordPress (.com) and SEO
I am in my 30-day trial and very interested in my results. I think I am probably in a small minority in having had the same web site up and running for approaching 17 years (registered in January 1995 :)), but only now am I looking at SEO seriously (to the extent that I want to learn more myself, as opposed to having others promise great fortune!). Anyway, before committing to SEOmoz on an ongoing basis, I want to understand just how actionable the information on my dashboard is. With that in mind, here's the first of what is (hopefully) a series of questions about low-hanging fruit I might be able to check off quickly. I recently brought up a new blog on WordPress.com (note: hosted by WordPress, not a self-hosted implementation). I have had this blog running for less than a month and have just 18 posts, and I am being overwhelmed with thousands of errors/warnings from SEOmoz. These fall into a few categories:
Duplicate content: As I understand it, each tag I associate with a single blog post creates a unique URL. For example, if I have a single post with tags for "flowers", "wine" and "cakes", I get URLs generated such as <blog url>/flowers, <blog url>/wine and <blog url>/cakes. Obviously, tagging posts is a common scenario. Must I just accept these duplicate content warnings?
Title element too long: These are generated by WordPress.com, and the default format includes the date the post was submitted (which takes a bunch of characters) followed by the title. Many of the posts are well over 70 characters, and this seems really easy to do.
Missing meta description: As far as I can tell, WordPress.com doesn't give me an option to specify these.
So, must I just accept these issues if I use WordPress.com (which, again, seems like a very common scenario), and how negative is this to me? Thanks. Mark
On-Page Optimization | | MarkWill0 -
Home page duplicate content issue
Hi there! The home page of my site can be seen under www.mysitename.com and www.mysitename.com/EN/ or www.mysitename.com/ES/ (depending on your language). I understand that this is duplicate content because they show the same content under different URLs. To solve this, we've set up a 301 redirect (depending on your language) from www.mysitename.com to www.mysitename.com/EN/ or www.mysitename.com/ES/. Is this correct? Thanks!
On-Page Optimization | | Xopie0 -
Self-Cannibalization issue
Is the keyword "filme online gratis" self-cannibalizing on the site filmeonlinenoi.com? In the SEOmoz tool "On-Page Keyword Optimization" it shows that it is a self-cannibalizing keyword. I made some changes (big changes) and it still remains the same.
On-Page Optimization | | Alexsmenaru0 -
Duplicate page title issues with a CMS
I am using MODx as a CMS on a site and am trying to eliminate duplicate page titles. The same page resolves at url.com/, url.com/[~897~] (which is really {~897~}, a resource number), and url.com/home/. How can I resolve this issue when it's all one page in the CMS? Thanks
On-Page Optimization | | tjsherrill0