Robots.txt file issue on WordPress site
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on the blog and added a few pages to the robots.txt file. Now my complete site seems to be blocked. I have checked and updated the file, but I'm still having the issue.
The search result shows "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions on how to overcome this issue?
-
What the problem is with your site I can't say, because we don't have the URL.
But having said that, why block pages? It is very rarely necessary. I remember reading that Google treats blocked resources as a spam signal; by itself that is not a problem, but mixed with other signals it can be harmful.
So do you really need to block the pages?
If you really do, then use a meta robots noindex, follow tag instead.
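For reference, a minimal sketch of what that tag looks like (it goes in the head of the specific page you want kept out of the index; the page itself must stay crawlable so search engines can see the tag):

```html
<!-- In the <head> of the page to be excluded from the index -->
<meta name="robots" content="noindex, follow">
```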
You want the links to be followed so that link juice flows back out through them to indexed pages; otherwise you will be pouring link juice away via any links pointing to the noindex pages.
-
Hi
You are most likely still seeing "A description for this result..." etc. in Google because they may not have re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or a header checker like http://urivalet.com/, then it's also accessible to Google; you just need to wait for them to re-crawl.
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
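As a quick sanity check, you can test what a given robots.txt actually blocks with Python's standard-library robots.txt parser (the example.com URLs below are just stand-ins):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules directly as text instead of fetching them from a live site
rules = """User-agent: *
Disallow: /
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

# With "Disallow: /" every path is blocked for every crawler
print(rp.can_fetch("Googlebot", "https://example.com/any-page/"))  # False
print(rp.can_fetch("*", "https://example.com/"))                   # False
```

If either call prints True for a path you expected to be blocked (or False for one you expected to be open), the rules aren't doing what you think.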
-
Hi there
Check out your robots.txt settings in WordPress:
Admin Dashboard > Settings > General > Privacy -- what is the setting?
I would also read up on robots.txt and see if anything stands out to you. Then I would take a look at your XML sitemap and submit it to Google Webmaster Tools and Bing Webmaster Tools.
I, like the others, would like to see your robots.txt file as well to see a live example. This shouldn't be anything major!
-
It's so easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as the SEO plugin? With this, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to block your pages from search engines, you should use noindex rather than Disallow in robots.txt. In many cases a robots.txt file does more harm than good.
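To illustrate: if the goal really was to keep only a few pages out, the robots.txt should disallow just those paths rather than the root (the paths below are hypothetical). Even then, a disallowed URL can still appear in results with exactly the "no description available" snippet, since Disallow only stops crawling, not indexing; that is why noindex is usually the better tool.

```
# Hypothetical robots.txt that blocks only two paths, not the whole site
User-agent: *
Disallow: /thank-you/
Disallow: /internal-drafts/
```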
What is your URL?