Robots.txt file issue on a WordPress site
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on the blog and added a few pages to the robots.txt file. Now my complete site seems to be blocked. I have checked and updated the file, but I'm still having the issue.
The search results show: "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions for overcoming this issue?
-
I can't say what the problem is with your site, because we don't have the URL.
Having said that, why block pages at all? I find it is very rare that you need to block pages. I remember reading that Google treats blocked resources as a spam signal; by itself that is not a problem, but mixed with other signals it can be harmful.
So do you really need to block those pages?
If you really do, then use a meta robots noindex, follow tag instead.
You want the links on those pages to be followed so that link juice flows back out to indexed pages; otherwise you will be pouring link juice away via any links pointing at noindexed pages. A sketch of the tag is below.
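Something like this in the <head> of each page you want kept out of the index would do it (a minimal sketch; SEO plugins such as Yoast typically let you set the same thing per post without touching templates):
<meta name="robots" content="noindex, follow">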
-
Hi
You are most likely still seeing "A description for this result is not available..." in Google because they may not have re-crawled and re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or a header checker like http://urivalet.com/, then it's also accessible to Google; you just need to wait for them to re-crawl.
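A quick way to see exactly what crawlers are being served (assuming you have curl installed; swap example.com for your own domain) is to fetch the live file yourself:
curl -s http://www.example.com/robots.txt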
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
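If that is what is in there, it blocks everything. Replacing it with something along these lines keeps the rest of the site crawlable (a sketch only; /example-page/ and /another-page/ are placeholders for whatever you actually meant to block):
User-agent: *
Disallow: /example-page/
Disallow: /another-page/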
-
Hi there
Check your robots.txt settings in WordPress:
Admin Dashboard > Settings > General > Privacy -- what is the setting there?
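For context, WordPress generates a virtual robots.txt based on that privacy setting, and the output usually boils down to one of the two variants below (exact output varies by WordPress version, so treat this as a rough sketch):
# When search engines are blocked/discouraged:
User-agent: *
Disallow: /
# When the site is open to search engines:
User-agent: *
Disallow: /wp-admin/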
I would also read up on Robots.txt and see if you can find anything that may stand out to you. Then, I would take a look at your XML sitemap and upload it to Google Webmaster Tools and Bing Webmaster Tools.
I, like the others, would like to see your robots.txt file as well to see a live example. This shouldn't be anything major!
-
It's so easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as the SEO plugin? With this, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to keep pages out of the search engines, you should use noindex rather than a Disallow rule in robots.txt. In many cases a robots.txt file does more harm than good.
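If editing page templates isn't convenient, the same instruction can also be sent as an HTTP response header (a general sketch, not specific to any particular plugin or server setup):
X-Robots-Tag: noindex, follow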
What is your URL?