Robots.txt file issue on WordPress site
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on my blog and added a few pages to the robots.txt file. Now my complete site seems to be blocked. I have checked and updated the file but am still having the issue.
The search result shows: "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions for overcoming this issue?
-
What the problem is with your site I can't say, because we don't have the URL.
That said, why block pages at all? I find it's very rare that you need to. I remember reading that Google sees blocked resources as a spam signal; by itself that's not a problem, but mixed with other signals it can be harmful.
So do you really need to block the pages?
If you really do, then use a meta robots noindex, follow tag instead.
You want the links to be followed so that link juice flows back out through them to indexed pages; otherwise you will be pouring link juice away via any links pointing to noindexed pages.
-
Hi
You are most likely still seeing "A description for this result..." in Google because they may not have re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or a header checker like http://urivalet.com/, it's also accessible to Google; you just need to wait for them to re-crawl.
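If you want to sanity-check the file yourself while you wait, Python's standard-library urllib.robotparser can tell you whether a given URL is blocked by a robots.txt. A minimal sketch; the robots.txt contents and URLs here are made-up examples, so substitute your own:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt contents -- replace with your site's actual file,
# e.g. the text served at https://yoursite.com/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# can_fetch() answers: may this user agent crawl this URL?
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))   # True: allowed
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/x"))  # False: blocked
```

If a normal blog URL comes back False under `User-agent: *`, the robots.txt is still blocking Google regardless of any caching delay.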
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
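For contrast, a robots.txt that blocks only specific areas rather than the whole site would look something like this (the paths are made-up examples; use your own):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /private-page/
```

The difference between `Disallow: /` (block everything) and a scoped path like `Disallow: /wp-admin/` is a single character of path, which is why this file is so easy to get wrong.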
-
Hi there
Check your robots.txt setting in WordPress:
Admin Dashboard > Settings > General > Privacy -- what is it set to?
I would also read up on robots.txt and see if anything stands out to you. Then take a look at your XML sitemap and submit it to Google Webmaster Tools and Bing Webmaster Tools.
Like the others, I would also like to see your robots.txt file, as a live example. This shouldn't be anything major!
-
It's so easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as the SEO plugin? With this, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to block pages from search engines, you should use noindex rather than a Disallow in robots.txt. In many cases, a robots.txt file does more harm than good.
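Concretely, a page you want kept out of the index but still crawled for its links would carry a meta robots tag like this in its `<head>` (in WordPress this is usually set per page via an SEO plugin rather than edited by hand):

```html
<!-- Keep this page out of the search index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow, this lets Google crawl the page and see the directive, so the page drops out of results instead of showing the "no description available" snippet.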
What is your URL?
Related Questions
-
How to Fix Google Webmasters Soft 404 Errors for Wordpress Site?
I am getting Soft 404 errors in my Google Webmaster Tools and don't know how to fix them. The site is on the WordPress CMS: "http://appdictions.com/". I am getting the errors in the http://appdictions.com/members section. Suggestions to fix the issue will be appreciated.
On-Page Optimization | preferati
-
Locating broken links on site?
Hey guys, I'm using Screaming Frog to help locate some broken links on a client's site and I've managed to pick up two. However, I can't seem to find whereabouts they're located on the site in order to fix them! Is there a way I can do this? Cheers!
On-Page Optimization | Webrevolve
-
Are lots of wordpress tags hurting my SEO?
When I started my blog 4 years ago and didn't know what I was doing, I set it up to be organized using tags (no categories). I have around 20 tags and around 250 articles. Each blog article typically has 2-4 tags associated with it. There is a lot of topic overlap, and putting each post in only one category wouldn't work. I have been setting my tag pages to noindex, follow because they are just lists of post titles, but I'm wondering if the whole situation is hurting my SEO. On my homepage I have a sidebar menu/list of all the tags, so I have link juice going to noindexed (but followed) pages. Reducing the number of tags would be bad for user experience, because people looking for info on a specific topic wouldn't be able to find what they want as easily. Ideas?
On-Page Optimization | KateV
-
Site structure suggestions/feedback
I asked this on Reddit and got some decent answers; I'm curious what the pros of SEOmoz think. I've got a lead generation site for forklift parts: liftxparts.com. You can think of it similar to car parts, where we have sections for specific brands (e.g. Toyota forklift parts) and sections for specific categories (e.g. forklift filters). Right now the site is structured in two main levels: the top level is a dozen or so brands (separate pages for Toyota forklift parts, Clark forklift parts, etc.), and the second level is the categories (separate pages for a dozen or so different categories like forklift filters, forklift engine parts, etc.).
If you check out one of the pages, like Clark forklift parts (our top landing page), liftxparts.com/clark-forklift-parts.html, you'll see that on the brand pages (they're all structured the same) we list all the different categories (with links to the same second-level category pages) and "search" buttons. All pages point to the same lead capture form.
This has been working pretty well: about 90% of visitors end up on our lead capture form, and a high percentage of those convert. We're working on increasing organic traffic now, and I'm thinking our structure could use some improvement.
Looking at the analytics, there are a lot more impressions for keywords like "clark forklift" than "clark forklift parts". One gap I've uncovered is that while our average position, and by extension CTR and traffic, for phrases like "clark forklift parts" is quite good, it's not so good for broader, higher-searched terms like "clark forklift". Should we add another level of hierarchy targeted at just general brands? We have content for "clark forklift parts", but should we add a page for terms like "clark forklift"? Or should we just add some broader content to the existing brand pages? The pages are quite long already, and I'm afraid adding more content to the bottom of the page isn't very functional.
Our thinking is that we can increase average position for higher-searched terms by adding content targeted to those terms. The question is how exactly to go about it and how to work it into our current site structure. Any feedback on our site structure, or related ideas about other ways to approach our goal of increasing organic traffic, would be very much appreciated! Thanks!
On-Page Optimization | wisamabdulla
-
Working on this site...
and wondering what's wrong in terms of on-page SEO (basically I just want some feedback on tips/changes to make): http://www.stevenholmesstudio.com/ I'm assuming the title shouldn't just be the image file name... any suggestions for what it should be?
On-Page Optimization | callmeed
-
Best way to optimize a site for Google Maps?
I am working with a site right now and they are ranked #1 for many keyword phrases based on their location and service. They are an insurance agency, so they rank #1 for many keywords like "Miami Insurance" or "South Florida Insurance Agency" (they're not actually ranked for Miami; I'm just giving an example). I also include their address on every page of the site (maybe that helps Google Maps?). The problem I am having is that when you search the exact keyword phrase they rank number 1, but when searching just "insurance" while located in the area they rank for, they do not come up. I hear there might be specific ways to optimize for this. What I would like to know is what I would have to do to optimize for Google Maps, and everything I possibly can do. I am good with search engine optimization but have never really dabbled much with Google Maps; I always thought they just ranked you based on your address.
On-Page Optimization | WhiteHat12
-
Duplicate Product BUT Unique Content -- any issues?
We have the situation where a group of products fit into 2 different categories and also serve different purposes (to the customer). Essentially, we want to have the same product duplicated on the site, but with unique content and it would even have a slightly different product name. Some specifications would be redundant, but the core content would be different. Any issues?
On-Page Optimization | SEOPA
-
Does Google respect User-agent rules in robots.txt?
We want to use an inline linking tool (LinkSmart) to cross link between a few key content types on our online news site. LinkSmart uses a bot to establish the linking. The issue: There are millions of pages on our site that we don't want LinkSmart to spider and process for cross linking. LinkSmart suggested setting a noindex tag on the pages we don't want them to process, and that we target the rule to their specific user agent. I have concerns. We don't want to inadvertently block search engine access to those millions of pages. I've seen googlebot ignore nofollow rules set at the page level. Does it ever arbitrarily obey rules that it's been directed to ignore? Can you quantify the level of risk in setting user-agent-specific nofollow tags on pages we want search engines to crawl, but that we want LinkSmart to ignore?
On-Page Optimization | lzhao
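On the user-agent question above: robots.txt rules scoped to one crawler live in their own group, and any crawler that doesn't match a named group falls back to `User-agent: *`. A sketch of the idea; the `LinkSmart` token and the `/archive/` path are assumptions here, so check the vendor's documented user-agent string before relying on it:

```
# Block only the LinkSmart bot from the article archive
User-agent: LinkSmart
Disallow: /archive/

# Everyone else (including Googlebot) remains unrestricted
User-agent: *
Disallow:
```

Because Googlebot matches only the `*` group here, lines like these carry no risk of blocking search engines; the real risk the poster describes comes from page-level noindex/nofollow tags, which are not user-agent-scoped in the same way.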