Robots.txt file issue on WordPress site
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on the blog and added a few pages to the robots.txt file. Now my entire site seems to be blocked. I have checked and updated the file, but I'm still having the issue.
The search result shows: "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions for overcoming this issue?
-
I can't say what the problem is with your site, because we don't have the URL.
That said, why block pages at all? It's very rare that you actually need to. I remember reading that Google treats blocked resources as a spam signal; by itself that's not a problem, but mixed with other signals it can be harmful.
So do you really need to block those pages?
If you really do, then use a meta noindex, follow tag instead of robots.txt.
You want the links to be followed so that link juice flows back out through them to indexed pages; otherwise you'll be pouring link juice away via any links pointing to the noindexed pages.
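For reference, a noindex, follow directive goes in the page's <head> and looks like this (a minimal sketch; which pages get it is up to you):

<meta name="robots" content="noindex, follow">

That keeps the page out of the index while still letting crawlers follow its links.
-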
Hi
You are most likely still seeing the "A description for this result is not available..." message in Google because they may not have re-crawled and re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or accessible via a header checker like http://urivalet.com/, then it's also accessible to Google; you just need to wait for them to re-crawl.
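If you're comfortable with the command line, you can also check with curl (yoursite.com below is a placeholder for your actual domain):

curl http://yoursite.com/robots.txt
curl -I http://yoursite.com/

The first command prints the robots.txt exactly as crawlers fetch it; the second sends a HEAD request and shows the response headers.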
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
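If the intent was only to block a few specific pages, the file should instead disallow just those paths (a sketch with hypothetical paths; substitute your own):

User-agent: *
Disallow: /example-private-page/
Disallow: /another-example-page/

Anything not matched by a Disallow rule stays crawlable.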
-
Hi there
Check how WordPress itself is handling search engines:
Admin Dashboard > Settings > Privacy (Settings > Reading > "Search Engine Visibility" in newer WordPress versions) -- what is that set to? If "Discourage search engines from indexing this site" is ticked, WordPress blocks crawlers site-wide.
I would also read up on robots.txt and see if anything stands out to you. Then take a look at your XML sitemap and submit it to Google Webmaster Tools and Bing Webmaster Tools.
Like the others, I'd like to see your robots.txt file so we have a live example. This shouldn't be anything major!
-
It's very easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as your SEO plugin? With it, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to keep pages out of the search results, you should use noindex, not Disallow in robots.txt. A page blocked by robots.txt can still appear in results, but Google can't crawl it to read a noindex tag or generate a description; that is exactly the "no description available" message you're seeing. In many cases a robots.txt file is doing more harm than good.
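As a side note, for non-HTML files such as PDFs, where there is no <head> to hold a meta tag, the same directive can be sent as an HTTP header. A minimal sketch for Apache, assuming mod_headers is enabled:

<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, follow"
</FilesMatch>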
What is your URL?