Robots.txt file issue on WordPress site.
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on the blog and added a few pages to the robots file. Now my complete site seems to be blocked. I have checked and updated the file, but I'm still having the issue.
The search results show: "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions on how to overcome this issue?
-
I can't say what the problem is with your site, because we don't have the URL.
But having said that, why block pages at all? I find it is very rare that you need to block pages. I remember reading that Google sees blocked resources as a spam signal; by itself that is not a problem, but mixed with other signals it can be harmful.
So do you really need to block the pages?
If you really do, then use meta noindex, follow.
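As a sketch, the suggested tag goes in the `<head>` of each page you want kept out of the index:

```html
<!-- Keep this page out of the index, but let crawlers follow its links
     so link equity still flows to the rest of the site -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow, this lets Google crawl the page and see the directive, which is what actually removes it from the results.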
You want the links to be followed so that link juice flows back out of those links to indexed pages; otherwise you will be pouring link juice away via any links pointing to noindex pages.
-
Hi
You are most likely still seeing "A description for this result..." etc. in Google because they may not have re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or a header checker like http://urivalet.com/, then it's also accessible to Google; you just need to wait for them to re-crawl.
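While waiting for a re-crawl, you can also sanity-check the rules yourself. A minimal sketch using Python's standard-library robots.txt parser (the rules shown are the hypothetical worst case, not your actual file):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly; you could instead use set_url() + read()
# to fetch the live file from your own site.
rules = [
    "User-agent: *",
    "Disallow: /",
]
rp = RobotFileParser()
rp.parse(rules)

# With "Disallow: /", every URL is blocked for every crawler.
print(rp.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```

If `can_fetch` returns False for Googlebot on your homepage, the file is still blocking you regardless of what Google's cache shows.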
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
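That pair of lines tells every crawler to stay away from the entire site. If the intent was only to keep a few areas out, a narrower file might look like this (the paths here are hypothetical examples, not taken from your site):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /private-page/
```

Bear in mind that Disallow only blocks crawling; as others in the thread note, noindex is the tool for keeping pages out of the search results.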
-
Hi there
Check your robots.txt setting in WordPress:
Admin Dashboard > Settings > General > Privacy -- what is the setting there?
I would also read up on Robots.txt and see if you can find anything that may stand out to you. Then, I would take a look at your XML sitemap and upload it to Google Webmaster Tools and Bing Webmaster Tools.
I, like the others, would like to see your robots.txt file as well to see a live example. This shouldn't be anything major!
-
It's so easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as the SEO plugin? With this, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to block your pages from search engines, then you should use noindex, not Disallow in robots.txt. In many cases, a robots.txt file does more harm than good.
What is your URL?
Related Questions
-
Two days after starting a Moz Pro campaign I experienced a sudden huge traffic drop, and my site's rankings dropped too. Please, I need your help.
On-Page Optimization | | zizutz0 -
Can I robots.txt an entire site to get rid of Duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content to serve two separate user groups (for the same product: B2B and B2C). Zendesk does not give me the option to change canonicals (or meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option, or is there a better one? I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to point to the new subdomain (the only option offered through Zendesk)? Thank you.
On-Page Optimization | | RoxBrock0 -
Problem with getting a site to rank at all
We pushed this WordPress site live about a month ago: www.primedraftarchitecture.com. Since then we've been adding regular content: blog posts three times a week, with social posts on Facebook, Twitter, G+ and LinkedIn. We also submitted via Moz Local about 3 weeks ago, Yext about two weeks ago, and have been adding about 5 listings to small local directories a week. Webmaster Tools shows that the sitemap is valid and the pages of the site are getting indexed, and it shows links from 7 sites, mostly directories. I'm just not seeing the site ranking for anything. We're getting zero organic traffic. I thought we did a good job not over-optimizing the pages. I'm just stymied trying to figure out what's wrong. Usually we push a site live and see at least some low rankings after just a couple of weeks. Can anyone see anything that looks bad or where we've gone wrong?
On-Page Optimization | | DonaldS0 -
[HELP!] File Name and ALT Tags
Hi, please answer my questions: 1. Is it okay to use the same keyword in both the file name and the alt tag when inserting an image? Example: file name: buy-lego-online.jpg; alt tag: buy-lego-online. Will it trigger Google Panda? Will I be penalized for that? Or should the file name and alt tag be different from each other? (When inserting an image in WordPress, the alt tag defaults to the same text as the file name.) 2. For example, I have 2 images on a page (same topic/niche) and I will put "cheap-lego-for-kids" and "best-lego-for-sale" as alt tags. Considering that I repeat the word "lego", is that considered keyword stuffing? Will I be penalized for that? Thanks in advance!
On-Page Optimization | | bubblymaiko0 -
Moving Site from HTTP to HTTPS
Hi, so the news is that Google has started giving more importance to sites with HTTPS, i.e. it is now a ranking signal. Google says that as of now it affects fewer than 1% of global queries and carries less weight than other signals, such as high-quality content, but it may decide to strengthen it, as it would like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web. In that case, what should we do? Switching from http:// to https:// means a change in URLs and a dip in traffic. How do we cope with that? Do we have to implement 'n' number of redirects? Regards,
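You don't need one redirect per URL; a single site-wide rule usually covers it. A sketch for an Apache server using mod_rewrite in .htaccess (this assumes Apache with mod_rewrite enabled; nginx has an equivalent `return 301` directive):

```apache
RewriteEngine On
# Redirect every HTTP request to the same path on HTTPS with a 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Using permanent (301) redirects lets search engines transfer the old URLs' signals to the HTTPS versions.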
On-Page Optimization | | IM_Learner3 -
Disavow Tool Submitting 2nd File
Hi. About 2 months ago I submitted a disavow file using the Disavow Tool. I have collected more links and am ready to upload a 2nd file. However, should I download the previous file in Webmaster Tools (Disavow Tool) and add the new links to that file, or will it be OK if I just upload and override the existing file with a file containing only the new links? What I don't want to do is something that removes all the previous findings from the list so that Google can no longer see them. I guess what I am trying to ask is: does Google just refer to the live file I am updating/overriding, or, once I have submitted a file, will Google still have a record of it and refer to it whether I remove it or not? Thanks, Adam
On-Page Optimization | | AMG1000 -
Duplicate Content Issue in Magento
Hi, I need help resolving a duplicate content issue on my Magento site. My main product URL is https://www.oakfurnitureking.co.uk/shop-by-product/boston-solid-oak-4-drawer-chest, but it has the URL variations listed below that are causing a duplicate content issue. I have inserted the canonical tag on the URLs below, pointing to the main URL, but Moz is still flagging them as duplicate content. Help please!
On-Page Optimization | | Adnan.Hassan.Khan
https://www.oakfurnitureking.co.uk/product/oak-bedroom-furniture/boston-solid-oak-4-drawer-chest
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/6/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/17/
https://www.oakfurnitureking.co.uk/shop-by-range/boston/boston-solid-oak-4-drawer-chest
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/42/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/63/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/67/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/46/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/79/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/88/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/75/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/90/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/92/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/33/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/27/
https://www.oakfurnitureking.co.uk/shop-by-range/boston-solid-oak-4-drawer-chest
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/50/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/22/
https://www.oakfurnitureking.co.uk/catalog/product/view/id/45/s/boston-solid-oak-4-drawer-chest/category/74/
-
Canonical issue
Hi, I'm very new to SEOmoz but very impressed. My first report has shown me that I have duplicate pages. Some seem to be duplicate titles, and some were duplicates of pages I found on the server. However, the main problem is that it seems to be picking up pages both with www and without it, which I have a vague idea is a canonical issue. So it throws up pages like this: http://web-writer-articles.co.uk and http://www.web-writer-articles.co.uk. I want it to pick up only the pages with www. Firstly, should it be picking up both, and if not, how can I make amendments so that it only picks up pages which include www? Thank you for your help, louandel15
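One common fix is a site-wide 301 redirect from the non-www host to the www host, plus setting the preferred domain in Webmaster Tools. A sketch for .htaccess, assuming the site runs on Apache with mod_rewrite (verify against your own hosting setup):

```apache
RewriteEngine On
# Send any request for the bare domain to the www version, preserving the path
RewriteCond %{HTTP_HOST} ^web-writer-articles\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.web-writer-articles.co.uk/$1 [L,R=301]
```

Once only one hostname resolves without redirecting, crawlers (and SEOmoz reports) will stop treating the two versions as duplicates.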
On-Page Optimization | | louandel150