What is this robots.txt file issue?
-
I hope you are well. Moz keeps sending me a notification that my website can't be crawled, and it tells me to check the robots.txt file. Now the question is: how can I solve this problem, and what should I write in the robots.txt file?
Here is my website: https://www.myqurantutor.com/
I need your help, brothers. Thanks in advance.
-
Not sure. Your robots.txt file looks fine & shouldn't be blocking anything except for admin:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.myqurantutor.com/sitemap_index.xml
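If you want to sanity-check rules like the ones above before relying on them, Python's built-in urllib.robotparser can evaluate them against sample URLs. One caveat: robotparser applies the first matching rule rather than Google's longest-match rule, so the Allow line is placed before the Disallow in this sketch:

```python
from urllib import robotparser

# Feed the rules from the answer above directly to the parser.
# Allow comes first because urllib.robotparser uses first-match,
# unlike Google's longest-match precedence.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /wp-admin/admin-ajax.php",
    "Disallow: /wp-admin/",
])

# Regular pages should be crawlable; the admin area should not be,
# except for admin-ajax.php.
print(rp.can_fetch("*", "https://www.myqurantutor.com/"))                         # True
print(rp.can_fetch("*", "https://www.myqurantutor.com/wp-admin/settings.php"))    # False
print(rp.can_fetch("*", "https://www.myqurantutor.com/wp-admin/admin-ajax.php"))  # True
```

If all three checks come back as expected, the file itself is not what is blocking the crawl.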
Related Questions
-
Issue to index AMP pages
Hello. In Google Search Console I have about 500 pages indexed for my websites https://horaire-priere.be/ and https://horarios-oracion.es/, but only 20 pages are indexed as AMP, even though my site is AMP-only. I can't understand why Google indexed the pages but not in AMP mode. Thank you in advance.
On-Page Optimization | Zakirou
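For context on the question above: Google only associates an AMP version with a page when the two point at each other via paired link tags in the head. A minimal sketch (the file names are made up for illustration; on a fully AMP-only site, each AMP page should instead be self-canonical):

```html
<!-- On the regular (canonical) page -->
<link rel="amphtml" href="https://horaire-priere.be/exemple.amp.html">

<!-- On the AMP page, pointing back -->
<link rel="canonical" href="https://horaire-priere.be/exemple.html">
```

If either half of the pair is missing or invalid, Google indexes the page normally but not as AMP, which matches the symptom described.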
Correct robots.txt for WordPress
Hi. I recently launched a website on WordPress (1 main page and 5 internal pages). The main page got indexed right off the bat, while the other pages seem to be blocked by robots.txt. Would you please look at my robots file and tell me what's wrong? I wanted to block the contact page, plugin elements, users' comments (I have a discussion space on every page of my website), and the website search section (to prevent duplicate pages from appearing in Google search results). It looks like one of the lines is blocking every page after "/" from indexing, even though everything seems right. Thank you so much. FzSQkqB.jpg
On-Page Optimization | AslanBarselinov
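The symptom described above (every page after "/" blocked) is almost always caused by a bare `Disallow: /` line, which blocks the entire site. A hedged sketch of what the stated intent could look like instead; the /contact/ and /search/ slugs are assumptions that depend on the actual theme and permalink settings:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /contact/
Disallow: /?s=
Disallow: /search/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml
```

Note that robots.txt only controls crawling; comments embedded within a page cannot be blocked separately from the page itself.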
.htaccess file uploaded, website won't load
I uploaded an .htaccess file with the rules below, and now my website won't load at all. Then I deleted the .htaccess file and it still won't load. It would load on my phone when I took the file down, but not in Chrome or Explorer. Then I put it back up, checked on my phone again, and it wouldn't load there either. Then I deleted the file and it still won't load on my phone. What is going on?

RewriteEngine on
RewriteCond %{HTTP_HOST} !^http://freightetc.com$
RewriteRule ^(.*)$ http://www.freightetc.com/$1 [R=301]
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ http://www.freightetc.com/$1 [R=301]

On-Page Optimization | dwebb007
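Two things worth knowing about the rules above: a malformed .htaccess typically makes Apache return a 500 error for every URL, and browsers cache 301 redirects, which can keep a site looking broken even after the file is removed. Also, %{HTTP_HOST} contains only the host name, never the http:// scheme, so the first condition can never match a full URL. A hedged sketch of the likely intended rules, assuming the goal is a non-www to www redirect plus stripping index.php:

```apache
RewriteEngine On

# Redirect the bare domain to www.
# HTTP_HOST is just the host name -- matching "http://..." never succeeds.
RewriteCond %{HTTP_HOST} ^freightetc\.com$ [NC]
RewriteRule ^(.*)$ http://www.freightetc.com/$1 [R=301,L]

# Redirect any URL ending in index.php to the same path without it.
RewriteCond %{THE_REQUEST} /index\.php [NC]
RewriteRule ^(.*)index\.php$ http://www.freightetc.com/$1 [R=301,L]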
Is it better to put all your CSS in 1 file or is it no problem to use 10 files or more like on most frameworks?
On-Page Optimization | conversal
Product page reviews issues
Hi, We've implemented pagination on the reviews with rel=next/prev, but have seen no improvements since then. An example page with reviews is here. Can you see any issues that would be causing the problem? Thanks!
On-Page Optimization | pikka
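For reference on the question above, rel=next/prev is implemented as link tags in the head of each page in the paginated series; a sketch with made-up URLs:

```html
<!-- On page 2 of a paginated review series -->
<link rel="prev" href="https://example.com/product/reviews?page=1">
<link rel="next" href="https://example.com/product/reviews?page=3">
```

These tags are only a hint that the URLs form a sequence; they do not consolidate ranking signals the way rel=canonical does, so a lack of visible improvement is not unusual.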
Tags creating duplicated content issue?
Hello. I believe a lot of us use tags in our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) creates duplicate content. For example, if one article has 2 tags like "SEO" & "Marketing", then this article will be visible and listed at 2 URLs inside the blog: domain.com/blog/seo and domain.com/blog/marketing. In the case of a blog with 300+ posts and dozens of different tags, this creates a huge issue. My questions are: 1. Is this really bad? 2. If yes, how do I fix it without removing tags?
On-Page Optimization | Lakiscy
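One common fix for the tag-archive duplication described above keeps the tag pages for visitors while removing them from the index: a noindex,follow robots meta tag served only on tag archives (in WordPress this is usually a theme or SEO-plugin setting rather than hand-edited markup):

```html
<!-- In the <head> of tag archive pages such as domain.com/blog/seo -->
<meta name="robots" content="noindex, follow">
```

With follow kept, crawlers still follow the links from the tag page to the individual posts, so the posts themselves stay discoverable.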
Photogallery and Robots.txt
Hey everyone, SEOmoz is telling us that there are too many on-page links on the following page: http://www.surfcampinportugal.com/photos-of-the-camp/ Should we stop it from being indexed via robots.txt? Best regards and thanks in advance... Simon
On-Page Optimization | Rapturecamps
Filtered Navigation, Duplicate content issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent the search engine from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution. For example, you have a page that lists 12 products out of 100: companyname.com/productcategory/page1.htm And then you filter these products: companyname.com/productcategory/filters/page1.htm The filtered page may or may not contain items from the original page, but does contain items that are in the unfiltered navigation pages. How do you help the search engine determine where it should crawl and index the page that contains these products? I can't use rel=canonical, because the exact set of products on the filtered page may not be on any other unfiltered pages. What about robots.txt to block all the filtered pages? Will that also stop pagerank from flowing? What about the meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
On-Page Optimization | 13375auc3
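On the robots.txt option raised in the question above: a Disallow pattern does stop crawling, but because the crawler never fetches the blocked pages it also never sees the links on them, so no equity flows through. Using the example paths from the question, that form would be:

```
User-agent: *
Disallow: /productcategory/filters/
```

A meta noindex,follow tag on the filtered URLs avoids that trade-off: the pages drop out of the index while their links are still crawled, which is why it is usually the safer choice for faceted navigation.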