Needs clarification: How does "Disallow: /" work?
-
Hi all,
I need clarification on this. I have noticed that we have a "Disallow: /" in one of our sub-directories, in addition to the robots.txt at the homepage level. How is this going to work? Will the "Disallow: /" at the sub-directory level disallow only that directory, or the entire website?
If it applies to the entire website: we have already added Disallow rules at the homepage level blocking a few folders. How will crawlers handle the two sets of "Disallow: /" directives?
Thanks
-
Hi vtmoz,
You've received some great responses! Did any of them help answer your question? If so, please mark one or more as a "good answer." And if not, please let us know how we can help. Thanks!
Christy
-
If you have concerns, I strongly recommend using Google Search Console to test URL use cases against your existing robots.txt file before you make any potential edits.
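If you also want to sanity-check URLs outside of Search Console, here is a minimal sketch using Python's standard-library robots.txt parser; the domain and paths are placeholders you would swap for your own site:

from urllib.robotparser import RobotFileParser

# Load the live robots.txt file (placeholder domain - use your own).
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Check how specific URL use cases are handled before making edits.
for url in ("https://www.example.com/",
            "https://www.example.com/folder-name/page.html"):
    verdict = "allowed" if parser.can_fetch("*", url) else "blocked"
    print(url, "->", verdict)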
-
The directive that is literally "Disallow: /" will prevent crawling of all pages on your site, since technically all page paths begin with a slash. A robots.txt file is only honored at the root of a site (crawlers ignore one placed in a sub-directory), so if you want to disallow a single folder, you need to specify that with a directive like "Disallow: /folder-name/".
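To make the difference concrete, here is a small sketch (the folder name and URLs are only illustrative) that parses two rule sets with Python's standard-library parser and shows what each one blocks:

from urllib.robotparser import RobotFileParser

def check(rules, urls):
    # Parse an in-memory robots.txt rule set and report what it blocks.
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    for url in urls:
        verdict = "allowed" if parser.can_fetch("*", url) else "blocked"
        print(" ", url, "->", verdict)

urls = ["https://www.example.com/",
        "https://www.example.com/folder-name/page.html"]

print('"Disallow: /" blocks the entire site:')
check("User-agent: *\nDisallow: /", urls)

print('"Disallow: /folder-name/" blocks only that folder:')
check("User-agent: *\nDisallow: /folder-name/", urls)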
Related Questions
-
Magento 2.1 Multi Store / SEO
This is quite technical but I'm hoping a Magento expert can clear this up for me. Currently my company has two websites on separate OpenCart platforms. What I'm doing now is building a Magento website and using the multi-store function, as well as a few modules, to combine the two sites, the aim being that the link juice is shared and I can focus my SEO efforts on one site instead of two, reducing my workload while maintaining the benefits. This is the intended layout: www.domain.com and www.domain.com/us

I have created a sub-folder (not a subdomain) as this seems to be the best way to share link juice between the new, combined sites (as well as 301s from the old, redundant site). At the moment I have created 2 separate websites, stores and store views (see attached) and have configured it according to the Magento guide, so I know that technically this is correct, but I need to make sure that I have done it correctly in relation to SEO. Is the sub-folder set up correctly, for instance? Currently the only files in that sub-folder are a .htaccess, an error log and index.php (see attached). Also, is there anything I could be missing in relation to SEO within the parameters of what I am trying to achieve?

Additionally, only one store view appears in the "change store view" section of the home page. This is causing me to question whether I have set it up correctly, because I had assumed both store views would appear even if they were under different websites (attached). Or do I simply use the same website and create two stores and store views? Do I also need to create a separate database for each website/store/store view? I would very much appreciate if someone could help out here. Thank you.
Web Design | | moon-boots0 -
We added hundreds of pages to our website & restructured the layout to include 3 additional locations within the sub-pages, same brand/domain name. How long could Google take to crawl/index the new pages and rank the keywords used within those pages?
We added hundreds of pages to our website & restructured the layout to include 3 additional locations within the sub-pages, same brand/domain name. The 3 locations' old domains were redirected to their sites within our main brand domain. How long could Google take to crawl/index the new pages and rank for the keywords used within those pages? And hopefully increase our domain authority as well? We didn't want our brand spread out over multiple websites/domains on the internet. This also allowed for more content to be written on the pages for each of our locations' services.
Web Design | | BurgSimpson0 -
My news site not showing in "In the news" list on Google Web Search
I have a news website (www.tapscape.com) which is 6 years old and has been on Google News since 2012. However, whenever I publish a news article, it never shows up in the "In the news" list on Google Web Search. I have already added schema.org/NewsArticle markup to the website and checked whether it's working with the Google structured data testing tool; everything shows up fine there. The site already has a news sitemap (http://www.tapscape.com/news-sitemap.xml) which has been added to Google Webmaster Tools. News articles show up perfectly fine in the News tab, but why aren't the articles being shown in the "In the news" list on Google web search? My site has a strong backlink background already, so I don't think I need to work on the backlinks. Please let me know what I'm doing wrong, and how I can get the news articles into the "In the news" list. Below is a screenshot that I have attached to this question to help you understand what I mean.
Web Design | | hakhan2010 -
WordPress themes causing a Google penalty (need experts to settle a debate)
Hi, I have been having a disagreement with another online marketing company. We are both promoting the same product under different brand names, and we ended up using the same theme to build our WordPress sites, but the content is in no way the same. They are telling me that using the same theme in the same industry will cause a Google penalty. I do not believe this and do not see it causing a problem. The sites are relatively new, so there is no proof of traffic dropping or penalties as of yet. What is everyone's professional opinion on this? Can a WordPress theme cause a duplicate content penalty? If so, would that not mean that anyone using themes would have some sort of penalty?
Web Design | | impact891 -
Nesting <a> tag for rel="nofollow"
I just wanted to quickly run this past someone. I have some footer links I want set to nofollow. The current markup is an image map:

<map name="Map"><area shape="rect" coords="10,39,73,101" href="URL" target="_blank" alt="some text">

Should this be:

<map name="Map"><area shape="rect" coords="10,39,73,101"><a href="URL" rel="nofollow" target="_blank" alt="some text"></a>

Want to check before I advise. If this is not the way, how can I fix it?
Web Design | | MickEdwards0 -
Comparing the site structure/design of my live site to my new design
Hi SEOmoz team, for the last few months I've been working on a new design for my website; the old, live design can be viewed at http://www.concerthotels.com - it is primarily focused on helping users find hotels close to concert venues throughout North America.

The old structure was built in such a way that each concert venue had a number of different pages associated with it (all connected via tabs) - a page with information about the venue, a page with nearby hotels to the venue, a page of upcoming events, and a page of venue reviews. An example of these pages can be seen at:
http://www.concerthotels.com/venue/madison-square-garden/304484
http://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484
http://www.concerthotels.com/venue-events/madison-square-garden-events/304484
http://www.concerthotels.com/venue-reviews/madison-square-garden-reviews/304484

The /venue-hotels/ pages are the most important pages on my website - there is one of these pages for each concert venue, and they are the landing pages for about 90% of the traffic on the website. I decided that having four pages for each venue was probably a poor design, since many of the pages ended up having little or no useful, unique content. So my new design attempts to bring a lot of the venue information together into fewer pages.

My new website redesign is temporarily situated at (not currently launched to the public):
http://www.concerthotels.com/frontend

The equivalent pages for Madison Square Garden are now:
http://www.concerthotels.com/frontend/venue/madison-square-garden/304484 (the page above contains venue information, events and reviews)
and
http://www.concerthotels.com/frontend/venue-hotels/madison-square-garden-hotels/304484

I would really appreciate any feedback from you guys, based on what you think of the new site design compared to the old design from an SEO point of view. Of course, any feedback on site speed, ease of use etc. compared to the old design would also be greatly appreciated. 🙂

My main fear is that when I launch the new design (the new URLs will be identical to the old ones), Google will take a dislike to it - I currently receive a large percentage of my traffic through Google organic search, so I don't want to launch a design that might damage that traffic. My gut instinct tells me that Google should prefer the new design - vastly reduced number of pages, each page now contains more unique content, and it's very much designed for users, so I'm hoping bounce rate, conversion etc. will improve too. But my gut has been wrong in the past! 🙂

But I'd love to hear your thoughts, and thanks in advance for any feedback. Cheers, Mike
Web Design | | mjk260 -
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC

Timeline:
At the end of March we released a site redesign in HTML5
As part of the redesign we used multiple H1s, both for nested articles on the homepage and for content sections other than articles on a page. In summary, our pages have a lot of H1s compared to other notable sites that use HTML5 and only one H1 (some of these are the biggest sites on the web) - yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1 - H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch (https://github.com/h5bp/html5-boilerplate/issues/1086). Not sure if this could be driving our indexing issues described below. Situation:
Since the redesign, when we query an article title, Google lists the homepage, category page or tag page that the article resides on. Most of the time the homepage ranks for the article query.
If we link directly to the article pages from a relevant internal page it does not help Google index the correct page. If we link to an article from an external site it does not help Google index the correct page. Here are some images of some example query results for our article titles: Homepage ranks for article title aged 5 hours
http://imgur.com/yNVU2 Homepage ranks for article title aged 36 min.
http://imgur.com/5RZgB Homepage at uncategorized page listed instead of article for exact match article query
http://imgur.com/MddcE Article aged over 10 days indexing correctly. So yes, it is possible for Google to index our article pages, but only after a long delay.
http://imgur.com/mZhmd What we have done so far:
-Removed the H1 tag from the site wide domain link
-Made the article title a link, replicating how it was on the old version.
-Applying the Modernizr patch today to correct the blank caching issue. We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25 H1s) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All the HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands headers and sub-headers, and that multiple H1s are okay, etc., but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks
Web Design | | mcluna0 -
Searching for BEST e-Commerce Multilanguage Platform, Need Advice
I have 2 online stores in Canada: one bilingual (English & French) store, Filtration Montreal, and a unilingual store, Furnace Filters Canada. Both offer the same products at the same prices to Canadians. We work hard to rank on Google.ca because we only sell and ship to Canada.

The platform of Filtration Montreal is very basic and limited. For example, the URL structure makes it very hard to rank on Google.ca; the platform is not SEO friendly, with URLs like: http://www.filtrationmontreal.com/en/product/honeywell-genuine-filter-95/pack-of-5-genuine-honeywell-furnace-filters-20x20x5-601.html The only good thing about this platform is the multilingual option: the customer can shop in French or English. I would like to move that store to a new platform where I can create a multilingual online store. Do you have suggestions?

Furnace Filters Canada is on BigCommerce, which I find SEO friendly. Using SEOmoz tools, and being new to SEO, highly competitive keywords like "furnace filters" and "furnace filter" are ranking 3rd on the first page of Google.ca! The site is getting more and more visitors every month. The only frustrating thing is that the store is English-only for customers.

QUESTIONS: What will be the SEO impact of moving Furnace Filters Canada to a new platform? Do you have suggestions for finding the perfect multilingual e-commerce platform? Andrew Bleakley suggests Ashop. Is anybody using Ashop? How about an e-commerce platform that can manage my 2 stores at the same time? REMEMBER, we sell and ship to Canada only. Thank you for your help and support. BigBlaze
Web Design | | BigBlaze2050