Disallow: /sr/ and Disallow: /si/ - robots.txt
-
Hello Mozzers - I have come across the two directives above in the robots.txt file of a website. The web dev who implemented the robots.txt isn't sure what they mean - I think it's just legacy stuff that nobody has analysed for years. I vaguely recall that sr means "search request", but I can't remember.
If any of you know what these directives do, then please let me know.
-
Thanks Tomas and Mike - good advice. I have done that and found legacy stuff they've since moved away from; there is indeed no current use for the directives.
I wonder whether there's any resource on the web that lists common robots.txt directives and interprets them - if not, then perhaps it would be an idea for Moz?
-
Have a look at your site through http://web.archive.org/. You'll be able to see what the directories were used for.
However, if there's no use for them on the current site then what's the purpose of keeping these disallows in the robots.txt?
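For anyone auditing similar legacy rules, a quick way to confirm exactly what a Disallow line still blocks before deleting it is to test a few URLs against the live robots.txt. Just a rough sketch using Python's built-in robotparser - the domain and sample paths below are placeholders, not the actual site:

from urllib.robotparser import RobotFileParser

# Placeholder domain - point this at the site whose robots.txt you're auditing.
robots_url = "http://www.example.com/robots.txt"

rp = RobotFileParser()
rp.set_url(robots_url)
rp.read()  # fetches and parses the live robots.txt

# Sample URLs covering the directories in question - swap in real paths from your crawl or logs.
test_urls = [
    "http://www.example.com/sr/",
    "http://www.example.com/sr/some-old-search-request",
    "http://www.example.com/si/whatever",
    "http://www.example.com/a-normal-page",
]

for url in test_urls:
    blocked = not rp.can_fetch("*", url)
    print(f"{url} -> {'blocked' if blocked else 'allowed'} for user-agent *")

Cross-referencing any blocked paths against your server logs or a fresh crawl should confirm whether anything still lives under /sr/ or /si/ before the rules are removed.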
-
Related Questions
-
Doing SEO for single page applications / Prerender.io
My dev and I are migrating an existing multi-page application to a single page application with prerender.io. Does anybody have any experience with doing SEO for single page applications? Are there any other consequences we should take into account, or anything important we should expect? Any insights would be 10/10 appreciated.
Web Design | Edward_Sturm
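Not a direct answer to the question above, but for anyone unfamiliar with how prerender-style setups generally work: the usual pattern is middleware that spots crawler user agents and serves them a prerendered HTML snapshot, while normal visitors get the JavaScript app. A rough WSGI-style sketch of that idea follows - the service URL and token header are based on what prerender.io's own middleware commonly uses, but treat them as assumptions and check the official docs; the domain and token are placeholders:

from urllib.request import Request, urlopen

BOT_AGENTS = ("googlebot", "bingbot", "yandex", "baiduspider",
              "facebookexternalhit", "twitterbot", "linkedinbot")

PRERENDER_SERVICE = "https://service.prerender.io/"   # assumption - verify against prerender.io's docs
PRERENDER_TOKEN = "YOUR_PRERENDER_TOKEN"              # placeholder

def should_prerender(environ):
    # Serve prerendered HTML only to known crawlers requesting a page (not a static asset).
    ua = environ.get("HTTP_USER_AGENT", "").lower()
    path = environ.get("PATH_INFO", "")
    is_bot = any(bot in ua for bot in BOT_AGENTS)
    is_asset = path.rsplit(".", 1)[-1] in {"js", "css", "png", "jpg", "xml", "ico"}
    return is_bot and not is_asset

def app(environ, start_response):
    if should_prerender(environ):
        # Ask the prerender service for a fully rendered snapshot of the requested URL.
        url = PRERENDER_SERVICE + "https://www.example.com" + environ.get("PATH_INFO", "/")
        req = Request(url, headers={"X-Prerender-Token": PRERENDER_TOKEN})
        body = urlopen(req).read()
    else:
        body = b"<html><body>SPA shell served to normal visitors</body></html>"
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [body]

The main SEO consideration is making sure crawlers receive fully rendered HTML for every indexable route, not just the application shell.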
Privacy Policy: index it? And where to place it?
Hi Everyone, Two questions. First: should you allow Google to index your privacy policy? Second: for a service-based site (not e-commerce, not selling anything), should you put the policy in the footer so it's site-wide, or just on the "contact us" form page? Best, Ruben
Web Design | KempRugeLawGroup
Managing website content/keywords for wordpress site
We are in the midst of redesigning our website and have been working with freelance blog/content writers to increase the unique content on our site. We are finding it increasingly difficult to manage the topics/keywords as we continue to expand. Google Drive and Google spreadsheets have been our primary tools thus far. Can anyone recommend a good tool that would allow us to manage content and blog posts for our site?
Web Design | Tom_Carc
Web development - License CSS/Markup/Code
In development of a website, is it typical for the developer to retain rights to the CSS, markup, and other code? If so, why is this done?
Web Design | DemiGR
Comparing the site structure/design of my live site to my new design
Hi SEOmoz team, for the last few months I've been working on a new design for my website; the old, live design can be viewed at http://www.concerthotels.com - it is primarily focused on helping users find hotels close to concert venues throughout North America.
The old structure was built in such a way that each concert venue had a number of different pages associated with it (all connected via tabs) - a page with information about the venue, a page with nearby hotels to the venue, a page of upcoming events, and a page of venue reviews. An example of these pages can be seen at:
http://www.concerthotels.com/venue/madison-square-garden/304484
http://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484
http://www.concerthotels.com/venue-events/madison-square-garden-events/304484
http://www.concerthotels.com/venue-reviews/madison-square-garden-reviews/304484
The /venue-hotels/ pages are the most important pages on my website - there is one of these pages for each concert venue, and they are the landing pages for about 90% of the traffic on the website. I decided that having four pages for each venue was probably a poor design, since many of the pages ended up having little or no useful, unique content. So my new design attempts to bring a lot of the venue information together into fewer pages.
My new website redesign is temporarily situated at (not currently launched to the public):
http://www.concerthotels.com/frontend
The equivalent pages for Madison Square Garden are now:
http://www.concerthotels.com/frontend/venue/madison-square-garden/304484 (this page contains venue information, events and reviews)
and
http://www.concerthotels.com/frontend/venue-hotels/madison-square-garden-hotels/304484
I would really appreciate any feedback from you guys, based on what you think of the new site design compared to the old design from an SEO point of view. Of course, any feedback on site speed, ease of use etc. compared to the old design would also be greatly appreciated. 🙂
My main fear is that when I launch the new design (the new URLs will be identical to the old ones), Google will take a dislike to it - I currently receive a large percentage of my traffic through Google organic search, so I don't want to launch a design that might damage that traffic. My gut instinct tells me that Google should prefer the new design - vastly reduced number of pages, each page now contains more unique content, and it's very much designed for users, so I'm hoping bounce rate, conversion etc. will improve too. But my gut has been wrong in the past! 🙂
I'd love to hear your thoughts, and thanks in advance for any feedback. Cheers, Mike
Web Design | mjk26
How to verify http://bizdetox.com for Google Webmaster Tools
Hey guys, I tried to make a Preferred Domain choice in Webmaster Tools, but it is not allowing me to save my choice because it's asking me to verify that I own http://bizdetox.com. How do I go about doing that, and what are the steps? I have already verified www.bizdetox.com.
Web Design | BizDetox
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC
Timeline:
At the end of March we released a site redesign in HTML5. As part of the redesign we used multiple H1s, both for nested articles on the homepage and for content sections other than articles on a page. In summary, our pages have many - I mean lots of - H1s compared to other notable sites that use HTML5 and only one H1 (some of these are the biggest sites on the web). Yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1 - H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch: https://github.com/h5bp/html5-boilerplate/issues/1086 - not sure if this could be driving our indexing issues as below.
Situation:
Since the redesign, when we query an article title, Google will list the homepage, category page or tag page that the article resides on. Most of the time the homepage ranks for the article query. If we link directly to the article pages from a relevant internal page, it does not help Google index the correct page. If we link to an article from an external site, it does not help Google index the correct page.
Here are some images of example query results for our article titles:
Homepage ranks for article title aged 5 hours: http://imgur.com/yNVU2
Homepage ranks for article title aged 36 min.: http://imgur.com/5RZgB
Homepage/uncategorized page listed instead of article for exact-match article query: http://imgur.com/MddcE
Article aged over 10 days indexing correctly (so yes, it is possible for Google to index our article pages, but again...): http://imgur.com/mZhmd
What we have done so far:
-Removed the H1 tag from the site-wide domain link
-Made the article title a link (how it was on the old version, so replicating that)
-Applying the Modernizr patch today to correct the blank caching issue
We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25 H1s) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands headers and sub-sections, and that multiple H1s are okay, etc... but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks
Web Design | mcluna
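Since much of the question above comes down to how many H1s each template actually renders, a quick way to audit that across a handful of URLs is a short script. Just a sketch using Python's standard library, with placeholder URLs:

from html.parser import HTMLParser
from urllib.request import urlopen

class HeadingCounter(HTMLParser):
    # Counts occurrences of each heading tag (h1-h6) in a page.
    def __init__(self):
        super().__init__()
        self.counts = {f"h{i}": 0 for i in range(1, 7)}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

# Placeholder URLs - swap in your homepage, a category page and an article page.
for url in ["http://www.example.com/", "http://www.example.com/example-article"]:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    counter = HeadingCounter()
    counter.feed(html)
    print(url, counter.counts)

Running it against the homepage, a category page and an article page shows at a glance how the heading counts compare across templates.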
Hosting/design company that is both cheap and has a nice partner package. Any ideas?
I need to sign on with a hosting/design company that is both cheap and has a nice partner package. Any ideas?
Web Design | christinarule