I am trying to block robots from indexing parts of my site.
-
I have a few websites that I mocked up for clients to check out my work and get a feel for the style I produce, but I don't want them indexed: they contain lorem ipsum placeholder text and aren't really optimized. I am in the process of optimizing them, but for the time being I would like to block them. Most of the warnings and errors on my SEOmoz dashboard are from these sites. I was going to add the following to the robots.txt file, but I want to make sure this is correct:
User-agent: *
Disallow: /salondemo/
Disallow: /salondemo3/
Disallow: /cafedemo/
Disallow: /portfolio1/
Disallow: /portfolio2/
Disallow: /portfolio3/
Disallow: /salondemo2/
Is this all I need to do?
Thanks
Donny
-
Thanks
-
This is the correct approach when using the robots.txt method of blocking. Be aware, however, that the only way to ensure 100% that such locations are not indexed is to put them behind a password-protected gateway. I always recommend that design agencies place a simple login screen between the front end and the design folders. This can be as complex as a unique username and password for every client, or as simple as a single shared login, if all you want to do is bar search engines from seeing the content.
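As a sketch of the gateway approach described above, assuming an Apache server with mod_auth_basic enabled (the password-file path and realm name here are illustrative, not from the original thread): a minimal .htaccess placed in each demo folder.

```apache
# .htaccess in e.g. /salondemo/ -- assumes Apache with mod_auth_basic
AuthType Basic
AuthName "Client Preview"
# Path is an assumption; create the file with: htpasswd -c /path/to/.htpasswd client
AuthUserFile /path/to/.htpasswd
Require valid-user
```

With this in place, crawlers receive a 401 response and cannot fetch the content at all, which is why it is stronger than robots.txt (robots.txt is advisory and does not stop a blocked URL from being listed if it is linked elsewhere).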
Related Questions
-
Does anyone know whether the linking of hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls ('Pages with 404 errors'). Has anyone got any experience, please? For example, at the bottom of this blog post https://www.poppyandperle.com/post/face-painting-a-global-language the hashtags are linked, but they don't go to a page; they go to search results of all other blogs using that hashtag. Seems a bit of a strange approach to me.
Technical SEO | Mediaholix
How to set up an iFrame to be indexed as the parent site
Hi, we are trying to move all of our website content from www.mysite.com to a subdomain (i.e. content.mysite.com), and make "www.mysite.com" nothing more than an iFrame displaying the content from content.mysite.com. We have about 10 pages linking from the home page, all indexed separately, so I understand we'll have to do this for every one of them. (www.mysite.com/contact will be an iframe containing the content from content.mysite.com/contact, and we'll need to do this for every page) How do we do this so Google continues to index the content hosted at content.mysite.com with the parent page in organic results (www.mysite.com). We want all users to enter the site through www.mysite.com or www.mysite.com/xxxxxx, which will contain no content except for iFrames pulling in content from content.mysite.com. Our fear is that google will start directing users directly to content.mysite.com, rather than continue feeding to www.mysite.com. If we use www1.mysite.com or www2.mysite.com as the location of the content, instead of say content.mysite.com, would these subdomain names work better for passing credit for the iFramed content to the parent page (www.mysite.com)? Thanks! SIDE NOTE: Before someone asks why we need to do this, the content on mysite.com ranks very well, but site has a huge bounce rate due to a poorly designed CMS serving the content. The CMS does not load the page in pieces (like most pages load), but instead presents the visitor with a 100% blank page while the page loads in the background for about 5-10 seconds, and then boom 100% of the page shows up. We've been back and forth with our CMS provider about doing something about this for 5 years now, and we have given up. We tested moving our adwords links to xyz.mysite.com, where users are immediately shown a loading indicator, with our site (www.mysite.com) behind it in an iFrame. The immediate result was resounding success... 
our bounce rate PLUMMETED, and the root domain www.mysite.com saw a huge boost in search results. Problem with this is our site still comes up in organic results as www.mysite.com, which does not have any kind of spinning disk loading indicator, and still has a very high bounce rate.
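A minimal sketch of the wrapper page the question describes (the URLs are the question's own placeholders). Note that search engines generally attribute iframed content to the URL it is actually hosted at, so this setup carries exactly the risk the question raises: the content.mysite.com URLs, not the www shell pages, are what Google sees as holding the content.

```html
<!-- www.mysite.com/contact: a shell page whose only content is the iframe -->
<!DOCTYPE html>
<html>
<head><title>Contact</title></head>
<body>
  <iframe src="http://content.mysite.com/contact"
          style="border:0; width:100%; height:100vh"></iframe>
</body>
</html>
```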
Technical SEO | vezaus
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
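A hedged sketch of the usual way to noindex an entire subdomain like this (the tag below is standard; whether it suits this site's templates is an assumption):

```html
<!-- in the <head> of every page served from beta.website.com -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth stating: don't also Disallow the beta subdomain in robots.txt while doing this, or crawlers will never fetch the pages and so never see the tag. A URL removal request in Google Webmaster Tools can speed up dropping the already-cached pages.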
Technical SEO | BVREID
"Extremely high number of URLs" warning for robots.txt blocked pages
I have a section of my site that is exclusively for tracking redirects for paid ads. All URLs under this path do a 302 redirect through our ad tracking system: http://www.mysite.com/trackingredirect/blue-widgets?ad_id=1234567 --302--> http://www.mysite.com/blue-widgets This path of the site is blocked by our robots.txt, and none of the pages show up in a site: search.
User-agent: *
Disallow: /trackingredirect
However, I keep receiving messages in Google Webmaster Tools about an "extremely high number of URLs", and the URLs listed are in my redirect directory, which is ostensibly not indexed. If not by robots.txt, how can I keep Googlebot from wasting crawl time on these millions of /trackingredirect/ links?
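The warning here is triggered by URL discovery, not indexing: Googlebot finds the /trackingredirect/ links on crawlable pages even though robots.txt stops it from fetching them. A hedged sketch of the common mitigation, adding rel="nofollow" to the ad-tracking links themselves (the href is the question's own example URL; the anchor text is illustrative):

```html
<a href="http://www.mysite.com/trackingredirect/blue-widgets?ad_id=1234567"
   rel="nofollow">Blue widgets</a>
```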
Technical SEO | EhrenReilly
Page Indexing increase when I request Google Site Link demote
Hi there, Has anyone seen a page crawling increase in Google Webmaster Tools when they have requested a site link demotion? I did this around the 23rd of March; the next day page crawling started to rise and rise, showing a very visible spike in activity, and to this day it is still relatively high. From memory, I asked about this in SEOmoz Q&A a couple of years ago and was told that page crawl activity is a good thing - ok fine, no argument. However, at nearly the same time I noticed that my primary keyword rank for my home page dropped away to somewhere around the 4th page on Google US, and since March it has stayed there. Yet the exact same query on Google UK (using SEOmoz Rank Checker for this) has remained in the same position (around 11th) - it has barely moved. I decided to request an undemote in GWT for this page link, and the page crawl started to drop, but not to the level before March 23rd. However, the rank situation for this keyword term has not changed, and the content on our website has not changed, but something has come adrift with our US ranks. Using Open Site Explorer, not one competitor listed has a higher domain authority than our site - page authority, domain links, you name it - but they sit there on the first page. Sorry, the above is a little bit of frustration; this question is not impulsive. I have sat for weeks analyzing causes and effects but cannot see why this disparity is happening between the two countries' ranks when it has never happened for this length of time before. Ironically, we are still number one in the United States for a keyword phrase which I moved away from over a month ago and do not refer to at all on our index page!! Bizarre. Granted, site link demotion may have no correlation to the KW ranking impact, but looking at the activities carried out on the site and the timing of the page crawling, this is the only sizable factor I can identify that could be the cause. Oh, and the SEOmoz 'On-Page Optimization Tool' reports that the home page gets an 'A' for this KW term. I have, however, this week commented out the canonical tag in the index page header for the moment to see if this has any effect. Why? Because this was another (if minor) change I employed to get the site to an 'A' grade with the tool. Any ideas or help appreciated as to what could be causing the rank differences. One final note: the North American ranks were initially high, circa 11-12th, but then dropped away to the 4th page; the UK rankings witnessed no impact. Sorry, one final thing: the rank in the US is my statistical outlier. Using Google Analytics, I have an average rank position of about 3 across all countries where our company appears for this term; include the US and it pushes the average to 8/9th. Thanks David
Technical SEO | David-E-Carey
Mobile site - allow robot traffic
Hi, If a user comes to our site from a mobile device, we redirect to our mobile site. That is, www.mysite/mypage redirects to m.mysite/mypage. Right now we are blocking robots from crawling our m. site; previously there were concerns the m. site could rank for normal browser searches. To make sure this isn't a problem, we are planning to add rel=canonical tags on our m. site pages referencing the www pages (mobile is just a different version of our www site). From my understanding, having a mobile version of a page is a ranking factor for mobile searches, so allowing robots is a good thing. Before doing so, I wanted to see if anyone had any other suggestions/feedback (looking for potential pitfalls, issues, etc.)
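A sketch of the bidirectional annotations Google documents for separate mobile URLs (the paths follow the question's example; the max-width value is Google's documented pattern, not something from this thread):

```html
<!-- on www.mysite.com/mypage (desktop version) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.mysite.com/mypage">

<!-- on m.mysite.com/mypage (mobile version) -->
<link rel="canonical" href="http://www.mysite.com/mypage">
```

The canonical on the m. pages addresses the original concern (the m. site ranking for desktop searches), while the alternate annotation tells Google the two URLs are the same content in two forms.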
Technical SEO | NicB1
Removing a site from Google's index
We have a site we'd like to have pulled from Google's index. Back in late June, we disallowed robot access to the site through the robots.txt file and added a robots meta tag with "noindex, nofollow" directives. The expectation was that Google would eventually crawl the site and remove it from the index in response to those tags. The problem is that Google hasn't come back to crawl the site since late May. Is there a way to speed up this process and communicate to Google that we want the entire site out of the index, or do we just have to wait until it's eventually crawled again?
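One detail worth flagging in the setup described: a robots.txt Disallow prevents crawlers from fetching the pages at all, so they never see the noindex meta tag, and the two measures work against each other. A sketch of the combination that lets the tag be read (robots.txt left open, the meta tag doing the de-indexing work):

```html
<!-- robots.txt: leave the site crawlable, e.g. no Disallow rules,
     so crawlers can fetch each page and see the tag below -->

<!-- in the <head> of every page: -->
<meta name="robots" content="noindex, nofollow">
```

For faster removal than waiting on a recrawl, Google Webmaster Tools also offers a site removal request, which can accompany this.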
Technical SEO | issuebasedmedia
Problem with my site
The site is casino.pt. We created the site 7-8 months ago and started to push it with good and natural links (http://www.opensiteexplorer.org/www.casino.pt/a!links!!filter!all!!source!external!!target!page), links on content-rich sites, most of them related to gambling and sports topics. During the first 3-5 months the rankings got better and better; after the 6th month, the site lost all its rankings. Additional details: http://www.casino.pt/robots.txt http://www.google.pt/#hl=pt-PT&source=hp&biw=1280&bih=805&q=site:http%3A%2F%2Fwww.casino.pt&aq=f&aqi=&aql=&oq=&fp=2651649a33cd228 No critical errors in Google Webmaster Tools. Any idea how I can fix it? Thanks
Technical SEO | Yaron53