roimediaworks
@roimediaworks
Job Title: CEO & Founder
Company: ROI Media Works
Website Description: Canadian payroll services company offering flat-fee services to busy business owners.
Favorite Thing about SEO: Measuring Return on Investment (ROI) for Clients' Campaigns
Latest posts made by roimediaworks
- RE: How do you block development servers with robots.txt?
Best posts made by roimediaworks
- RE: How do you block development servers with robots.txt?
Like Daniel said, you can use robots.txt to block spiders, but that won't guarantee the URLs stay out of search results. You could use the X-Robots-Tag header in the server response, or return a 403 whenever a search engine's user agent hits the subdomain.
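A minimal sketch of both ideas, assuming a Flask app and a hypothetical dev subdomain "dev.example.com" (the crawler substrings are examples only); in practice you would more likely set the header or the 403 in your Apache/Nginx config:

```python
from flask import Flask, request, abort

app = Flask(__name__)

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")  # example crawler substrings
DEV_HOST = "dev.example.com"                        # hypothetical dev subdomain


@app.before_request
def block_crawlers_on_dev():
    # Return a 403 whenever a known crawler user agent hits the dev subdomain.
    if request.host == DEV_HOST:
        ua = (request.headers.get("User-Agent") or "").lower()
        if any(bot in ua for bot in BOT_SIGNATURES):
            abort(403)


@app.after_request
def add_x_robots_tag(response):
    # Belt and braces: tell engines not to index or follow anything on dev.
    if request.host == DEV_HOST:
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response


@app.route("/")
def index():
    return "dev build"
```

Unlike robots.txt, the X-Robots-Tag header is an indexing directive, so pages that do get crawled are dropped from the index rather than merely left uncrawled.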
I have always been drawn to disparate things: places, people and ideas. Growing up in India and then moving to London, experiencing different languages and cultures, has serendipitously prepared me to navigate and flourish in today's digital world.