Questions created by AaronWintersCSUS
Disallow doesn't disallow anything!?
Been trying to exclude a ton of backed-up files on our domain that have to stay public while people transition content over. I have tried everything: searching the subdirectory by name, and updating robots.txt with Disallow and Noindex rules (both with and without a trailing / or /*), and I still get almost triple the number of 'actual' pages. Is there any way to get cleaner results, aside from manually sorting and cutting rows in the CSV?
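In case it helps to rule out a syntax problem, here is a minimal sketch for testing robots.txt rules offline with Python's standard-library urllib.robotparser; the /backup/ path and example.edu URLs are placeholders, not the actual site. Note also that Noindex was never part of the robots.txt standard (Google dropped its unofficial support in 2019), so only the Disallow lines are likely doing anything.

```python
# Minimal sketch: sanity-check robots.txt rules locally before relying
# on a crawler to honor them. The "/backup/" path and example.edu URLs
# below are hypothetical placeholders -- substitute the real values.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /backup/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Each tuple: (URL to test, whether we expect it to be fetchable).
tests = [
    ("https://example.edu/backup/old-page.html", False),
    ("https://example.edu/current/page.html", True),
]
for url, expected in tests:
    allowed = parser.can_fetch("*", url)
    status = "allowed" if allowed else "blocked"
    check = "ok" if allowed == expected else "UNEXPECTED"
    print(f"{url}: {status} ({check})")
```

If the rules pass a check like this but the extra pages still show up, the problem is probably not the robots.txt syntax but the tool's crawl or reporting settings, which is worth confirming before resorting to the CSV.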