Sorry if this is in the wrong forum, but I know several search engines use spider software that goes through the directories on your site and indexes the content (I think Google does this).
I have pages from old sites on my site, and rather than delete them I'd prefer they just not get indexed. I understand there's something about adding a text file to a directory and calling it robots.txt, or something along those lines.
Can anyone tell me more? Like, am I way off the mark? Is this true, and what should actually go in the file, etc.?
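From the bits I've pieced together so far, I'm guessing the file might look something like this (the /old-site/ folder name is just my made-up example, and I'm not sure I've got the syntax right):

    User-agent: *
    Disallow: /old-site/

Does that look anywhere close?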
thanks :thumb: