I think a lot of people out there have a similar need, so here is some information to help.
First, never expect a robots.txt file to protect your data. Sure, Google and other friendly web crawlers may stay away, but nefarious individuals out to do harm do not care about a robots.txt file. It is something a program CHOOSES to respect, so it's kind of like saying, "Cover your eyes and don't peek while I change my clothes." It doesn't really stop anyone. As you point out, the user can still download the file if they know the path, no matter what robots.txt says, which illustrates that it isn't security, just a suggestion.
For testing purposes I wrote a module that uploads a file to my site. After uploading the file, it streams a copy of what the user just uploaded back as a download that I save on my local machine. This is intended to test both the ability to save (upload) and retrieve (download) data written to a file. My local workstation is a Windows 7 machine running IIS 7.
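To give you an idea, here is roughly what that test module boils down to. This is just a sketch; the control IDs, page class, and the "UploadedFiles" folder (more on protecting it below) are stand-ins for whatever you have:

```csharp
using System;
using System.IO;
using System.Web.UI;

public partial class UploadTest : Page
{
    // FileUploadControl is an <asp:FileUpload> on the page (hypothetical ID).
    protected void UploadButton_Click(object sender, EventArgs e)
    {
        if (!FileUploadControl.HasFile)
            return;

        // Save the upload into the protected folder (see the options below).
        string fileName = Path.GetFileName(FileUploadControl.FileName);
        string savePath = Server.MapPath("~/UploadedFiles/" + fileName);
        FileUploadControl.SaveAs(savePath);

        // Immediately stream a copy back to the browser so I can save it
        // locally and confirm the round trip works.
        Response.Clear();
        Response.ContentType = "application/octet-stream";
        Response.AppendHeader("Content-Disposition",
            "attachment; filename=\"" + fileName + "\"");
        Response.TransmitFile(savePath);
        Response.End();
    }
}
```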
The best way to handle this is with the proper permissions on the folder; however, this is not as straightforward as one might expect. Setting permissions on a folder is best because it gives you the flexibility to use any directory you want. To accomplish this I created a directory in my website called "UploadedFiles" and then modified the folder permissions in the folder properties. There is a user named "Users (COMPUTERNAME\Users)". For this user I selected the option to Deny the "Read & execute", "List folder contents", and "Read" permissions. After doing this I was able to upload a file to the directory using my module and download it using my streaming download. I was not able to access the file using a URL that points to it; when I attempted this, I got a permission-denied error in the browser, which is exactly what I wanted. The specific account that needs the permission change might be different on each machine, but one of them should do it.
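If you would rather script that change than click through the properties dialog, the same Deny rule can be applied with the .NET ACL APIs. This is a minimal sketch, assuming the folder path and the "Users" account name shown here (both may differ on your machine):

```csharp
using System.IO;
using System.Security.AccessControl;

// Deny the local Users group read access on the upload folder.
// ReadAndExecute covers the GUI's "Read & execute", "List folder
// contents", and "Read" entries. In my test this blocked direct URL
// access while the upload module could still write and read the files.
string folder = @"C:\inetpub\wwwroot\MySite\UploadedFiles"; // hypothetical path

DirectorySecurity security = Directory.GetAccessControl(folder);
security.AddAccessRule(new FileSystemAccessRule(
    "Users",                                    // account name may vary per machine
    FileSystemRights.ReadAndExecute,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Deny));
Directory.SetAccessControl(folder, security);
```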
The one downside I experienced with this setup is that when I went to verify that the uploaded file was in the proper location, I was denied access through Windows Explorer. To verify that the file made it, I had to back out the permissions. This is a little annoying during testing, but once it goes live it wouldn't be a problem because no human is going to be browsing the directory structure. I don't think this will happen on a server because the user accounts are configured a little differently, with a dedicated account for IIS, but I haven't experimented that far yet.
A second option is to store the file contents in a database. YOU SHOULD NOT DO THIS. I only mention it here because it works and you may come across the suggestion elsewhere. The downside is slow I/O: saving and retrieving big chunks of binary data from the database is expensive. Since the files never have a physical path, the database method does protect them; the only way to get at a file is to explicitly retrieve it from the database record.
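For completeness only, since I just told you not to do it, the save side looks roughly like this. The table schema, connection string, and variable names are all hypothetical:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch only: assumes a table Files(FileName nvarchar(260), Data varbinary(max)).
static void SaveToDatabase(string connectionString, string fileName, byte[] fileBytes)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO Files (FileName, Data) VALUES (@name, @data)", conn))
    {
        cmd.Parameters.AddWithValue("@name", fileName);
        cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = fileBytes;
        conn.Open();
        cmd.ExecuteNonQuery(); // hauling the whole byte[] through the database is what makes this slow
    }
}
```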
A third option, if you can't figure out how to set up a protected directory by adjusting folder permissions, is to use the "App_Data" folder to store the files. ASP.NET refuses to serve anything under this folder directly, so anything you put in it will be safe from URL access. It was created to hold data files such as the database file, but it can be used for other needs. I created a folder, "/App_Data/MyModuleData", and was able to save a file to it using my file upload module and then stream the contents to a file download. I was not able to access the file using a URL in my browser.
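The only change from the earlier upload sketch is the save path; "MyModuleData" is the subfolder from my test, and the rest is the same hypothetical code:

```csharp
// Same upload flow as before, pointed at App_Data instead.
// No permission changes are needed here.
string fileName = Path.GetFileName(FileUploadControl.FileName);
string savePath = Server.MapPath("~/App_Data/MyModuleData/" + fileName);
Directory.CreateDirectory(Path.GetDirectoryName(savePath)); // create the subfolder if it's missing
FileUploadControl.SaveAs(savePath);
```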
All of these methods of protecting the file require special handling to facilitate file downloads. Files will need to be programmatically accessed and streamed to any user who needs them.
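As a starting point, here is a minimal sketch of a gatekeeper handler (a Download.ashx, for example) that does that streaming. The query-string parameter, folder, and authorization check are placeholders; substitute your own rules:

```csharp
using System.IO;
using System.Web;

public class Download : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Placeholder check: only serve files to logged-in users.
        if (context.User == null || !context.User.Identity.IsAuthenticated)
        {
            context.Response.StatusCode = 403;
            return;
        }

        // Never trust a raw path from the client; strip it to a bare file name
        // so nobody can walk out of the protected folder with "..\..".
        string fileName = Path.GetFileName(context.Request.QueryString["file"]);
        string fullPath = context.Server.MapPath("~/UploadedFiles/" + fileName);

        if (!File.Exists(fullPath))
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = "application/octet-stream";
        context.Response.AppendHeader("Content-Disposition",
            "attachment; filename=\"" + fileName + "\"");
        context.Response.TransmitFile(fullPath);
    }

    public bool IsReusable { get { return false; } }
}
```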
Hope this helps anyone who finds it. Good luck, people!