I just recently noticed something with WordPress (using the latest 2.9.2; also tried on lower versions, both standard WordPress and WordPress MU).
I'm currently building a site with only pages (no posts). When I visit the /robots.txt URL on my blog, I get a 404 Not Found error instead of the virtual robots.txt file. I couldn't understand why some blogs I've built had a virtual robots.txt and some hadn't. Since I'm using WordPress MU, I'd rather create a global mu-plugin that automatically writes lines to every blog's virtual robots.txt when it is requested. That lets me add rules to the virtual robots.txt file from code.
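For reference, here is a minimal sketch of what such a mu-plugin could look like. It assumes WordPress's `do_robots` action hook (which fires while the virtual robots.txt is being generated); the function name and the example Disallow rules are my own, not anything from core:

```php
<?php
/*
Plugin Name: Global Robots Rules
Description: Adds extra lines to every blog's virtual robots.txt.
*/

// do_robots() fires the 'do_robots' action before printing its own
// default "User-agent: *" block, so anything echoed here appears at
// the top of the virtual robots.txt output.
function my_global_robots_rules() {
    echo "User-agent: *\n";
    echo "Disallow: /wp-admin/\n";
    echo "Disallow: /wp-includes/\n";
}
add_action( 'do_robots', 'my_global_robots_rules' );
?>
```

Dropped into wp-content/mu-plugins/, this runs on every blog in the MU install without needing per-blog activation. Of course, it only matters once WordPress actually serves the virtual file, which is exactly the problem described below.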
When I later added a public post to the pages-only blog, WordPress suddenly decided to serve the virtual robots.txt file. If I delete the post, the virtual robots.txt disappears again. If I set the post(s) to private, the virtual robots.txt file is only available to me as admin.
This looks like a bug in WordPress that still hasn't been resolved. I find it odd that no one seems to have reported this issue yet.
Currently the only workarounds I know of are to manually add a static robots.txt file to your root folder, install a plugin such as KB Robots.txt, or add at least one public post to your blog.
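If you go the manual route, a minimal static robots.txt dropped into the web root could look like this (the Disallow paths are just common examples, adjust them to your site):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

Note that a physical file in the root takes precedence, so WordPress will no longer generate the virtual one for that site.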
Hope this helps you m8,