If you (like me) have a custom robots.txt, the sitemap exclusions don't work (I'm sure I'm not telling you anything you didn't already know).
If I temporarily rename my custom robots.txt and let WordPress generate one, I can cut and paste the URLs into my custom robots.txt.
I wish there were a way to automate / streamline this.
I see there is a function, do_robots(), that can be used to generate the robots.txt output (http://codex.wordpress.org/Function_Reference/do_robots), but I currently don't have a clue how to use it practically, e.g. on a (private) page.
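For what it's worth, here is an untested sketch of what I have in mind: a shortcode that captures do_robots() output so it can be viewed on a private page. The shortcode name `show_robots` is just something I made up, and since do_robots() also sends a Content-Type header, this might trigger a "headers already sent" warning when run mid-page, so treat it as a rough idea only:

```php
<?php
// Untested sketch: [show_robots] shortcode that captures the output of
// WordPress's do_robots() so the generated rules can be read on a page.
// Note: do_robots() also calls header(), which may warn at this point.
add_shortcode( 'show_robots', function () {
    ob_start();
    do_robots();                 // echoes the WordPress-generated robots.txt rules
    $rules = ob_get_clean();     // capture them instead of printing directly
    return '<pre>' . esc_html( $rules ) . '</pre>';
} );
```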
That said, a feature suggestion for SiteTree might be a tab or button that shows the URLs to be added to the robots.txt. But if none of it works while a custom robots.txt is "live", then maybe it's a moot point.
Thx again for SiteTree. I am finding it very useful :)