A page that is disallowed in the robots.txt file will still show up as a URL-only entry in the SERPs, because search engines know that a page exists at that URL from the links that point to it.
If you want the page to not appear at all, then you need the <meta name="robots" content="noindex"> tag on the page instead.
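A minimal sketch of the difference between the two approaches (the /private/ path is just a placeholder):

```
# robots.txt — blocks crawling, but the URL can still appear in SERPs
# if other sites link to it
User-agent: *
Disallow: /private/
```

```html
<!-- in the <head> of the page itself — the page can be crawled,
     but search engines are told not to index it -->
<meta name="robots" content="noindex">
```

Note the two don't mix well: if robots.txt blocks the page, the crawler never sees the noindex tag, so don't disallow a page you've tagged noindex.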
Google tips:
http://www.google.com/support/webmasters/bin/topic.py?topic=8843
The only way Google will spider your site is if it finds a link to it from somewhere else.
Make sure you turn off “pinging” for now (the update services WordPress notifies when you publish) – as that will be the first place you’ll get links from.
If your sitename (domain) is brand new – and no one has ever had it before – that should be enough. Just make sure you don’t go telling anyone about the site or asking others for links yet.
Or you can install XAMPP locally and finish the design of your WP site on your own machine before uploading it online. Since XAMPP lets you create a working database, you will also be able to write test posts, etc.
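For a local XAMPP install, the database settings in wp-config.php would look something like this – a sketch assuming XAMPP's default MySQL setup (root user, empty password) and a hypothetical database name:

```php
// wp-config.php — local XAMPP defaults.
// "wp_test" is a placeholder name for a database you create
// yourself in phpMyAdmin; root/empty-password is XAMPP's default.
define( 'DB_NAME', 'wp_test' );
define( 'DB_USER', 'root' );
define( 'DB_PASSWORD', '' );
define( 'DB_HOST', 'localhost' );
```

When you move the site online, you swap these values for your web host's database credentials.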
You can also block everyone but you from seeing the site using an .htaccess rule (replace yourIPAddress with your own IP; the line order doesn't matter here, since the Order directive controls precedence):

Order deny,allow
Deny from all
Allow from yourIPAddress
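Note that the Order/Deny/Allow syntax is for Apache 2.2; if your host runs Apache 2.4+, the equivalent rule (again with yourIPAddress as a placeholder) is:

```
# Apache 2.4+ (mod_authz_core) — deny everyone except your own IP
Require ip yourIPAddress
```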