I received a Crawl Error notification stating that robots.txt was inaccessible and that the crawl would be postponed.
However, under "Blocked URLs" in Webmaster Tools, it shows that robots.txt was successfully accessed 16 hours ago with no URLs blocked, yielding the following content:
I started investigating this when I realised that my recent posts were not being indexed, and I assumed that was because the crawl was being postponed.
Is that assumption accurate? If so, why would Webmaster Tools show a valid robots.txt? That seems to conflict with the Crawl Error.
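As a sanity check independent of Webmaster Tools, the "no URLs blocked" claim can be tested locally with Python's standard `urllib.robotparser`. The robots.txt content below is a hypothetical allow-everything file (the actual content was not shown above), so this is only a sketch of the verification, not a statement about the real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; substitute the file actually served
# at http://www.evolvewithevo.com/robots.txt.
sample = """User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(sample.splitlines())

# An empty Disallow rule blocks nothing, so Googlebot may fetch any path.
print(parser.can_fetch("Googlebot", "http://www.evolvewithevo.com/some-recent-post"))
```

If this prints `True` for the real file's content but Google still reports robots.txt as inaccessible, the problem is more likely intermittent server errors (e.g. 5xx responses or timeouts when Googlebot fetches the file) than the file's rules.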
Thanks guys. For your reference, my website is http://www.evolvewithevo.com.