This related thread may help: http://wordpress.org/support/topic/found-malicious-user-agent
Robots: You have a VIRTUAL robots.txt, generated by WordPress on request, which is why you can't find the file server-side. That virtual robots.txt is very basic, though.
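For reference, the virtual file WordPress serves varies a little by version, but it's typically just something like this (a default-style example, not pulled from your site):

```
User-agent: *
Disallow: /wp-admin/
```

Older versions also add a `Disallow: /wp-includes/` line; you can check yours by visiting yoursite.com/robots.txt in a browser.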
I always upload a custom physical robots.txt file to the WordPress root folder. There's a basic guide to uploading the file to your site, and there's also a basic robots.txt file for WordPress you can download and use at that link.
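If you'd rather do it by hand, the idea is just: create the file locally, then copy it to the site root. A minimal sketch (this example file is my own, not the one from the linked guide, and the host/path in the upload command are placeholders you'd substitute):

```shell
# Create a basic robots.txt for WordPress locally
cat > robots.txt <<'EOF'
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
EOF

# Then copy it into the WordPress root on your server, e.g. over SCP
# (hypothetical host and path -- use your own):
# scp robots.txt user@example.com:/var/www/html/robots.txt
```

Once a physical robots.txt exists in the root, WordPress stops serving the virtual one for that URL.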
Robots.txt is there to tell web spiders what they may crawl and what they should not. Bots that don't read this file before crawling the site, or that crawl disallowed files and folders anyway, are known as rule-breakers.
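To see how a well-behaved bot reads those rules, here's a small sketch using Python's standard urllib.robotparser (the rules and URLs are made-up examples):

```python
from urllib import robotparser

# Example robots.txt rules a polite crawler would check before fetching
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A rule-following bot asks before each request:
print(rp.can_fetch("MyBot", "https://example.com/about/"))                    # True
print(rp.can_fetch("MyBot", "https://example.com/wp-admin/"))                 # False
print(rp.can_fetch("MyBot", "https://example.com/wp-admin/admin-ajax.php"))   # True
```

A rule-breaker simply skips this check, which is why robots.txt alone can't keep bad bots out; it only guides the honest ones.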
"Should I be doing something to clean up my site"
OSE Firewall is stopping external requests. Looks like it's doing its job :)
Requests for odd-looking files you don't have may be hacker bots or hackers probing for vulnerabilities. If those files don't exist, your site is probably OK. Or they could just be backlinks with typos...
If in doubt, run a malware/vulnerability scanner plugin to check, or have a security specialist examine the site.