Hi,
The new version is safe to use; it is already running on thousands of blogs. With so many different configurations of web servers, other plugins and so on, it is normal that a few of them run into issues. The plugin can’t be tested on every configuration before a release.
Feel free to upgrade. If you run into any problems, you can always downgrade to 3.4.1 again.
Thank you very much. It seems to have worked. In my blog directory I now have sitemap.backup.xml and sitemap.backup.xml.gz – I take it that’s correct?
But I have another question regarding the robots.txt file. In your settings, the option “Add sitemap URL to the virtual robots.txt file. The virtual robots.txt generated by WordPress is used. A real robots.txt file must NOT exist in the blog directory!” is ticked automatically. However, I have my own robots.txt file on the server, which contains:
Sitemap: http://mydomainname.com/sitemap.xml
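For reference, a real robots.txt that carries a sitemap directive usually also includes at least one User-agent rule, roughly like the following (the domain is my placeholder from above, and the Disallow line is only an example of a common WordPress rule, not necessarily what you need):

User-agent: *
Disallow: /wp-admin/
Sitemap: http://mydomainname.com/sitemap.xml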
Do I untick your option or keep it checked?
Thank you.
One more thing to add to my previous reply: I checked the index status in Webmaster Tools, and all my previously indexed pages have disappeared – Google is indexing my site from scratch. Is this how it is supposed to be? Also, how long will it take before all my URLs are indexed?