Hi there 🙂
This is a bug that’ll be resolved in v3.1, which will verify the site’s protocol (HTTP vs. HTTPS) beforehand.
Now, I recommend not altering the plugin’s code, because any changes will be overwritten by the next update. Instead, feel free to reach out whenever you run into problems.
But you don’t have to wait for v3.1 — you can already resolve this by switching an option.
On the SEO Settings page, go to the General Settings meta box. Under the Canonical tab, you’ll find the Scheme settings. Set this to “HTTPS”.
When that’s set, the plugin will only generate canonical URLs as HTTPS.
I hope this helps 🙂
Thank you so much, Sybre. I wasn’t expecting a reply so soon! This reinforces my belief in the plugin. Great work!
Also, on an unrelated topic, is there a way to add additional lines to the robots.txt file? I’d like to add a couple of noindex rules to the file. If need be, I’ll open another support thread.
Cheers 🙂
Yes, unrelated issues are best left for new topics.
In any case, you could either upload your own robots.txt file to the root folder (so you can easily manage it), use a robots.txt generation plugin, or use filters.
add_filter( 'robots_txt', function( $robots = '' ) {
	// Append to the output; don't overwrite it.
	$robots .= '# My new comment' . PHP_EOL;
	$robots .= 'My new rule' . PHP_EOL;

	// Always return the output in a filter.
	return $robots;
}, 11 ); // TSF overwrites the robots.txt output at priority 10, so add yours at priority 11 or later.
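For instance, if you wanted to block crawlers from a hypothetical /private/ path (note that robots.txt has no official noindex directive — Disallow is the standard way to keep crawlers out), the appended lines might look like this:

```
# Block crawling of the private section (hypothetical path).
User-agent: *
Disallow: /private/
```

Since the filter runs after TSF builds its output, these lines would simply appear at the end of the generated robots.txt.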
I hope this helps!