Support » Plugin: WordPress HTTPS (SSL) » Very slow on some sites. Multisite? Large number of URLs to parse?

  • I’ve had a few sites on my network that have been very slow, and I’ve determined that deactivating WordPress HTTPS makes them fast again. I’ve done some debugging and here’s what I’ve found:

    * It’s generating several hundred queries that look like “SELECT domain FROM wp_blogs” — to me this looks like a query with a few pieces missing.

    * It doesn’t appear to relate directly to the number of sites on the network. I’m seeing similar behavior on a network with ~40 sites and one with ~400 sites.

    * It DOES appear to correlate with the number of URLs on the page.

    * It impacts all pages on a site, not just those which are set to force SSL.

    * It slows the page build time down from a fraction of a second to anywhere from 5 to 10 seconds.

    Anyone else experienced similar problems? Any suggestions for how to fix it?

    My best guess: getLocalDomains is being run hundreds of times, executing this line:
    $multisite_hosts = $wpdb->get_col($wpdb->prepare("SELECT domain FROM " . $wpdb->blogs, NULL));

    Is the plugin checking every URL on the page? If so, could that query be done once and stored as a private variable in the class?

    Thanks for any insight!

Viewing 5 replies - 1 through 5 (of 5 total)
  • I’ve spent a few more minutes looking at the plugin. I’ve modified the class WordPressHTTPS in lib/WordPressHTTPS.php in the following ways and seen significant performance gains:

    1. Declare a protected property $multisite_hosts inside the class
    2. Modify getLocalDomains() along the lines of the following:
    public function getLocalDomains() {
    	global $wpdb;
    	$hosts = array();
    	if ( is_multisite() && is_subdomain_install() ) {
    		// Only hit the database the first time; reuse the cached result after that.
    		if ( !isset($this->multisite_hosts) ) {
    			$this->multisite_hosts = $wpdb->get_col($wpdb->prepare("SELECT domain FROM " . $wpdb->blogs, NULL));
    		}
    		$hosts = array_merge($hosts, $this->multisite_hosts);
    	}
    	return $hosts;
    }

    Having done that, I shaved a few hundred seemingly-unnecessary queries and about 5 seconds from my average page load. There’s almost certainly an even better way to do this, but that’s what I came up with at a glance.
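    For what it's worth, the same idea could be pushed one step further by caching the host list across requests with the WordPress object cache, so even the first call on a page load can skip the query when a persistent cache backend is in place. This is just a sketch of that alternative, not part of the plugin; the cache key and group names here are made up for illustration:

    ```php
    public function getLocalDomains() {
    	global $wpdb;
    	$hosts = array();
    	if ( is_multisite() && is_subdomain_install() ) {
    		// wp_cache_get() reads from the in-memory cache by default, or from a
    		// persistent backend (e.g. Memcached) if an object-cache drop-in is installed.
    		$multisite_hosts = wp_cache_get( 'multisite_hosts', 'wordpress-https' );
    		if ( false === $multisite_hosts ) {
    			$multisite_hosts = $wpdb->get_col( "SELECT domain FROM " . $wpdb->blogs );
    			wp_cache_set( 'multisite_hosts', $multisite_hosts, 'wordpress-https' );
    		}
    		$hosts = array_merge( $hosts, $multisite_hosts );
    	}
    	return $hosts;
    }
    ```

    With only the default non-persistent cache this behaves like the property-based fix above (one query per request); the cache would need explicit invalidation if sites are added to the network mid-request, which is rare enough in practice.
    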

    I didn’t test your patch, but I noticed the same issue. I wonder why it’s not fixed yet…

    Code contribution can be done over at and I noticed a pull request regarding multisite at but it’s not merged (yet?) …

    The plugin author went to merge my pull request, but there were merge conflicts. I redid my commits and created a new pull request. It should be merged in soon.

    @jonathan – does your pull request indeed cover this thread’s topic of inefficient domain parsing?

  • The topic ‘Very slow on some sites. Multisite? Large number of URLs to parse?’ is closed to new replies.