suicidal multisite database queries

  • It has come to my attention that querying the multisite database to get information from all its sites is not a good thing if you have a lot of sites.

    Plugins that do things such as list all the sites in a multisite, or publish all recent comments, rely on the db-querying function get_blog_list, which has been deprecated for its “suicidal” tendencies.

    I have read about the transients api, but don’t understand how to use it, and apparently there is some question as to whether that even would work in this case.

    I need more information about the advisability of and alternatives to querying the database if you have a large number of sites. Please kick this over to the WP-Advanced forum if necessary.

Viewing 15 replies - 1 through 15 (of 18 total)
  • Ron Rennick


    MultiSite Guru

    “Plugins that do things such as list all the sites in a multisite, or publish all recent comments, rely on the db-querying function get_blog_list, which has been deprecated for its “suicidal” tendencies.”

    Try sticking the word Some in front of that sentence:

    Some plugins that do things such as…

    Write your plugin so it updates a global table, transient cache, etc. as appropriate whenever relevant data in the network changes. WordPress has hooks on virtually every database change that is made on individual sites.
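    As a sketch of that advice (the function name `my_refresh_site_list` and the option name `my_network_site_list` are assumptions for illustration; `wpmu_new_blog`, `get_blog_details`, and the `*_site_option` functions are real WordPress APIs):

    ```php
    // Hypothetical sketch: keep a network-wide option up to date whenever a
    // new site is created, instead of querying every site on every page view.
    function my_refresh_site_list( $blog_id ) {
    	$details = get_blog_details( $blog_id );
    	$list    = get_site_option( 'my_network_site_list', array() );
    	$list[ $blog_id ] = array(
    		'name' => $details->blogname,
    		'url'  => $details->siteurl,
    	);
    	update_site_option( 'my_network_site_list', $list );
    }
    add_action( 'wpmu_new_blog', 'my_refresh_site_list' );
    ```

    Similar handlers hooked to site-deletion and title-change events would keep the stored list accurate without ever re-scanning the whole network.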

    ok… let’s take this from the top. I need help caching — if that is the best solution — a list of, let’s say, 1000 blogs. What do I need to wrap this in? Specifically, what is the syntax?

    // creates an alphabetical multisite list by blog title
    function mymultilist() {
    	global $wpdb;
    	$query = "SELECT blog_id FROM " . $wpdb->base_prefix . "blogs WHERE spam != '1' AND archived != '1' AND deleted != '1' AND public = '1'";
    	$allsites = $wpdb->get_results($query);
    	$sitelist = array();
    	foreach ($allsites as $thissite) {
    		$alldetails = get_blog_details($thissite->blog_id);
    		// here is everything contained in the blog details: blog_id, site_id, domain, path, registered, last_updated, public, archived, mature, spam, deleted, lang_id, blogname, siteurl, post_count
    		// but we only need these two for our current purposes
    		$thesedetails = array(
    			'thispath' => $alldetails->siteurl,
    			'thisname' => $alldetails->blogname,
    		);
    		array_push($sitelist, $thesedetails);
    	}
    	foreach ($sitelist as $key => $row) {
    		$thispath[$key] = $row['thispath'];
    		$thisname[$key] = strtolower($row['thisname']);
    	}
    	array_multisort($thisname, SORT_ASC, $sitelist);
    	foreach ($sitelist as $key => $value) {
    		echo '<li><a href="' . $sitelist[$key]['thispath'] . '">' . $sitelist[$key]['thisname'] . '</a></li>';
    	}
    }
    Ron Rennick


    MultiSite Guru

    First, you need to decide whether a cache or global table is appropriate depending on the amount of data you are going to be storing.

    If there were 1000 sites/blogs, I would not use any of that code. Look at the get_blog_details function and see where the data is being retrieved from.

    Do you mean the wp_cache_get? Clearly I am in over my head…. Does this further the cause at all?

    $allsites = wp_cache_get('cachedresults');
    if (false == $allsites) {
    	$allsites = $wpdb->get_results($query);
    	wp_cache_set('cachedresults', $allsites);
    }

    Sorry to be so remedial here, but starting at the very beginning…

    1. When the dude who deprecated get_blog_list called the function “suicidal,” what did he mean? That it would choke a web page? make the server cranky? hose the database? cause the world to end before 2012? What are the implications of using it?

    2. If I were the kind of person who does not lose sleep over using deprecated functions, is get_blog_list perfectly acceptable for listing, say, up to 100 sites?

    3. If I were the kind of person who does lose sleep over using deprecated functions, is writing my own function to mimic get_blog_list perfectly acceptable for listing up to 100 sites? Indeed, is it the most efficient way?

    4. Or should we all be caching, making global tables, whatever, for anything from a multisite list to multisite recent posts? Is that the most efficient way, or is it overkill for most multisites?

    2 – yes, fine for a smaller list.

    4 – overkill for most.

    Thanks! So I can see that this caching/global table thing is going to get hairy and it’s probably not something I will be able to achieve. But as a starting point, referencing Ron R above…

    First, you need to decide whether a cache or global table is appropriate depending on the amount of data you are going to be storing.

    …what approach would you take if you wanted an alphabetical (by blog title) list of 1000 blogs, and only that (not posts or comments)?

    Caching with transients is easy. Just get the transient at the beginning of your function, and if it’s not false, return the value of it.

    If it is false, then do your processing, stick it in the transient, and return the data.


    function whatever() {
    	$value = get_transient('my_transient');
    	if ($value !== false) return $value;
    	// code here to do your long calculations
    	// result should be put in $value
    	set_transient('my_transient', $value, 24*60*60); // 24 hour cache
    	return $value;
    }

    Basically, you will only be doing your heavy calculation once every 24 hours that way.

    The wp_cache stuff is not the way to go because it’s not (necessarily) persistent across hits. For caching across multiple page views, use a transient.

    Cool, thanks so much. If I wanted to do my heavy calculation when a new blog is created (rather than every 24 hours) I would do this?

    function wpmu_new_blog_delete_transient() {
    	delete_transient('my_transient');
    }
    add_action('wpmu_new_blog', 'wpmu_new_blog_delete_transient');

    Guess I would also need one for delete_blog and newblogname (if that is what is used when the user changes their blog title).

    (Though the blogs get added occasionally, deleted seldomly, and undergo a title change indeterminately. That’s not very transient, is it?)

    Transients expire at the time given (after which point they return false) or you can manually delete them.

    If you want to do something that is event-driven instead of time-driven, then you shouldn’t use a transient at all. Use a normal option instead (update_option, get_option, etc.). Transients are basically options with time limits.
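    A minimal sketch of that event-driven pattern (the option name `my_site_list` and the function names are assumptions; `get_site_option`, `update_site_option`, `delete_site_option`, and the `wpmu_new_blog`/`delete_blog` hooks are real WordPress APIs):

    ```php
    // Cache in a plain network option: it never expires on its own and is
    // rebuilt only after we delete it in response to an event.
    function my_get_site_list() {
    	$list = get_site_option( 'my_site_list', false );
    	if ( false === $list ) {
    		$list = array(); // ...expensive rebuild of the site list goes here...
    		update_site_option( 'my_site_list', $list );
    	}
    	return $list;
    }

    // Invalidate the cached list whenever a site is added or removed.
    function my_flush_site_list() {
    	delete_site_option( 'my_site_list' );
    }
    add_action( 'wpmu_new_blog', 'my_flush_site_list' );
    add_action( 'delete_blog', 'my_flush_site_list' );
    ```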

    That is what was dawning on me…

    So anyway, between my meager php skills and non-existent database knowledge I am at a loss as to how to solve this problem. My code above is apparently not up to snuff, but I don’t know why and — even if I did — how I would make it better (other than caching its result).

    So I’m giving up for the time being. If anyone would like to take on this challenge, here it is:

    Taking the above discussion into account, create a viable function to generate an alphabetical list of sites in a multisite network. It must be efficient for 100+ (up to 1000) list items….

    Alphabetical by blog title, not by URL. 🙂

    That’s the kicker, because it’s stored elsewhere.

    Indeed!! blogname is not in that table. So I went in via blog_id to blog_details and found blogname there. From my own very limited perspective, my code works splendidly, but I hear it’s a road hog. I’m unclear whether my code just plain sucks, or just sucks for retrieving over 100 list items. If it’s the latter, can I just cache it with set_option? If it’s the former, then SOS! And if needs a global table… well then… the ship is going down!

    I just thought of something… and this is more because I know you’re looking to learn how. 😉

    Go check out the BuddyPress plugin and have a look at the code. You’ll wade through a lot because you aren’t interested in most of it, BUT it *does* keep a list of blogs alphabetized by title. And it keeps it in its own table. 😉
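    Pulling the thread’s advice together, here is one hedged way the challenge above might be met: cache the sorted list in a network option and invalidate it on site changes. The option name `my_sorted_site_list` and function name `my_cached_multilist` are assumptions; the WordPress functions used are real.

    ```php
    // Hypothetical cached version of the alphabetical-by-title list.
    function my_cached_multilist() {
    	global $wpdb;
    	$sitelist = get_site_option( 'my_sorted_site_list', false );
    	if ( false === $sitelist ) {
    		// Heavy work runs only when the cache is empty.
    		$query = "SELECT blog_id FROM {$wpdb->base_prefix}blogs WHERE spam != '1' AND archived != '1' AND deleted != '1' AND public = '1'";
    		$sitelist = array();
    		foreach ( $wpdb->get_results( $query ) as $site ) {
    			$details    = get_blog_details( $site->blog_id );
    			$sitelist[] = array( 'name' => $details->blogname, 'url' => $details->siteurl );
    		}
    		usort( $sitelist, function ( $a, $b ) {
    			return strcasecmp( $a['name'], $b['name'] ); // case-insensitive sort by blog title
    		} );
    		update_site_option( 'my_sorted_site_list', $sitelist );
    	}
    	foreach ( $sitelist as $site ) {
    		echo '<li><a href="' . esc_url( $site['url'] ) . '">' . esc_html( $site['name'] ) . '</a></li>';
    	}
    }
    ```

    Hooking a `delete_site_option( 'my_sorted_site_list' )` call to site creation, deletion, and title-change events (as discussed above) would keep the list fresh without any time-based expiry.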

  • The topic ‘suicidal multisite database queries’ is closed to new replies.