• I’m making a plugin that extracts all the post data from a WordPress blog when the user activates the plugin, and sends it to my database API. The code I’m using is as follows:

    $allposts = get_posts( $args );
    echo "TotalLength: " . count( $allposts ) . ";";
    if ( $allposts )
    {
        $str  = array(); // current batch of serialized post data
        $cnt  = 0;       // posts collected in the current batch
        $cnt1 = 0;       // number of batches sent so far
        foreach ( $allposts as $post )
        {
            $cnt = $cnt + 1;
            $str[] = post_detail( $post );
            if ( $cnt == 10 )
            {
                // send the full batch of 10 posts, then start a new batch
                $cnt1 = $cnt1 + 1;
                $jsonl = json_encode( $str );
                echo '<script type="text/javascript">hit_pd_server(' . $jsonl . ');</script>';
                $cnt = 0;
                $str = array();
            }
        }
        if ( $cnt > 0 )
        {
            // send whatever is left over as a final partial batch
            $jsonl = json_encode( $str );
            echo '<script type="text/javascript">hit_pd_server(' . $jsonl . ');</script>';
        }
    }

    I checked other related-post plugins and found that some of them use this method. However, my concern is that if a blog contains tens of thousands of posts, calling get_posts() like this could affect their site. I can handle the traffic on my API, but I’m not sure their blog can handle the load that get_posts() creates.
    Also, what are the alternatives to get_posts() that would let me fetch the data from the client’s site one post at a time, or in batches, instead of fetching everything at once (a rough sketch of what I mean follows below)?
    Any best practices for building related-post-type plugins would also be appreciated.
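    Here is that sketch; send_batch_to_api() is just a placeholder for the call to my API, 50 is an arbitrary batch size, and I’m not sure whether looping like this inside a single request actually helps:

        $batch_size = 50;
        $offset     = 0;

        do {
            // fetch the next batch of posts instead of everything at once
            $batch = get_posts( array(
                'post_type'   => 'post',
                'post_status' => 'publish',
                'numberposts' => $batch_size,
                'offset'      => $offset,
                'orderby'     => 'ID',
                'order'       => 'ASC',
            ) );

            if ( ! empty( $batch ) ) {
                send_batch_to_api( $batch ); // placeholder for the API call
            }

            $offset += $batch_size;
        } while ( count( $batch ) === $batch_size );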

  • If you want to restrict the amount of data fetched per request (and you should), set up arguments to pass in the page number to retrieve, and pass that into a new WP_Query as an offset.
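
    A rough sketch of that idea, where the pd_page parameter name, the per-page size of 100, and the process_post() helper are all placeholders:

        // Fetch only one page of posts per request instead of the whole blog.
        $page     = isset( $_GET['pd_page'] ) ? max( 1, absint( $_GET['pd_page'] ) ) : 1;
        $per_page = 100;

        $query = new WP_Query( array(
            'post_type'      => 'post',
            'post_status'    => 'publish',
            'posts_per_page' => $per_page,
            'offset'         => ( $page - 1 ) * $per_page, // or 'paged' => $page
            'orderby'        => 'ID',
            'order'          => 'ASC',
        ) );

        foreach ( $query->posts as $post ) {
            process_post( $post ); // placeholder: serialize and send this post
        }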

  • The topic ‘Does using get_posts() to fetch all the post data affect the host site?’ is closed to new replies.