The adventure continues…
Some more commits have landed in the repository at https://github.com/danielecr/CurlParallel. The goal is to parallelize a PHP application as much as possible where network I/O is concerned. This is the test class in the repo:
require_once "sender.class.php";

class TestCurl implements iSenderConsumer {
    private $url_list = array();
    private $sender;

    public function __construct(Sender $sender) {
        // read urls from a file, one by one
        $this->readUrls();
        $this->sender = $sender;
        foreach ($this->url_list as $url) {
            if ($url == '') continue;
            print "$url enqueued\n";
            $curlo = $this->sender->addRecipient($url, $this);
            //print_r($curlo);
            // options could be set on $curlo here, but none are needed
            //unset($curlo);
        }
    }

    public function readUrls() {
        $c = file_get_contents('urllist.url');
        //print $c;
        $this->url_list = explode("\n", $c);
    }

    public function consumeCurlResponse(HttpResponse $object, Curl $curlo = NULL) {
        // I just want to know if all goes right
        print date('c') . " - " . $object->header_first_row . ' - '
            . $object->getResponseCode() . " with a content of length: "
            . strlen($object->content) . " requested url: " . $curlo->getUrl() . "\n";
        if ($object->getResponseCode() != 200) {
            print $object->content;
            print $object->raw_headers;
        }
    }
}

$sender = new Sender();
$tc = new TestCurl($sender);
$sender->execute();
sleep(10);
It has to be integrated into an object-oriented project, so it should be a pluggable pattern that is easy to drop in. It could be used with POST too. Setting the user agent is another option along the same lines…
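As a rough sketch of the POST and user-agent idea: the consumer could set extra cURL options on the object returned by addRecipient(). This assumes, hypothetically, that the Curl wrapper exposes a setOption() method forwarding to curl_setopt(); that method name and the endpoint URL are illustrative, not part of the actual CurlParallel API.

```php
<?php
// Hypothetical sketch: enqueue a POST request with a custom user agent.
// setOption() is assumed to forward to curl_setopt(); it is NOT a
// documented part of CurlParallel -- only the CURLOPT_* constants and
// http_build_query() are standard PHP.
require_once "sender.class.php";

class PostTest implements iSenderConsumer {
    public function __construct(Sender $sender) {
        $curlo = $sender->addRecipient('http://example.com/endpoint', $this);
        $curlo->setOption(CURLOPT_POST, true);
        $curlo->setOption(CURLOPT_POSTFIELDS, http_build_query(array('key' => 'value')));
        $curlo->setOption(CURLOPT_USERAGENT, 'CurlParallel-test/0.1');
    }

    public function consumeCurlResponse(HttpResponse $object, Curl $curlo = NULL) {
        print $object->getResponseCode() . " from " . $curlo->getUrl() . "\n";
    }
}

$sender = new Sender();
new PostTest($sender);
$sender->execute();
```

The nice part of the pattern is that the POST-specific details stay inside the consumer class, so the Sender loop does not change at all.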