
Bug #2110 long delay until "Save as..." popup
Submitted:   2004-08-11 05:51 UTC
From:        matthewh at meta-bit dot com
Assigned:    mike
Status:      Closed
Package:     HTTP_Download
PHP Version: 4.3.6
OS:          WinXP
Roadmaps:    (Not assigned)


 [2004-08-11 05:51 UTC] matthewh at meta-bit dot com
Description:
------------
We're currently writing a secure web download application with MVC-Phrame and PEAR::HTTP_Download. I've written some working code that receives a download request POSTed to a Phrame Action, then initiates an HTTP_Download-based response. This code works fine for various MIME types [MIME Magic is used along with the HTTP Content-Disposition header etc.], but for larger files - 16 MB to 50 MB - the download fails. Tests within our development environment show that these large files are being loaded into the server's runtime/RAM memory heap prior to download. We've selected the 'Attachment' download setting within HTTP_Download.

Within the development environment we run PHP 4.3.6 as an Apache SAPI module. The production CGI-based environment exhibits similar problems. The production environment has the following PHP 4 settings:

Directive          Local Value   Master Value
implicit_flush     Off           Off
output_buffering   no value      no value
output_handler     no value      no value

Development has: flush on, buffering at 4096, no special handler.

Should the download request be:
- processed as an HTTP GET (via a client-side redirect), or
- processed within an HTTP POST but with a custom output buffer handler, or
- processed with a mocked-up multi-body-part HTTP header to force splitting of the download file into smaller chunks?

[We can't alter the php.ini output_buffering setting in production.] Any suggestions for a workaround?

Reproduce code:
---------------
Examination of the HTTP_Download::send() code shows it uses PHP output buffering wrapped around the HTTP body response:

-----PEAR::HTTP::Download excerpt start------------
function send()
{
    .....
    ob_start();
    .....
    $this->sendChunks($chunks)) {
        ob_end_clean();
    .....
    $this->sendHeaders();
}
-----PEAR::HTTP::Download excerpt end--------------

Expected result:
----------------
The POST HTTP response downloads the file without out-of-memory problems, 'streaming' the file to the client.
Actual result:
--------------
The following error is logged within the web server error log:

----error.log-start----
[Wed Aug 11 01:25:04 2004] [error] [client 203.23.236.66] Premature end of script headers: /u/httpd/cgi-bin/php4.cgi.
----error.log-end----

Basically, the HTTP response is started, and then the connection is prematurely terminated [most likely due to an out-of-memory error].
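The workaround the report asks about can be illustrated with a minimal sketch of chunked streaming: instead of letting an output buffer accumulate the whole file, read and emit it in fixed-size pieces. The function name `stream_file_chunked` and the 8 KB chunk size below are illustrative choices, not part of HTTP_Download:

```php
<?php
// A minimal sketch of chunked streaming: only one buffer's worth of the
// file is held in memory at any time, regardless of the file's size.
// Function name and chunk size are illustrative, not part of HTTP_Download.
function stream_file_chunked($path, $chunkSize = 8192)
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $chunkSize);   // at most $chunkSize bytes in RAM
        flush();                       // push the chunk out to the client
    }
    fclose($fp);
    return true;
}
```

This only avoids the memory spike if no PHP output buffer is wrapped around the loop - which is exactly what the send() excerpt above does, hence the failure.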

Comments

 [2004-08-11 10:53 UTC] mike
Please try this patch:

--- e:\MyTemp\Tortoise1588.rev.1.38-Download.php	2004-08-05 15:15:04.000000000 +0200
+++ W:\WWW\mike\pear\HTTP_Download\Download.php	2004-08-11 12:50:20.171875000 +0200
@@ -187,6 +187,14 @@
      * @var string
      */
     var $etag = '';
+
+    /**
+     * Headers Sent
+     *
+     * @access protected
+     * @var bool
+     */
+    var $headersSent = false;
 
     // }}}
     // {{{ constructor
@@ -567,7 +575,9 @@
             return $e;
         }
 
-        $this->sendHeaders();
+        if (!$this->headersSent) {
+            $this->sendHeaders();
+        }
 
         return true;
     }
@@ -767,6 +777,13 @@
                 "Content-Range: $range\n\n";
         } elseif ($this->isRangeRequest()) {
             $this->headers['Content-Range'] = $range;
+        } else {
+            // temp hack to disable output buffering
+            // if we send the whole file - avoiding too
+            // much memory consumption with big files
+            $this->sendHeaders();
+            $this->headersSent = true;
+            ob_end_flush();
         }
 
         if ($this->data) {
 [2004-08-12 04:44 UTC] matthewh at meta-bit dot com
Mike,

We've incorporated the various patches as per your suggestion, but in examining the system runtime heap it's obvious that the whole of the data file is still being loaded into memory, even if output buffering is turned off whilst transferring the HTTP body. Indeed, in attempting a download of, for instance, a 50 MB file, the system memory jumps 50 MB during the transfer. Whilst this works okay on our development platform, the production system still croaks. I suspect the following line is to blame:

......in sendChunk().....
fseek($this->handle, $offset);
echo fread($this->handle, $length); // reads 50 MB into a string in RAM !!
....................................

I'd like to suggest adding a specific method for managing the download of a file through use of a file buffer. I've attached my attempt at this method, plus the patch within the existing HTTP_Download::sendChunk() method to utilise the new _sendFile() method:

---start--Modified HTTP_Download.php ------------------
// 16K transfer file buffer per connection
define('HTTP_DOWNLOAD_DEFAULT_FILE_BUFFER_SIZE', 16 * 1024);
.....
function sendChunk($chunk, $cType = null, $bound = null)
{
    ...
    if (!$this->handle) {
        $this->handle = fopen($this->file, 'rb'); // btw - where is the fclose() ?
    }
    // MiH added...
    $this->_sendFile($this->handle, $offset, $length);

    return true;
} // sendChunk()
-----------------------------------------
/**
 * Send a file (or portion thereof) to the browser.
 *
 * Doesn't send raw 'data' resources - just handles the file scenario.
 *
 * Uses a file buffer (by default of size HTTP_DOWNLOAD_DEFAULT_FILE_BUFFER_SIZE)
 * for transfers so that large files don't waste server resources.
 *
 * Avoids out-of-memory errors for large (50 MB) sized transfers.
 *
 * Also checks the connection status to see if the client has cancelled the
 * connection, and aborts the transfer if so.
 *
 * @param resource $aHandle      a file stream handle (eg, from fopen)
 * @param int      $aStartOffset byte offset within the file to start reading from (0 = start)
 * @param int      $aTotalBytes  number of bytes to transfer
 * @param int      $aBufferSize  (optional)
 * @return bool true on success - all bytes transferred;
 *              false on error - client aborted connection, file I/O error etc.
 * @access private
 */
function _sendFile($aHandle, $aStartOffset, $aTotalBytes,
                   $aBufferSize = HTTP_DOWNLOAD_DEFAULT_FILE_BUFFER_SIZE)
{
    $retVal = false;
    $bytesSent = 0;
    $bytesRemaining = $aTotalBytes;

    fseek($aHandle, $aStartOffset);
    while (!feof($aHandle) && ($bytesRemaining > 0) && (connection_status() == 0)) {
        if ($bytesRemaining < $aBufferSize) {
            $transferSize = $bytesRemaining;
        } else {
            $transferSize = $aBufferSize;
        }
        echo fread($aHandle, $transferSize);
        flush();
        $bytesSent += $transferSize;
        $bytesRemaining = $aTotalBytes - $bytesSent;
    }
    if ($bytesSent >= $aTotalBytes) {
        $retVal = true;
    }
    return $retVal;
} // _sendFile()
---------------end suggested updates--------------------

The code is based on some of the suggestions within the discussion section of the online manual for fread() at http://www.php.net. Preliminary testing shows there is hardly any memory heap fluctuation in XP for large transfers. Could you vet this code and consider its inclusion?

Rgds, MiH
 [2004-08-12 10:55 UTC] mike
Please grab the latest CVS version and check whether that fixes your problem. Thanks.
 [2004-08-25 13:35 UTC] mike
No feedback was provided. The bug is being suspended because we assume that you are no longer experiencing the problem. If this is not the case and you are able to provide the information that was requested earlier, please do so and change the status of the bug back to "Open". Thank you.
 [2004-08-26 06:50 UTC] matthewh at meta-bit dot com
[Apologies for the delay in further feedback, re: your HTTP_Download [v1.42] patch on CVS.]

It seems that there are indeed still some problems with the Download module. Whilst we can now get downloads to complete with the new patch, there is a delay in the initial response of the server that varies depending upon the size of the file being transmitted:

File Size   Delay until [Save As] appears
4.9 MB      8 seconds
8 MB        31 seconds
16 MB       122 seconds
65 MB       Did not respond - Internal Error

Indeed, examination with a packet sniffer shows that, for the smallest of the above files, no data is received by the browser for 6-8 seconds. I inserted some PEAR logging within HTTP_Download v1.42:

===========
=
= PEAR::HTTP Download operations start.
=
===========
Aug 26 02:17:38 HTTP_Download [notice] send(): Chunks list init-ed: array (0 => array (0 => 0, 1 => 4970417))
Aug 26 02:17:38 HTTP_Download [notice] setChunks(): sending as a single chunk..
Aug 26 02:17:38 HTTP_Download [notice] sendHeaders(): About to send HTTP headers: array (
    'Content-Type' => 'application/x-octetstream',
    'Accept-Ranges' => 'bytes',
    'Connection' => 'close',
    'X-Sent-By' => 'PEAR::HTTP::Download',
    'Content-Disposition' => 'attachment; filename="large-archive.zip"',
    'Content-Length' => 4970417,
)
Aug 26 02:17:38 HTTP_Download [notice] sendHeaders(): HTTP headers sent to output buffer
Aug 26 02:17:38 HTTP_Download [notice] sendChunk(): sent 1st block ok. Bytes sent so far:2097152
Aug 26 02:17:39 HTTP_Download [notice] sendChunk(): sent 2th block ok. Bytes sent so far:4194304
Aug 26 02:17:40 HTTP_Download [notice] sendChunk(): sending last block.
Aug 26 02:17:40 HTTP_Download [notice] sendChunk(): sent whole chunk ok.
Aug 26 02:17:40 HTTP_Download [notice] send(): sent ok. About to close output buffer. Time now:1093501060
Aug 26 02:17:40 HTTP_Download [notice] send(): Output buffer closed. Time now:1093501060
Done.
Aug 26 02:17:40 Action [info] DownloadContentDataAction:execute(): Transferred file large-archive.zip contentDataID:2 to user:guest ok
Aug 26 02:17:40 Action [debug] DownloadContentDataAction:DownloadContentDataAction, forwarding to:false
===================================

The above server-side log shows that although the transfer is completed almost immediately by the server [within 2 seconds or so], the actual response isn't received until a further 6 seconds. I don't think this is network latency, as the delay increases significantly for larger files - hinting at a buffering problem.

Our relevant deployment php.ini settings are:

Directive            Local Value   Master Value
ignore_user_abort    Off           Off
implicit_flush       Off           Off
output_buffering     no value      no value
output_handler       no value      no value
max_execution_time   30            30
max_input_time       -1            -1

Rgds, Matthew
 [2004-08-26 07:14 UTC] matthewh at meta-bit dot com
Mike,

After further testing, it seems there was indeed an output buffering problem within the v1.42 patch of HTTP_Download.php.

I initially tried using ob_flush() within the sendChunk() loop to force the output of the body data. This made no difference, as it was the header that was being delayed in its transmission to the client. [Reception of the HTTP header is what triggers the Save As... file dialog.]

I subsequently placed an ob_end_flush() call after the sendHeaders() call within sendChunk(). This forces the headers to be sent to the client, and [apparently] turns off output buffering whilst the body data is sent. This code is based on the example at the end of the PHP manual entry for ob_implicit_flush(): http://au.php.net/manual/en/function.ob-implicit-flush.php

Being a PHP neophyte, you may want to check the logic within my solution, namely:

- Should a loop of ob_end_flush() invocations be used to clear any output buffers enclosing the sendChunk() main loop, a la http://au.php.net/manual/en/function.ob-end-flush.php ?
- Should the ob_end_flush() call actually be at the end of the HTTP_Download::sendHeaders() method?
- I've currently placed an error-suppressing '@' character in front of the ob_flush() call within sendChunk(). I believe there is a performance drain associated with doing this, and I'm not even sure it's needed.
- At present I've added a sleep(1) within the sendChunk() loop - mainly for debugging purposes, so you can see the flow of packets coming through on a development box. This has the side effect of limiting the download bandwidth to a maximum of a single data buffer per second [ie, with the 2 MB buffer, 2 MB per second]. This has the nice side effect that people could easily throttle/limit the download rate by altering the buffer size via your HTTP_Download::setBufferSize() method, eg:

// Limit download rate to 512 kbits per second for a single connection
HTTP_Download::setBufferSize(512 * 1024 / 8);
....send()

Some other interesting notes:

- IE6 apparently has several problems receiving downloads over SSL connections if cache headers are turned on. [See the discussion on PHP's fpassthru() manual page: http://au.php.net/manual/en/function.fpassthru.php ]
- Mozilla Firefox v0.93+ downloads the file in a background thread whilst waiting for the user to enter the filename within the Save As... file dialog box. Poor old dumb IE6SP1 stops listening on the socket until you actually select a filename, confirm overwrites etc.

Matt
Sydney, Australia.

[Mike - I'll e-mail you my patches directly for evaluation. Thanks for your help with all this.]
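The throttling idea in that last bullet - one buffer per second, so the buffer size becomes the rate cap - can be sketched standalone. The function name `send_throttled` below is illustrative and not part of HTTP_Download:

```php
<?php
// A sketch of the throttle idea: emitting one buffer of $bytesPerSecond
// bytes, then sleeping one second, caps the transfer rate at roughly
// $bytesPerSecond bytes/sec per connection. Function name is illustrative.
function send_throttled($path, $bytesPerSecond)
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $bytesPerSecond);  // one buffer...
        flush();
        if (!feof($fp)) {
            sleep(1);                      // ...per second
        }
    }
    fclose($fp);
    return true;
}

// eg, cap a single connection at 512 kbit/s (64 KB per second):
// send_throttled('/path/to/large-archive.zip', 512 * 1024 / 8);
```

As with any sleep-based throttle, the cap is approximate: read and network time add to the one-second interval, so the real rate is slightly lower.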
 [2004-08-26 08:54 UTC] mike
Hi Matthew, we cannot call ob_end_flush() as this will terminate gzip encoding. Please see my reply to your mail. But the throttle thingy might be interesting to implement though :) Thanks for all your efforts!
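The gzip conflict Mike mentions can be made concrete: if ob_gzhandler is among the active output handlers, ending its buffer finalises the gzip stream, and anything echoed afterwards is no longer part of the encoded response. A hedged sketch of a safer drain (the helper name `end_plain_buffers` and its `$floor` parameter are illustrative, not HTTP_Download API):

```php
<?php
// A sketch: drop output buffers down to level $floor, but stop rather
// than end a buffer stack that includes ob_gzhandler, since ending it
// would truncate the gzip stream mid-response. Note ob_list_handlers()
// reports the whole handler stack, so this is deliberately conservative.
// Helper name and $floor parameter are illustrative.
function end_plain_buffers($floor = 0)
{
    while (ob_get_level() > $floor) {
        if (in_array('ob_gzhandler', ob_list_handlers(), true)) {
            break;   // leave the compressing buffer intact
        }
        ob_end_flush();
    }
}
```

This is why a blanket loop of ob_end_flush() calls, as floated in the previous comment, is unsafe whenever the response may be gzip-encoded.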
 [2004-08-31 10:28 UTC] mike
Suspended & changed title.
 [2005-01-14 10:45 UTC] mike
This bug has been fixed in CVS. In case this was a documentation problem, the fix will show up at the end of next Sunday (CET) on pear.php.net. In case this was a pear.php.net website problem, the change will show up on the website in short time. Thank you for the report, and for helping us make PEAR better.