
Bug #1112 Memory Leak in Pear DB
Submitted: 2004-04-01 16:10 UTC
From: bemis at docutronsystems dot com
Assigned: danielc
Status: Bogus
Package: DB
PHP Version: 4.3.4
OS: Fedora Core 1
Roadmaps: (Not assigned)    


 [2004-04-01 16:10 UTC] bemis at docutronsystems dot com
Description:
------------
The $res->free() method doesn't work when the result set is empty. I wanted a basic daemon that runs a SELECT every two seconds and performs an action based on the results; most of the time the SELECT returns nothing. I am finding that after two days of running this piece of my daemon, my computer is hanging. I could rewrite this in C, but my whole project is in PHP, so I wanted to use PHP in case the project ever needs to be passed on.

Reproduce code:
---------------
#!/usr/bin/php -q
<?php
include 'DB.php';

$db_object = DB::connect('mysql://root:root@localhost/client_files', TRUE);
$db_object->setFetchMode(DB_FETCHMODE_ASSOC);

while (true) {
    usleep(500000); // don't go crazy
    $res = $db_object->query(
        "select doc_id from this_should_work where doc_id=2"
    );
    $row = $res->fetchRow(); // this is a dumbed-down version, but it shows the memleak
    $res->free();
    unset($row);
    unset($res);
    echo memory_get_usage() . "\n";
}
?>

Expected result:
----------------
I expect that the memory doesn't increase, so I'd like to see the script output:

502608
502608
502608
502608
502608
502608
502608
502608
502608
502608
502608
502608
502608

Actual result:
--------------
What I actually see is the script's memory usage increasing at a rate of about 120 bytes per loop:

502608
502784
502904
503024
503144
503264
503384
503504
503656
503776
503896
504016
504136
504256
504376
504496
504680
504800

Comments

 [2004-04-01 16:16 UTC] danielc
Change Category to "DB" from "Bug System."
 [2004-04-01 20:23 UTC] danielc
I can't exactly replicate your test because my version of PHP doesn't have memory_get_usage() enabled. So, I changed the test from an infinite while loop with a sleep to a for loop which runs 10,000 times:

for ($Counter = 0; $Counter < 10000; $Counter++) {
}

I ran the script and looked at the process's memory usage via operating system tools (Task Manager's process list in Windows 2000). By the time I flip windows over to examine the memory usage, it has stabilized at a given level and then stays there until the process ends. A colleague did similarly on a Linux box. One interesting thing he mentioned was "it climbs gradually for about the first 200 iterations and then stops climbing."

So, what happens if you tweak your example to use a for loop like the one above, making the counter go up to 1000 iterations, and change:

echo memory_get_usage() . "\n";

to:

if ($Counter % 50 == 1) {
    echo memory_get_usage() . "\n";
}

Thanks.
 [2004-04-01 20:36 UTC] bemis at docutronsystems dot com
New script, as you requested:

#!/usr/bin/php -q
<?php
include 'DB.php';

$db_object = DB::connect('mysql://root:root@localhost/client_files', TRUE);
$db_object->setFetchMode(DB_FETCHMODE_ASSOC);

for ($i = 0; $i < 1000; $i++) {
    $res = $db_object->query(
        "select doc_id from this_should_work where doc_id=2"
    );
    $row = $res->fetchRow(); // this is a dumbed-down version, but it shows the memleak
    $res->free();
    unset($row);
    unset($res);
    if ($i % 50 == 1) {
        echo memory_get_usage() . "\n";
    }
}
?>

New results of that script:

[root@death poll]# ./try.php
503456
509680
515936
522448
528448
534448
538464
541264
544064
546864
549664
554512
557312
560112
562912
565712
568512
571312
574112
576912

-matt
 [2004-04-01 20:44 UTC] bemis at docutronsystems dot com
I upgraded with "pear upgrade DB" and got the following output:

[root@death poll]# pear upgrade DB
downloading DB-1.6.1.tgz ...
Starting to download DB-1.6.1.tgz (89,932 bytes)
.....................done: 89,932 bytes
upgrade ok: DB 1.6.1

Then I ran it again:

[root@death poll]# ./try.php
521288
524488
527688
530888
534088
537288
537480
537480
537480
537480
537480
537480
537480
537480
537480
537480
537480
537480
537480
537480

I run yum upgrade as often as I see available upgrades; I don't think it upgraded PEAR, though. I am sorry for any problems.

-matt
 [2004-04-01 20:51 UTC] danielc
And what happens if you assign by reference? E.g., changing:

$res = $db_object->query...
$row = $res->fetchRow();

to:

$res =& $db_object->query...
$row =& $res->fetchRow();

And what happens if you call the MySQL functions directly?

$link = mysql_connect('localhost', 'user', 'pw');
$db = mysql_select_db('name', $link);
for ($i = 0; $i < 1000; $i++) {
    $res = mysql_query('select doc_id from '
         . 'this_should_work where doc_id=2', $link);
    $row = mysql_fetch_assoc($res);
    mysql_free_result($res);
    unset($row);
    unset($res);
    if ($i % 50 == 1) {
        echo "$i " . memory_get_usage() . "\n";
    }
}
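(For reference, a minimal sketch of the earlier test loop with the reference assignments applied. The connection string, table, and column are copied from the reporter's script; only the =& assignments are new, and this is just the suggested experiment, not a verified fix.)

#!/usr/bin/php -q
<?php
// Same test loop as before, but with the query result and the fetched
// row assigned by reference (PHP 4 style), as suggested above.
include 'DB.php';

$db_object = DB::connect('mysql://root:root@localhost/client_files', TRUE);
$db_object->setFetchMode(DB_FETCHMODE_ASSOC);

for ($i = 0; $i < 1000; $i++) {
    $res =& $db_object->query(
        "select doc_id from this_should_work where doc_id=2"
    );
    $row =& $res->fetchRow();
    $res->free();
    unset($row);
    unset($res);
    if ($i % 50 == 1) {
        // Print memory usage every 50th iteration.
        echo memory_get_usage() . "\n";
    }
}
?>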
 [2004-04-01 20:53 UTC] danielc
OH! Thank you for thinking of upgrading DB and letting us know that doing so fixed the problem.
 [2004-04-01 20:56 UTC] bemis at docutronsystems dot com
After upgrading, it does not seem to be a problem any more. I cranked the for loop up to 100,000 iterations and the memory size stayed the same after the 5th iteration.
 [2004-04-01 21:00 UTC] danielc
Flip status back to bogus.