
Bug #2050 Messages get sent multiple times
Submitted: 2004-08-04 09:35 UTC
From: arthur at movenext dot nl Assigned: quipo
Status: Won't fix Package: Mail_Queue
PHP Version: 4.3.8 OS: Linux
Roadmaps: (Not assigned)    
 [2004-08-04 09:35 UTC] arthur at movenext dot nl
Description:
------------
When two scripts use the mail queue to send messages at the same time, some emails are sent twice. This happens when, for example, two scripts initiated by a crontab on different servers try to send messages from the same mail queue. I am using a Postgres database as container and 'mail' as mail driver.

Reproduce code:
---------------
require_once "Mail/Queue.php";

$mail_queue =& new Mail_Queue($db_options, $mail_options);

/* really sending the messages */
$return = $mail_queue->sendMailsInQueue($this->_SETTING['queue']['max_mails']);
print_r("Status: $return");
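The failure mode described above can be modeled abstractly (a minimal sketch in Python for illustration; Mail_Queue itself is PHP): two senders that each read the pending list before either one marks anything as sent will both deliver every message.

```python
# Minimal model of the race: the queue maps message id -> sent flag.
queue = {1: False, 2: False}
deliveries = []

def select_pending(q):
    # Each cron job starts by reading every row whose sent flag is unset
    # (the equivalent of "WHERE sent_time IS NULL").
    return [mid for mid, sent in q.items() if not sent]

def send_all(q, pending, log):
    # ...then delivers each message and only afterwards marks it sent.
    for mid in pending:
        log.append(mid)   # SMTP delivery happens here
        q[mid] = True     # sent flag is set *after* delivery

# Two cron jobs overlap: both read the queue before either finishes sending.
batch_a = select_pending(queue)
batch_b = select_pending(queue)   # sees the same unsent rows as batch_a
send_all(queue, batch_a, deliveries)
send_all(queue, batch_b, deliveries)

assert sorted(deliveries) == [1, 1, 2, 2]  # every message delivered twice
```

Nothing in this read-send-mark sequence is atomic, so any overlap between two runs duplicates whatever both runs saw as pending.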

Comments

 [2005-05-18 20:20 UTC] realmlord at hotmail dot com
I can reproduce this using FreeBSD and a MySQL db.
 [2005-08-19 01:19 UTC] ronin_san at rednoize dot com
I have noticed something similar when using the smtp driver: I get up to 4 mails per recipient. This happens when I use a crontab to send the mails. Enqueueing takes some time when there are hundreds or more mails to send, and that may be the problem: if the "sending" script is called while the enqueueing script is still running, emails are sent roughly once per overlapping run, for as long as the enqueueing takes. It makes no difference whether deleteAfterSend is set to true in the put() method. A workaround is to not run the crontabs at the same time, i.e. run the enqueue script just 4 times a day and the sending script a few minutes before it. As long as the enqueueing takes no longer than 6 hours, it works fine. Maybe I have missed something, but I wasn't able to figure it out.
 [2006-03-03 15:04 UTC] ben at nlcweb dot com (ben)
I am having the same trouble. I too think it is related to my crontab. I have ~2000 emails to send, done in 200 email batches which are sent every 5 minutes. Apparently, if there is an SMTP problem and the script hangs, the crontab runs the script again (after 5 minutes) and it queues up another copy of the email, so some of my recipients are receiving up to 8 copies.
 [2006-03-03 15:20 UTC] pedro dot vera at gmail dot com (Pedro Vera)
We just got burned by this bug. Ours was set to send 2500 messages every 5 minutes. We sent 500 on the first batch, then about 1200 in the second. A bunch of people got three messages instead of one. Argh.
 [2006-03-08 14:14 UTC] benny dot butler at americanfamilyfunds dot com (Benny Butler)
Don't blame PEAR for this one. I have a mailing script that sends out 14k emails in the morning. It just uses mail() to do it, and it has sent up to 30 copies to the same person (always when the server load gets too high). I originally had it going as fast as it could, but that created a lot of dupes; then I throttled it back to do batches, but some people would still get 2-3 copies. Since then I have created another table in MySQL to keep that day's log. I select my email out of one table, then delete it from that table, then check my log table to see if it has already been sent; if it hasn't, I send it, and if it has, I move on. The loop is while(1){mailscript;wait(1)}. This REALLY slows down how fast I can process, especially since I have over a million rows in the log, but no dupes since I did this. There was one other step: I also told it, on each loop, to check the 5-minute server load, and if it's over 4, I set the wait to 20, which keeps the server load down. The biggest cause I've found for the whole thing is DNS queries taking too long.
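A dedup log like the one described above is only race-free if the "check then record" step is itself atomic; otherwise two overlapping runs can both pass the check. One way to get that is a UNIQUE constraint plus a conditional insert, sketched here in Python with sqlite3 for illustration (the commenter used MySQL; the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# UNIQUE(day, email) makes the database itself reject a second send
# of the same address on the same day, even from a concurrent run.
conn.execute("CREATE TABLE send_log (day TEXT, email TEXT, UNIQUE (day, email))")

def try_log(conn, day, email):
    """Atomically record a send; False means this address was already logged today."""
    cur = conn.execute(
        "INSERT OR IGNORE INTO send_log (day, email) VALUES (?, ?)",
        (day, email))
    conn.commit()
    return cur.rowcount == 1  # 1 = freshly inserted, 0 = duplicate ignored

assert try_log(conn, "2006-03-08", "a@example.com") is True   # first send: ok
assert try_log(conn, "2006-03-08", "a@example.com") is False  # duplicate: skip
```

With the constraint in place the caller only sends when try_log() returns True, so the check and the record are one database operation instead of two.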
 [2006-03-09 21:53 UTC] ben dot litton at gmail dot com (Ben Litton)
I may have a solution. I opted for a redundant approach. First, in your cron script, try this:

#!/usr/bin/php
<?php
$filename = '/whatever/path/you/want/mail.lock';
$fp = @fopen($filename, 'x');
if (!$fp) {
    exit; // lockfile exists: another run is in progress
}
// run mail daemon here.
fclose($fp);
unlink($filename);
?>

If the lockfile is absent, the script runs; if it's there, a run is already in progress and this one aborts. That raises the question: what if the cron script dies during the run and the lockfile remains? For that I modified the PEAR code:

1. Add a column called ptime to your table: a datetime, defaulting to NULL.
2. We use db as our container, so I added these to the container db.php file.

In the constructor, ~line 65:

/**
 * @var string one day ago; compared to the current processing time to prevent duplicate e-mails
 */
var $olddate;

/**
 * @var string the timestamp of when this class was initialized; inserted into the database under ptime as the process time
 */
var $ptime;

The _preload() function, ~line 125:

function _preload()
{
    $query = sprintf("SELECT * FROM %s WHERE sent_time IS NULL "
        . "AND (ptime IS NULL OR ptime < %s) AND try_sent < %d "
        . "AND %s > time_to_send ORDER BY time_to_send",
        $this->mail_table,
        $this->db->quote($this->olddate),
        $this->try,
        $this->db->quote(date("Y-m-d H:i:s"))
    );
    $query = $this->db->modifyLimitQuery($query, $this->offset, $this->limit);
    if (DB::isError($query)) {
        return new Mail_Queue_Error(MAILQUEUE_ERROR_QUERY_FAILED,
            $this->pearErrorMode, E_USER_ERROR, __FILE__, __LINE__,
            'DB::modifyLimitQuery failed - '.$query->toString());
    }
    $res = $this->db->query($query);
    if (DB::isError($res)) {
        return new Mail_Queue_Error(MAILQUEUE_ERROR_QUERY_FAILED,
            $this->pearErrorMode, E_USER_ERROR, __FILE__, __LINE__,
            'DB::query failed - "'.$query.'" - '.$res->toString());
    }
    $this->_last_item = 0;
    $this->queue_data = array(); // reset buffer
    $id_increment = 0;
    $id_list = array();
    while ($row = $res->fetchRow(DB_FETCHMODE_ASSOC)) {
        if (!is_array($row)) {
            return new Mail_Queue_Error(MAILQUEUE_ERROR_QUERY_FAILED,
                $this->pearErrorMode, E_USER_ERROR, __FILE__, __LINE__,
                'DB::query failed - "'.$query.'" - '.$res->toString());
        }
        $this->queue_data[$this->_last_item] = new Mail_Queue_Body(
            $row['id'], $row['create_time'], $row['time_to_send'],
            $row['sent_time'], $row['id_user'], $row['ip'],
            $row['sender'], $row['recipient'],
            unserialize($row['headers']), unserialize($row['body']),
            $row['delete_after_send'], $row['try_sent']
        );
        $id_list[$id_increment++] = $row['id'];
        $this->_last_item++;
    }
    if (count($id_list) > 0) {
        // stamp the selected rows with this run's process time, so a
        // concurrent run (whose cutoff is $olddate) will skip them
        $query = sprintf("UPDATE %s SET ptime = %s WHERE id IN (%s)",
            $this->mail_table,
            $this->db->quote($this->ptime),
            implode(',', $id_list));
        $res = $this->db->query($query);
        if (DB::isError($res)) {
            return new Mail_Queue_Error(MAILQUEUE_ERROR_QUERY_FAILED,
                $this->pearErrorMode, E_USER_ERROR, __FILE__, __LINE__,
                'DB::query failed - "'.$query.'" - '.$res->toString());
        }
    }
    return true;
}
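The ptime stamp above still leaves a small window between the SELECT and the UPDATE in which a second run can read the same rows. A tighter variant is to claim rows with a single conditional UPDATE and then send only what the claim won. A minimal sketch (Python with sqlite3 for illustration; the mail_queue table, sent and claimed_by columns are hypothetical, not Mail_Queue's actual schema):

```python
import sqlite3

def claim_batch(conn, worker_id, limit):
    """Claim up to `limit` unsent, unclaimed rows in one atomic UPDATE."""
    with conn:  # one transaction: each row is claimed by at most one worker
        conn.execute(
            "UPDATE mail_queue SET claimed_by = ? "
            "WHERE claimed_by IS NULL AND id IN ("
            "  SELECT id FROM mail_queue"
            "  WHERE sent = 0 AND claimed_by IS NULL LIMIT ?)",
            (worker_id, limit))
    rows = conn.execute(
        "SELECT id FROM mail_queue WHERE claimed_by = ?", (worker_id,))
    return [r[0] for r in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mail_queue ("
             "id INTEGER PRIMARY KEY, sent INTEGER DEFAULT 0, claimed_by TEXT)")
for _ in range(6):
    conn.execute("INSERT INTO mail_queue (sent) VALUES (0)")

batch_a = claim_batch(conn, "cron-A", 4)  # first run claims 4 rows
batch_b = claim_batch(conn, "cron-B", 4)  # second run gets only the remaining 2
assert not set(batch_a) & set(batch_b)    # no row claimed twice
assert len(batch_a) + len(batch_b) == 6
```

Because the database serializes the two UPDATEs, a row can only move from unclaimed to claimed once, so overlapping cron runs split the queue instead of both sending all of it.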
 [2006-09-08 23:21 UTC] ben at nlcweb dot com
Can an admin verify the method posted above? I'm having the same trouble, with a client's customers receiving up to 10 duplicate copies of the same email because of this problem. I would like to hear an update from people who have tried it, or whether something is in the works for the Mail_Queue package...
 [2007-01-20 11:04 UTC] quipo (Lorenzo Alberton)
Hi, this is not strictly a Mail_Queue bug, but rather an unfortunate timing issue with cron and the smtp queue; still, I agree that Mail_Queue doesn't help in preventing this problem. The current Mail_Queue v.1.x can't be fixed, since the fix would require a BC break, and that's not allowed by PEAR rules for 'stable' packages. A new Mail_Queue2 is in the works, though, and it will provide a sort of row-level locking to avoid concurrency problems like this one.
 [2014-11-05 15:07 UTC] mscarda (Mike Score)
Hi everybody, I'm having this issue: sometimes my clients get a duplicate email, sometimes not. What can it be? How can I fix it? Can someone please help me? Thanks, Mike