I am emailing off my server and it's really slow. I think the problem is my logic, not the code.

 

Current speed is about 30-40 emails per minute. I need to get it up to at least 175/min, but I was really hoping for more like 700/min.

 

The server is a dual hexa-core 2.4 GHz machine with 32 GB RAM, dedicated to only this task. All server reports show low resource usage.

 

The logic flow is like this:

 

1. Get email content.

 

2. Loop through records in database.

 

3. Check the record against some other tables (e.g., the unsubscribe table).

 

4. If the record checks out, send the email via file_get_contents(), passing all the email content through GET variables.

 

5. The other script is a basic email script. See below:

<?php
//GET VAR INPUT
$to = htmlentities($_GET['to']);
$friendly_from = htmlentities($_GET['friendly_from']);
$email_from = htmlentities($_GET['email_from']);
$subject = htmlentities($_GET['subject']);
$text = htmlentities($_GET['text']);
$html = $_GET['html'];
$html = '<head><title>' . $subject . '</title></head><body>' . $html . '</body>';

//SEND EMAIL
# Setup mime boundary
$mime_boundary = 'Multipart_Boundary_x' . md5(time()) . 'x';

$headers  = "MIME-Version: 1.0\r\n";
$headers .= "Content-Type: multipart/alternative; boundary=\"$mime_boundary\"\r\n";
$headers .= "Content-Transfer-Encoding: 7bit\r\n";
$headers .= "Reply-To: " . $email_from . "\r\n";

# Add in plain text version
$body  = "--$mime_boundary\n"; # first assignment: initialize, don't append to an undefined var
$body .= "Content-Type: text/plain; charset=\"us-ascii\"\n";
$body.= "Content-Transfer-Encoding: 7bit\n\n";
$body.= $text;
$body.= "\n\n";

# Add in HTML version
$body.= "--$mime_boundary\n";
$body.= "Content-Type: text/html; charset=\"UTF-8\"\n";
$body.= "Content-Transfer-Encoding: 7bit\n\n";
$body.= $html;
$body.= "\n\n";

# End email
$body.= "--$mime_boundary--\n"; # <-- Notice trailing --, required to close email body for mime's

# Finish off headers
$headers .= "From: " . $friendly_from . " <" . $email_from . ">\r\n";
$headers .= "X-Sender-IP: " . $_SERVER['SERVER_ADDR'] . "\r\n";
$headers .= 'Date: ' . date('r') . "\r\n"; # RFC 2822 date format

//Mail it out
$response = mail($to, $subject, $body, $headers);

//RESPONSE (Success = 1, Fail = 0)
echo $response;
die();
?>

The entire thing runs from a cron job every minute, with a lock to prevent multiple instances, since the file usually executes for about 5 minutes.
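For reference, that kind of single-instance lock is commonly done with flock() on a dedicated lock file. A minimal sketch; the lock-file path and function name are my own, not taken from the script above:

```php
<?php
// Minimal single-instance lock sketch using flock().
// The lock file path is an assumption; any writable path works.
function acquire_run_lock($path)
{
    $fp = fopen($path, 'c');             // create if missing, don't truncate
    if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
        return false;                    // another instance holds the lock
    }
    return $fp;                          // keep the handle open for the whole run
}

$lock = acquire_run_lock('/tmp/mailer.lock');
if ($lock === false) {
    exit(0);                             // previous cron run is still going
}

// ... the sending loop would run here ...

flock($lock, LOCK_UN);
fclose($lock);
```

The lock releases automatically if the process dies, which is why flock() is usually preferred over a "does the lock file exist" check.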

 

I have tried turning off parts of the code and retesting, such as many of my database lookups, but as you would expect that only gains about 1% or something very little like that. The thing that takes a long time is the actual sending of the email. So how do I get that to go faster?
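One way to back that up with numbers instead of commenting code out is to bucket the elapsed time per phase with microtime(true). A sketch; the usleep() calls are stand-ins for the real DB and mail work:

```php
<?php
// Time each phase of the loop separately to see where the seconds go.
$timers = array('db' => 0.0, 'send' => 0.0);

function timed(callable $fn, $bucket, array &$timers)
{
    $start = microtime(true);
    $result = $fn();
    $timers[$bucket] += microtime(true) - $start;
    return $result;
}

// Inside the real loop, wrap each phase; usleep() is a stand-in here.
for ($i = 0; $i < 3; $i++) {
    timed(function () { usleep(5000);  /* DB lookups */ }, 'db', $timers);
    timed(function () { usleep(20000); /* mail()/HTTP send */ }, 'send', $timers);
}

printf("db: %.3fs  send: %.3fs\n", $timers['db'], $timers['send']);
```

Printing the buckets at the end of each cron run would show exactly how much of the 5 minutes is spent waiting on sends versus lookups.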

https://forums.phpfreaks.com/topic/297278-email-faster-its-really-slow/

Well, I know of email systems that send off servers much faster, though they may not be using PHP. I tried using multiple instances and it went somewhat faster: I was able to achieve about 70/min, but I couldn't get it to stay at that rate; it fluctuated wildly as the instances started and stopped. I am not opposed to switching languages or whatever to get it going. I really need to figure this out.

 

Totally different realm of code, but MailChimp uses PHP and they are mailing 400 million per day for their clients. So it has to be possible.

Edited by brentman

This is not a PHP problem. PHP simply requests that the system send mail; it has no control after that.

 

The problem is likely with your mail server. I'd recommend you try an SMTP service to send that kind of volume. Setting up the infrastructure for that is not trivial.

How many emails do you actually need to send per month? Because 700/min is around 30 million per month. What are you doing that requires that much email volume?

 

You're going to run into tons of headaches building this out yourself. You'll most certainly be hitting spam traps with that kind of volume.

 

Personally, I don't mess with email myself. I use an SMTP service or pay someone else. Email sucks.

 

My advice is to find a consultant who has experience with large scale email infrastructure and go from there.


Given that this is starting to sound like I'm helping a spammer, I'm a little loath to comment; however:

PHP is using a heck of a lot of overhead opening and closing a connection to the SMTP server. If you want to scale, have PHP open a socket and speak directly to the SMTP server, keeping the connection open for a given number of mails (test that number against your SMTP server; you do NOT want to choke it to death).

If you know how to speak SMTP, this shouldn't be too difficult.
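A sketch of what that could look like: build the per-message SMTP command sequence separately, then push many messages down one fsockopen() connection. The host, port, and EHLO name are placeholders; reply codes are read but not checked, and real code would also dot-stuff body lines beginning with ".":

```php
<?php
// Build the SMTP command sequence for one message. No QUIT here:
// the connection stays open for the next message.
function smtp_commands($from, $to, $data)
{
    return array(
        "MAIL FROM:<$from>",
        "RCPT TO:<$to>",
        "DATA",
        $data . "\r\n.",   // body terminated by a line containing only a dot
    );
}

// Push a whole batch of messages through ONE connection.
function smtp_send_batch(array $messages, $host = 'localhost', $port = 25)
{
    $fp = fsockopen($host, $port, $errno, $errstr, 10);
    if (!$fp) {
        return false;
    }
    fgets($fp);                          // 220 greeting
    fputs($fp, "EHLO mysender\r\n");
    fgets($fp);                          // (real code: read the full multiline reply)
    foreach ($messages as $msg) {
        foreach (smtp_commands($msg['from'], $msg['to'], $msg['data']) as $cmd) {
            fputs($fp, $cmd . "\r\n");
            fgets($fp);                  // read each reply; real code checks the codes
        }
    }
    fputs($fp, "QUIT\r\n");
    fclose($fp);
    return true;
}
```

The win is exactly what dalecosp describes: the TCP/EHLO handshake happens once per batch instead of once per message.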

Edited by dalecosp

And the other thing is probably this:

 If record checks then send email by file_get_contents passing all the email content through get variables.

You're doing an HTTP GET to a PHP script for this?  Ouch!

This will at least DOUBLE your throughput:  format your file a la "a template", read the file ONCE into a variable, and str_replace() the DB values into the template var.

You're going to run into tons of headaches building this out yourself. You'll most certainly be hitting spam traps with that kind of volume.

 

I am a professional emailer. I have all that under control. I've been out for a while, but now I am back, and I don't want to spend $1000 to make $1000, you know?

 

Given that this is starting to sound like I'm helping a spammer, I'm a little loath to comment;...

 

No, I email to people's lists on their behalf. No spamming.

 

 

And the other thing is probably this:

You're doing an HTTP GET to a PHP script for this?  Ouch!

This will at least DOUBLE your throughput:  format your file a la "a template", read the file ONCE into a variable, and str_replace() the DB values into the template var.

I am not totally sure what you are suggesting instead. The reason I am doing this is so that I can send email from two different domains.

I am not totally sure what you are suggesting instead. The reason I am doing this is so that I can send email from two different domains.

Perhaps I misunderstand you.  What it sounds like you are saying is this:  A loop is running, and for each item in the loop you call file_get_contents() on an external webpage, maybe something like this:

 

$mail_body = file_get_contents("http://mysite.com/template_generator.php?firstname=$firstname&somevar=$variable");
If that is the case, it's terribly inefficient. You need a template in memory as a variable:

 

$mail_template = <<<EOT

Dear {firstname},

Are you still interested in purchasing the {name_of_product} from our web site?

Thanks,

MyCompany.com

EOT;
You should then call str_replace on this variable with the data for each iteration of the loop.

 

Now it sounds like *maybe* the problem is that you can't get all your data from more than one location all up front, which is also part of what I'd be doing in this sort of situation. If you can't, that is a bit of a problem. I don't freelance much, but I do need to get my convertible out of the $hop with its $hiny new engine and exhau$t $y$tem ...

This is what I am doing:

//bunch of logic up here

$to = 'jones@yahoo.com';
$from = 'admin';
$subject = 'Hello, quick reminder about your account';
$body = 'body here';

$random_domain = rand(1,2);

if ($random_domain == 1) {
     $domain = 'mysite.com';
}
else {
     $domain = 'myblog.net';
}

$query = http_build_query(array(
    'to'      => $to,
    'from'    => $from . '@' . $domain,
    'subject' => $subject,
    'body'    => $body,
)); // urlencode the values; raw spaces in the subject would break the URL

file_get_contents('http://' . $domain . '/?' . $query);

//end script

Then the file I am getting the contents of is what I pasted above in the prior message.

 

This way the message is all generated in one file, but two different domains send it out. Each of myblog.net and mysite.com has the actual mail() script on it.

Edited by brentman

Well, that's going to be your bottleneck right there. The way you're doing it, you have to wait for the HTTP response before your script finishes. That could be anywhere from 100ms to several seconds.

 

What you want to be doing is asynchronous requests, so that your script can end and start again before the HTTP call is finished. This should speed up your script considerably. There are many ways to do it, and many ways to scale it. You could start off with your one server, but if that is taking too long you can look into something like a Gearman cluster.

Well, that's going to be your bottleneck right there. The way you're doing it, you have to wait for the HTTP response before your script finishes. That could be anywhere from 100ms to several seconds.

 

What you want to be doing is asynchronous requests, so that your script can end and start again before the HTTP call is finished. This should speed up your script considerably. There are many ways to do it, and many ways to scale it. You could start off with your one server, but if that is taking too long you can look into something like a Gearman cluster.

 

I don't need to go crazy. It should be possible with one machine. I am using like 1% of the machine's power! Would using cURL or fsockopen be faster? I have never seen asynchronous requests done, so I don't really know where to begin logically to make that work. Google has not been extremely helpful with this either.

 

Found this that looks interesting as well: http://php.net/curl_multi_init
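For what it's worth, curl_multi (per the link above) is one plausible shape for this: queue several requests, drive them all concurrently, and collect the bodies at the end. A sketch; the function name, URLs, and timeouts are arbitrary:

```php
<?php
// Fire a batch of HTTP requests concurrently with curl_multi and
// return the response bodies in the same order as the input URLs.
function fetch_concurrently(array $urls)
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_CONNECTTIMEOUT => 2,
            CURLOPT_TIMEOUT        => 10,
        ));
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    do {                                 // drive all transfers to completion
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);      // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = array();
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch);  // empty on failure
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

In this thread's setup, each URL would be one call to the mail script, so the per-request waits overlap instead of serializing.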

Edited by brentman

Perhaps a better question: are you even reading my posts? You DO NOT want to call file_get_contents() more than once, even if it is a local file on the same server.

 

<?php

// start by getting your user/recipient data ... from DB?
$data_array = however_you_get_user_data( $from_db );

// Here's your basic email template.  This can be a var, HEREDOC string, whatever
// you could use file_get_contents() to get this ... ONE TIME, preferably from
// a local file.  It has placeholders (I've chosen to use a curly-bracket syntax
// similar to SMARTY).

$template = "

Dear {user},

   Welcome to {name_of_site} ... blah, blah, etc.

";

// this array holds the tags from the above template which are the merge fields
$replaced_array = array( "{user}", "{name_of_site}" );

//$data_array is assumed to be your multi-dimensional array of user datas...
foreach ( $data_array as $user_data_array ) {

   // of course $user_data_array has to have a proper format, which would be
   // based on the needed merge fields from the template

   $template_to_send = str_replace($replaced_array, $user_data_array, $template );

   //MAILer code goes here.

}
If you choose to speak ESMTP directly to a socket, you'd need to make up an array of templates and submit them $n at a time to the mail server yourself (but it would be much faster again):

 

<?php

// start by getting your user/recipient data ... from DB?
$data_array = however_you_get_user_data( $from_db );

// Here's your basic email template
$template = "

Dear {user},

   Welcome to {name_of_site} ... blah, blah, etc.

";

// this array holds the tags from the above template which are the merge fields
$replaced_array = array( "{user}", "{name_of_site}" );

$counter = 1;
$template_to_send = array();

//$data_array is assumed to be your multi-dimensional array of user datas...
foreach ( $data_array as $user_data_array ) {

   // of course $user_data_array has to have a proper format, which would be
   // based on the needed merge fields from the template

   $template_to_send[$counter] = str_replace($replaced_array, $user_data_array, $template );
   $counter++;

   if ($counter == 100 ) { //sending 100 pieces to the socket at a time
      foreach ($template_to_send as $message) {
         //MAILer code goes here, fsockopen, fputs("EHLO hostname"), etc. etc.

      } //foreach
      $counter = 1;                //reset counter
      $template_to_send = array(); //and empty the batch, or it gets re-sent
   } //if

}//foreach $data_array ..

// flush any final partial batch left after the loop
foreach ($template_to_send as $message) {
   //MAILer code goes here too
}
Edited by dalecosp

I replaced file_get_contents with the following cURL:

// GET Options
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_URL => $api_string,
    CURLOPT_USERAGENT => 'Codular Sample cURL Request'
));
// Send Request
$resp = curl_exec($curl);
curl_close($curl);
$htmlAttr = ($resp === FALSE) ? 'false' : 'success';

Improvement was negligible... in the ballpark of maybe a 4% speed increase.

Because you're relying on HTTP transport to get DATA ... 

 

Lemme ask you this.  Say I wanted to get 500,000 names from a list.

 

Should I query a local database server or load a web page 500,000 times?

 

Sure. I just commented it out and ran it again, and it was only about 10% faster. That still isn't my main bottleneck.

 

Check this:

 

1 Instance 1.97 Seconds/Email | Single Script Speed 1.60 Seconds/Email

2 Instances 1.37 Seconds/Email | Single Script Speed 1.98 Seconds/Email each

3 Instances 1.10 Seconds/Email | Single Script Speed 2.51 Seconds/Email each

4 Instances 1.04 Seconds/Email | Single Script Speed 3.10 Seconds/Email each

 

They are measured slightly differently (so don't try to run math on it), but the trend is what is bothering me. Why would running the script once vs. twice vs. three vs. four times make a difference in the performance of each individual script, unless as a whole they are maxing something out?

 

A professional emailer? Sounds like a spammer to me. Or is that an 'amateur' emailer?

 

There is no difference between a spammer and a regular emailer, other than that spammers don't have people opt in to be contacted and legit people do. I have people opt in to me, and I provide a way to unsubscribe from the newsletters I am sending; therefore, not a spammer. Let's talk about the topic, please.
