Search the Community

Showing results for tags 'speed'.

Found 3 results

  1. I am emailing off my server and it's really slow. I think the problem is my logic, not the code. The current speed is about 30-40 emails per minute. I need to get it up to at least 175/min, but I was really hoping for more like 700/min. The server is a dual hexa-core 2.4 GHz machine with 32 GB of RAM, dedicated to only this task. All server reports show low resource usage. The logic flow is like this:

     1. Get the email content.
     2. Loop through the records in the database.
     3. Check each record against some other tables (e.g. the unsubscribe table).
     4. If the record checks out, send the email by file_get_contents, passing all the email content through GET variables.
     5. The other script is a basic email script. See below:

     <?php
     // GET VAR INPUT
     $to            = htmlentities($_GET['to']);
     $friendly_from = htmlentities($_GET['friendly_from']);
     $email_from    = htmlentities($_GET['email_from']);
     $subject       = htmlentities($_GET['subject']);
     $text          = htmlentities($_GET['text']);
     $html          = $_GET['html'];
     $html = '<head><title>' . $subject . '</title></head><body>' . $html . '</body>';

     // SEND EMAIL
     # Set up the MIME boundary
     $mime_boundary = 'Multipart_Boundary_x' . md5(time()) . 'x';

     $headers  = "MIME-Version: 1.0\r\n";
     $headers .= "Content-Type: multipart/alternative; boundary=\"$mime_boundary\"\r\n";
     $headers .= "Content-Transfer-Encoding: 7bit\r\n";
     $headers .= "Reply-To: " . $email_from . "\r\n";

     # Add in the plain text version
     $body  = "--$mime_boundary\n";                                   # initialize, rather than append to an undefined variable
     $body .= "Content-Type: text/plain; charset=\"us-ascii\"\n";     # was charset="charset=us-ascii"
     $body .= "Content-Transfer-Encoding: 7bit\n\n";
     $body .= $text;
     $body .= "\n\n";

     # Add in the HTML version
     $body .= "--$mime_boundary\n";
     $body .= "Content-Type: text/html; charset=\"UTF-8\"\n";
     $body .= "Content-Transfer-Encoding: 7bit\n\n";
     $body .= $html;
     $body .= "\n\n";

     # End the email
     $body .= "--$mime_boundary--\n";   # <-- note the trailing --, required to close the MIME body

     # Finish off the headers
     $headers .= "From: " . $friendly_from . " <" . $email_from . ">\r\n";
     $headers .= "X-Sender-IP: " . $_SERVER['SERVER_ADDR'] . "\r\n";  # quotes added around the array key
     $headers .= 'Date: ' . date('r') . "\r\n";                       # RFC 2822 date; the original n/d/Y g:i A is not a valid Date header

     // Mail it out
     $response = mail($to, $subject, $body, $headers);

     // RESPONSE (Success = 1, Fail = 0)
     echo $response;
     die();
     ?>

     The entire thing runs on a cron job every minute, with a lock to prevent multiple instances, since the file usually executes for about 5 minutes. I have tried turning off parts of the code and retesting, such as many of my database lookups, but as you would expect each change only gains about 1% or some other very small amount. The thing that takes a long time is the actual sending of the email. So how do I get that to go faster? (See the first sketch after these results for one way to cut the per-message overhead.)
  2. My web page loops through and echoes all 50,000 entries from a several-column MySQL database (part number, price, description, etc.). It is tedious for me, because it takes about a full minute for my browser to completely load it. I had an idea: use ob_start() at the beginning of the script, and insert an ob_end_flush() call in my while loop to get executed every 50 loops, like this:

     <?php
     ob_start();
     ## access the big MySQL database and output it to the browser
     $i = 0;
     while ($a_row = mysql_fetch_assoc($result)) {
         # echo output to the browser
         if ($i % 50 == 0) {
             ob_end_flush();
         }
     }
     ?>

     That cut the time down from 60 seconds to about a second and a half. The problem is, the results are goofy (several MySQL rows are missing and didn't get output to the browser). Is there a better way? Or am I on the right track? And if I am on the right track, what can I do to fix this? (See the second sketch after these results.) Thank you.
  3. Just to preface this: I have been part of two operations (one as the developer, one with a third-party company developing) where the business was forced to cease due to difficulties in database load balancing, and lots of people lost lots of money. I am talking about big data and high performance needed at the same time. So for my new project, I am going to try to design it around a forever-expanding infrastructure of servers, but that means setting it up correctly from the beginning. I have put together some ideas for possible ways to split the database load across multiple servers. Any input on which idea(s) are best would be great, so I know which to explore further. Also, any relevant info on this type of thing would be helpful, as this is the first time I am personally doing this. (A shard-routing sketch follows these results.) Thanks!
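
Sketch for result 1: the per-message cost in that flow is one full HTTP round-trip (file_get_contents() spawning a fresh PHP request) plus one mail() call, which typically forks a new sendmail process per message. A common way to cut both is to send in-process over a single persistent SMTP connection. This is a minimal sketch, not the poster's code: it assumes PHPMailer installed via Composer, a local MTA or relay on localhost, a placeholder sender address, and a $recipients array that has already been filtered against the unsubscribe table.

<?php
require 'vendor/autoload.php';

use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

$mail = new PHPMailer(true);              // true = throw exceptions on failure
$mail->isSMTP();
$mail->Host          = 'localhost';       // placeholder: your MTA or smarthost
$mail->SMTPKeepAlive = true;              // reuse one SMTP connection for the whole batch
$mail->setFrom('news@example.com', 'Friendly From');   // placeholder sender
$mail->Subject = $subject;
$mail->isHTML(true);
$mail->Body    = $html;                   // PHPMailer builds the multipart/alternative body
$mail->AltBody = $text;                   // plain-text alternative part

foreach ($recipients as $to) {            // $recipients: pre-filtered address list
    try {
        $mail->addAddress($to);
        $mail->send();
    } catch (Exception $e) {
        error_log("Send to $to failed: " . $mail->ErrorInfo);
    }
    $mail->clearAddresses();              // keep the connection, drop the recipient
}
$mail->smtpClose();
?>

Handing the batch to a local queueing MTA such as Postfix has a similar effect: the PHP loop only enqueues messages, and the MTA handles delivery concurrency and retries.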
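Sketch for result 2: ob_end_flush() both flushes and destroys the output buffer, so after its first call there is no buffer left for the later calls to flush; note also that $i is never incremented in the posted loop, so the condition fires on every row rather than every 50. The usual pattern keeps the buffer alive with ob_flush() plus flush(). A minimal sketch, keeping the post's mysql_* call and using hypothetical column names:

<?php
ob_start();
$i = 0;
while ($a_row = mysql_fetch_assoc($result)) {
    echo $a_row['part_number'] . ' ' . $a_row['price'] . ' ' . $a_row['description'] . "<br>\n";
    if (++$i % 50 == 0) {   // increment, so this really is every 50 rows
        ob_flush();         // send the buffer's contents without destroying the buffer
        flush();            // push PHP's and the web server's buffers to the client
    }
}
ob_end_flush();             // one final flush-and-close after the loop
?>

If the page is for browsing rather than a bulk export, paging the query with LIMIT/OFFSET (say 100 rows per page) avoids rendering 50,000 rows at all and is usually the bigger win.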
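Sketch for result 3: one common starting point for splitting database load is horizontal sharding, where a stable key deterministically routes each request to one of N database servers. This is only an illustrative sketch, not a recommendation of any specific idea from the post; the key, host names, and credentials are hypothetical, and real deployments usually add a shard directory or consistent hashing so servers can be added without remapping every key.

<?php
// Hypothetical shard map: each entry is one database server.
$shards = [
    ['host' => 'db1.example.com', 'name' => 'app_shard0'],
    ['host' => 'db2.example.com', 'name' => 'app_shard1'],
    ['host' => 'db3.example.com', 'name' => 'app_shard2'],
];

// Simple modulo routing: the same ID always lands on the same shard.
function shardFor($customerId, array $shards) {
    return $shards[$customerId % count($shards)];
}

$shard = shardFor(42, $shards);                                    // route customer 42
$db = new mysqli($shard['host'], 'user', 'pass', $shard['name']);  // placeholder credentials
// ... all queries for customer 42 now touch only its shard
?>

Modulo routing is the simplest scheme but requires rebalancing when shards are added; consistent hashing or a lookup table trades complexity for easier growth. For read-heavy loads, master-replica replication (all writes to one server, reads spread across replicas) is often tried before sharding.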