fanfavorite

Members · 344 posts · Last visited: Never
Everything posted by fanfavorite

  1. Wow, the Indexes made the query run in 1.8 seconds! Thank you so much!
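     The exact indexes aren't quoted in the thread, but based on the join and lookup columns in the queries further down, a likely set would be along these lines (a sketch only; the index names are assumptions):

         ALTER TABLE Entries ADD INDEX idx_entries_email (Email);
         ALTER TABLE ReferAFriend ADD INDEX idx_raf_entryid (EntryID);
         ALTER TABLE ReferAFriend ADD INDEX idx_raf_email (Email);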
  2. Ok, will try the indexes. I usually do add them, but in this case we are using all the fields; actually, now that it is all in one query, we aren't using ID. The reason I quote it this way is that each value has to be within double quotes, and I use single quotes for the PHP string to avoid having to escape all the double quotes. You're right though. Thanks for the help. I think I need to speak to a server administrator about MySQL performance for this stuff now.
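     A side note on the quoting mentioned above: PHP's fputcsv() adds the quoting and escaping itself (it encloses fields whenever they need it, which most CSV consumers accept), so the hand-built quoted string could be dropped. A minimal sketch, assuming the same $f file handle and $row row array used in the export code further down this page:

         // fputcsv quotes and escapes each field as needed before writing the line
         fputcsv($f, array($row['FirstName'], $row['LastName'], $row['Address'], $row['Address2'],
                           $row['City'], $row['Province'], $row['PostalCode'], $row['Phone'],
                           $row['DOB'], $row['Email']));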
  3. That works much faster, thanks. Still not sure how to get this to work on the large database though. Running it through phpMyAdmin, and through the PHP script shown above (only with the adjusted queries), does the same thing, and I guess it times out before it outputs anything.
  4. Do you know if it's a setting I could temporarily change? Actually it is not unique, as people can register with different UPC codes as well.
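     If the limit being hit is on the PHP side, the usual temporary change is the execution-time limit, which a long-running export script can lift for itself. A minimal sketch, assuming the host allows overriding these at runtime:

         // Let this export request run without PHP killing it at max_execution_time
         set_time_limit(0);               // 0 means no time limit for this request
         ini_set('memory_limit', '512M'); // only needed if large result sets are buffered in PHP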
  5. The reason for the change is that EntryID is what links the Entry to the ReferAFriend. Then from there, we need to make sure that the email address in ReferAFriend is in Entries: people can refer lots of people, but they only get a bonus entry if that email address has actually entered. Sorry if it's confusing; I appreciate all the help. Do you happen to know why I can't even get one record output? I am running this query through MySQL as well, and it runs fast with smaller tables, but it takes forever on this large table and I am sure it will time out at some point.
  6. Thanks, I had to change it to:

         SELECT a.* FROM Entries a
         UNION ALL
         SELECT a.* FROM Entries a WHERE UPC != 0
         UNION ALL
         SELECT a.* FROM Entries a
         INNER JOIN ReferAFriend b ON b.EntryID = a.ID
         WHERE (SELECT count(ID) FROM Entries WHERE Email = b.Email) > 0

     But it works as one query now. Still having slowness issues though. If you look at the PHP code above, I changed it to:

         // Open the CSV file and write the header row
         $f = fopen("drawdata.csv", "w");
         fwrite($f,'"First Name","Last Name","Address","Address 2","City","Province","Postal Code","Phone","Date of Birth","Email"');
         fwrite($f,"\n");
         $q = mysql_query("SELECT a.* FROM Entries a UNION ALL SELECT a.* FROM Entries a WHERE UPC != 0 UNION ALL SELECT a.* FROM Entries a INNER JOIN ReferAFriend b ON b.EntryID = a.ID WHERE (SELECT count(ID) FROM Entries WHERE Email = b.Email) > 0");
         // Fetch each result row into $row so the file handle in $f is not overwritten
         while ($row = mysql_fetch_array($q)) {
             fwrite($f,'"'.$row['FirstName'].'","'.$row['LastName'].'","'.$row['Address'].'","'.$row['Address2'].'","'.$row['City'].'","'.$row['Province'].'","'.$row['PostalCode'].'","'.$row['Phone'].'","'.$row['DOB'].'","'.$row['Email'].'"');
             fwrite($f,"\n");
         }
         fclose($f);

     With this, it outputs the header into the file and then keeps running, but nothing else gets put in the document. It is not the code, because I made a smaller Entries database and that works fine. Is there a setting or something I can change in the my.ini?
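     On the my.ini question: for MyISAM tables of this size, the settings most often pointed at are the key buffer (the index cache) and the per-connection sort/join buffers. A sketch of the relevant [mysqld] section; the values are illustrative assumptions, not recommendations for this specific server:

         [mysqld]
         key_buffer_size  = 256M   # cache for MyISAM index blocks
         sort_buffer_size = 4M     # per-connection sort buffer
         join_buffer_size = 4M     # per-connection buffer for joins that cannot use an index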
  7. haha I was afraid of that. Ok, basically "Entries" is all the registration data and "ReferAFriend" is all the people that have been sent a referral by one of the "Entries". For each Entry, two things are checked: whether it has a UPC, and whether any of its ReferAFriend email addresses are themselves entries. Every time one of those matches, the person gets a bonus entry into the draw, so another row needs to be added to the final output. So for example, say you registered:

         Name,Email,UPC
         Keith,keith@phpfreaks.com,12345

     That Entry in Entries would appear in the final output as:

         Name,Email
         Keith,keith@phpfreaks.com
         Keith,keith@phpfreaks.com

     That is because you have a UPC. Now if you referred anyone and they are in Entries, then you get more rows. Hope that explains it better. Thanks.
  8. I am trying to get a list of all entries and duplicate each entry as many times as it is eligible. The problem is the queries take forever (hours upon hours). There are 250,000 rows in Entries and 50,000 rows in ReferAFriend. I can download the database no problem; it's the second query that seems to be causing the timing issue. I can even run a query like:

         SELECT b.*,
           (IF((SELECT count(a.Email) FROM ReferAFriend a WHERE a.EntryID = b.ID AND (SELECT count(ID) FROM Entries WHERE Email = a.Email) > 0 GROUP BY a.Email),
               (SELECT count(a.Email) FROM ReferAFriend a WHERE a.EntryID = b.ID AND (SELECT count(ID) FROM Entries WHERE Email = a.Email) > 0 GROUP BY a.Email),
               0) + IF(b.UPC != 0,2,1)) as EmailCount
         FROM Entries b

     This query causes no issues; the problem is that it puts the count into EmailCount. Instead of:

         Name,Other,EmailCount
         MyName,MyOther,3

     I want:

         Name,Other
         MyName,MyOther
         MyName,MyOther
         MyName,MyOther

     The PHP code and tables are below. Thanks in advance for the help.

         $f = fopen("drawdata.csv", "w");
         fwrite($f,'"First Name","Last Name","Address","Address 2","City","Province","Postal Code","Phone","Date of Birth","Email"');
         fwrite($f,"\n");
         $q = mysql_query("SELECT c.*, IF(c.UPC != 0,2,1) as EmailCount FROM Entries c");
         // Fetch each entry into $row so the file handle in $f is not overwritten
         while ($row = mysql_fetch_array($q)) {
             // Count referred friends who have themselves entered
             $qa = mysql_query("SELECT count(DISTINCT a.Email) as EmailCount FROM ReferAFriend a, Entries b WHERE a.Email = b.Email AND a.EntryID = '".$row['ID']."'");
             $fa = mysql_fetch_array($qa);
             // Write one CSV line per entry the person is eligible for
             for ($count = 1; $count <= $row['EmailCount'] + $fa['EmailCount']; $count++) {
                 fwrite($f,'"'.$row['FirstName'].'","'.$row['LastName'].'","'.$row['Address'].'","'.$row['Address2'].'","'.$row['City'].'","'.$row['Province'].'","'.$row['PostalCode'].'","'.$row['Phone'].'","'.$row['DOB'].'","'.$row['Email'].'"');
                 fwrite($f,"\n");
             }
         }
         fclose($f);

         CREATE TABLE IF NOT EXISTS `Entries` (
           `ID` int(10) NOT NULL auto_increment,
           `FirstName` varchar(200) NOT NULL,
           `LastName` varchar(200) NOT NULL,
           `Address` varchar(200) NOT NULL,
           `Address2` varchar(200) NOT NULL,
           `City` varchar(100) NOT NULL,
           `Province` char(2) NOT NULL,
           `PostalCode` varchar(7) NOT NULL,
           `Phone` varchar(20) NOT NULL,
           `DOB` date NOT NULL,
           `Email` varchar(200) NOT NULL,
           `UPC` varchar(15) NOT NULL,
           `SignUp` tinyint(1) NOT NULL,
           PRIMARY KEY (`ID`)
         ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=245135;

         CREATE TABLE IF NOT EXISTS `ReferAFriend` (
           `ID` int(10) NOT NULL auto_increment,
           `EntryID` int(10) NOT NULL,
           `Email` varchar(200) NOT NULL,
           PRIMARY KEY (`ID`)
         ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=49235;
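     For reference, the `(SELECT count(ID) FROM Entries WHERE Email = ...) > 0` pattern used here (and in the UNION query further up) is usually the bottleneck at 250,000 rows, because it counts every match instead of stopping at the first one. A sketch of the referral check written with EXISTS instead, using the same tables and columns; whether this is where the time actually goes is an assumption that EXPLAIN on the real query would confirm:

         -- "Has the referred email itself entered?" via EXISTS instead of count(...) > 0
         SELECT a.*
         FROM Entries a
         INNER JOIN ReferAFriend b ON b.EntryID = a.ID
         WHERE EXISTS (SELECT 1 FROM Entries e WHERE e.Email = b.Email);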
  9. Perhaps you can look at a sample of the code and let me know how I would optimize it better? Thanks.

         <?
         $fn = "regdata.xls";

         // Build the Excel-flavoured HTML document head
         $header = '';
         $header .= '<html xmlns:o="urn:schemas-microsoft-com:office:office"' . "\n";
         $header .= 'xmlns:x="urn:schemas-microsoft-com:office:excel"' . "\n";
         $header .= 'xmlns="http://www.w3.org/TR/REC-html40">' . "\n";
         $header .= '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">' . "\n";
         $header .= '<html>' . "\n";
         $header .= '<head>' . "\n";
         $header .= '<meta http-equiv="Content-type" content="text/html;charset=utf-8" />' . "\n";
         $header .= '<style id="Classeur1_16681_Styles">' . "\n";
         $header .= '</style>' . "\n";
         $header .= '</head>' . "\n";
         $header .= "<body>" . "\n";
         $header .= '<div id="Classeur1_16681" align=center x:publishsource="Excel">';

         // Table header row
         $data = '';
         $data .= "\n" . '<table x:str border=0 cellpadding=0 cellspacing=0 width=100% style="border-collapse: collapse">';
         $data .= "\n\t" . '<tr>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>First Name</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Last Name</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Address</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Address 2</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>City</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Province</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Postal Code</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Phone</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Date of Birth</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Email</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>UPC</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Sign Up</b></td>';
         $data .= "\n\t\t" . '<td class=xl2216681 nowrap><b>Referrals</b></td>';
         $data .= "\n\t" . '</tr>';

         // One output row per unique email address
         $q = mysql_query("SELECT *,max(SignUp) as SignUp,group_concat(UPC) as UPC,group_concat(ID) as ID FROM Entries GROUP BY Email");
         while ($f = mysql_fetch_array($q)) {
             $data .= "\n\t" . '<tr>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['FirstName'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['LastName'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['Address'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['Address2'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['City'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['Province'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['PostalCode'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['Phone'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['DOB'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['Email'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['UPC'].'</td>';
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$f['SignUp'].'</td>';
             // Extra query per row: gather the distinct referral emails for this person's entry IDs
             $qa = mysql_query("SELECT group_concat(Distinct Email) as Email FROM ReferAFriend WHERE EntryID IN (".$f['ID'].")");
             $fa = mysql_fetch_array($qa);
             $data .= "\n\t\t" . '<td class=xl2216681 nowrap>'.$fa['Email'].'</td>';
             $data .= "\n\t" . '</tr>';
         }
         $data .= "\n" . '</table>';
         $data .= "\n" . '</div>';
         $data .= "\n" . '</body>';
         $data .= "\n" . '</html>';

         // Force the browser to download the result as an .xls file
         header("Pragma: public");
         header("Expires: 0");
         header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
         header("Content-Type: application/force-download");
         header("Content-Type: application/octet-stream");
         header("Content-Type: application/download");
         header("Content-Disposition: attachment;filename=".$fn);
         header("Content-Transfer-Encoding: binary");
         print $header.$data;
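     The main cost in this export is usually the extra mysql_query inside the while loop, which fires once per row. A sketch of folding the referral lookup into the main query with a LEFT JOIN, so the loop only formats rows; the ReferralEmails alias is an assumption, and like the original query this relies on MySQL's relaxed GROUP BY behaviour:

         -- One query for everything: referral emails joined in instead of queried per row
         SELECT e.*, MAX(e.SignUp) AS SignUp,
                GROUP_CONCAT(DISTINCT e.UPC) AS UPC,
                GROUP_CONCAT(DISTINCT r.Email) AS ReferralEmails
         FROM Entries e
         LEFT JOIN ReferAFriend r ON r.EntryID = e.ID
         GROUP BY e.Email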
  10. Thanks guys. I think I will go the route of generating the file and then having a link display after the progress completes. I've never tried to store that much data in a session variable. Not even sure it will store properly or what the performance on the server would be.
  11. Well this isn't for one particular case, but some take over a minute to produce because the database is so large.
  12. Well I can, I just wanted to avoid the file actually being stored, as I would then have to put a method of deleting it afterwards in place. Currently the files are just output on the fly. Is storing the file the only way I will be able to do this?
  13. Is there any way to have HTML output and then clear the output to allow for headers to be changed, while the PHP file is still executing? The reason I ask is that I am trying to show a progress bar while an Excel file is being created. So while the file is being created by PHP, I want to output a progress bar to the browser, and then once it is complete, I need to set headers so that the browser downloads it as an Excel file. Is this possible, or does anyone else have any ideas? Should I maybe save the file on the web server temporarily and then redirect to the file afterwards? Any help would be appreciated. Thanks, -JC
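     A minimal sketch of the "save temporarily, then redirect" idea raised at the end of this post; the temp location, the $header/$data strings (from the export script above), and the cleanup strategy are all assumptions:

         // Progress markup has already been flushed to the browser while building the report
         $tmp = 'exports/regdata_' . time() . '.xls';            // assumed temp path inside the web root
         file_put_contents($tmp, $header . $data);               // write the finished report to disk
         echo '<script>window.location = "' . $tmp . '";</script>'; // then send the user to it
         // Old files can be removed afterwards by a cron job or an unlink() on the next request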
  14. Ok, that makes sense. Do you have any sample php code to do that? Thanks.
  15. Try the utf8_encode function (http://php.net/manual/en/function.utf8-encode.php) on the text coming from the MySQL database.
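     A one-line sketch of what that looks like in practice, assuming a $row array fetched from a latin1 table:

         // Convert ISO-8859-1 text from MySQL to UTF-8 before printing it
         echo utf8_encode($row['FirstName']);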
  16. Thanks, I didn't mention the asp.net thing before, so you didn't miss anything! Anyways, I tried the cURL option, but that doesn't seem to be working:

         $ch = curl_init('https://portal.ourdomain.com/commonheader.php');
         curl_setopt($ch, CURLOPT_COOKIEJAR, "cookie.txt");
         curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt");
         curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
         $result = @curl_exec($ch);
         echo $result;
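     Since @curl_exec hides failures, one way to see why it "doesn't seem to be working" is to drop the @ and check the error. A minimal sketch; the SSL line is only an assumption about one common cause (a certificate the portal server can't verify):

         $result = curl_exec($ch);
         if ($result === false) {
             echo 'cURL error: ' . curl_error($ch);   // e.g. SSL, DNS, or connection problems
         }
         // If a self-signed certificate turns out to be the problem, this (insecure) test confirms it:
         // curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
         curl_close($ch);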
  17. Thanks. cURL would probably work for the servers supporting PHP and cURL; the problem is one server is asp.net. There is probably something like cURL for asp.net as well, so this may be an option. I am hoping there may be an easier option, but if not, I will work with this. Thanks.
  18. When doing the chunked method, did you get any type of error such as PHP timeout, etc? Have you tried fread?
  19. Hey All, what I am trying to do is link a whole bunch of our sites together so they are seamless for users, even though they are actually on different servers. For example, we have portal.ourdomain.com which runs our billing software, www.ourdomain.com which is our regular website, cp.ourdomain.com which is our control panel, and mail.ourdomain.com which is our webmail login. We have a common header, which needs the session variables and such from the portal site. So we created a separate header file (on the portal site) that could be used by the other sites to read in. Now the separate file works when going to it directly at portal.ourdomain.com/commonheader.php; however, when it is included on the external pages, it doesn't get the number of items in the cart or the logged-in state. I assume this probably has to do with the sessions, but my question is: is there a way to process the file before it is included? I was hoping the other sites could read the file's output rather than generate it within the external site. Let me know if any of you know of a way to do what I am trying. If I am confusing you, just let me know what needs clarification. Thanks.
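     A minimal sketch of the "read the file's output rather than generate it within the external site" idea described above, assuming allow_url_fopen is enabled on the calling servers. Note that without forwarding the visitor's portal session cookie it will render the logged-out header, which is the gap the cURL/cookie-jar attempt further up this page is trying to close:

         // Fetch the already-rendered header HTML from the portal and print it as-is
         echo file_get_contents('https://portal.ourdomain.com/commonheader.php');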
  20. Also, as for the encryption, everything needs to be encrypted. If someone did get access, you don't want them to be able to directly see all the bank information. It is another form of security.
  21. Is there a reason you are doing this? What I recommend is finding a merchant provider that gives you confirmation codes that you can use to bill an account. That way all the information is stored with the merchant provider. Then in the client's site you could store things like the last 4 digits of the credit card, or other details that can't be used if found. If you really have to store the information, you need to make sure you have extremely secure machines with very limited access for anyone. This even means hosting companies and their employees. You pretty much need to have an on-premises machine that is highly secured.
  22. Hey All, I have been struggling lately with execution times. I believe my host changed server settings, which they deny, but I keep getting 500 errors, which my host says is the maximum execution time. Anyways, I have traced the script down and it seems that the functions taking the longest are the ones copying FTP files. I have two functions: "copyDir" copies everything over to the server, and "copyDir2" copies just the filenames with no data attached. Can someone please take a look and let me know if there is a faster way of doing these?

         // Recursively download a remote FTP directory tree into a local path
         function copyDir($conn_id, $path, $dst_dir) {
             $ar_files = @ftp_nlist($conn_id, $dst_dir);
             if (is_array($ar_files)) {
                 for ($i = 0; $i < sizeof($ar_files); $i++) {
                     $st_file = basename($ar_files[$i]);
                     if ($st_file == '.' || $st_file == '..') continue;
                     // ftp_size returns -1 for directories
                     if (ftp_size($conn_id, $dst_dir.'/'.$st_file) == -1) {
                         if (!is_dir($path.'/'.$st_file)) {
                             @mkdir($path.'/'.$st_file);
                         }
                         if ($st_file != "layout" AND ($st_file != "templates" OR $_SESSION['ftp'] != "anothertakestudios.com")) {
                             copyDir($conn_id, $path.'/'.$st_file, $dst_dir.'/'.$st_file);
                         }
                     } else {
                         @ftp_get($conn_id, $path.'/'.$st_file, $dst_dir.'/'.$st_file, FTP_BINARY);
                     }
                 }
             }
         }

         // Recursively recreate the remote directory tree locally as empty files
         function copyDir2($conn_id, $path, $dst_dir) {
             $ar_files = @ftp_nlist($conn_id, $dst_dir);
             if (is_array($ar_files)) {
                 for ($i = 0; $i < sizeof($ar_files); $i++) {
                     $st_file = basename($ar_files[$i]);
                     if ($st_file == '.' || $st_file == '..') continue;
                     if (ftp_size($conn_id, $dst_dir.'/'.$st_file) == -1) {
                         if (!is_dir($path.'/'.$st_file)) {
                             @mkdir($path.'/'.$st_file);
                         }
                         copyDir2($conn_id, $path.'/'.$st_file, $dst_dir.'/'.$st_file);
                     } else {
                         $handle = @fopen($path.'/'.$st_file, "x");
                         fclose($handle);
                     }
                 }
             }
         }

     Thanks!
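     One likely reason these are slow is that ftp_size() is a separate round-trip to the FTP server for every single file. A sketch of listing each directory once with ftp_rawlist() and reading the entry type from the listing instead; the parsing assumes a Unix-style listing format, which not every FTP server returns:

         // One LIST round-trip per directory instead of one SIZE round-trip per file
         $lines = ftp_rawlist($conn_id, $dst_dir);
         foreach ((array)$lines as $line) {
             $is_dir = ($line[0] === 'd');              // Unix-style listings start with 'd' for directories
             $parts  = preg_split('/\s+/', $line, 9);   // the file name is the 9th field
             $st_file = $parts[8];
             if ($st_file == '.' || $st_file == '..') continue;
             // ... recurse when $is_dir, otherwise ftp_get() (or touch a local file) as before
         }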
  23. I would normally do something like this; however, it is not as simple as key = value. There are very complex statements that will be added that can't be categorized in a standard database schema.