guyfromfl

Members
  • Posts

    286
  • Joined

  • Last visited

    Never

About guyfromfl

  • Birthday 12/28/1980

Contact Methods

  • AIM
    gtrplr666

Profile Information

  • Gender
    Male
  • Location
    Daytona Beach, FL

guyfromfl's Achievements

Regular Member (3/5)

Reputation: 0

  1. Cool. Try downloading MySQL Workbench, it will help you out a lot. It's a GUI tool from MySQL. Let us know if you have any questions.
  2. No, if you did you would disrupt the data integrity; there would be no way to uniquely identify them. You should create a database called GuestList, then put those three tables in it.
  3. Just a simple schema to get you started:

     Table GiftList
       ID_GiftList (Primary Key, Auto Increment, Not Null...)
       Owner (integer that holds ID_Users from the Users table)
       ExpireDate

     Table GiftListItems
       ID_GiftListItem (Primary Key, Auto Increment, Not Null...)
       GiftList (stores the ID_GiftList integer; identifies which GiftList the item belongs to)
       Description

     Table Users
       ID_Users
       Name
       Username
       Password
       UserLevel
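     If it helps, here is that schema as rough CREATE statements, run through the same mysql_* calls used elsewhere in this thread (column types and sizes are just assumptions; it also assumes a connection is already open, e.g. via config.php):

     <?php
     // Put the three tables in one database, as suggested above
     mysql_query("CREATE DATABASE IF NOT EXISTS GuestList");
     mysql_select_db("GuestList");

     mysql_query("CREATE TABLE Users (
         ID_Users   INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
         Name       VARCHAR(100),
         Username   VARCHAR(50),
         Password   VARCHAR(255),
         UserLevel  INT
     )");

     mysql_query("CREATE TABLE GiftList (
         ID_GiftList INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
         Owner       INT,   -- holds ID_Users from the Users table
         ExpireDate  DATE
     )");

     mysql_query("CREATE TABLE GiftListItems (
         ID_GiftListItem INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
         GiftList        INT,   -- holds ID_GiftList; identifies which GiftList the item belongs to
         Description     TEXT
     )");
     ?>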
  4. There could potentially be a fair few of these lists needed, meaning that having multiple databases could get confusing. You would store all the data in one table with a ListID index, then SELECT * ... WHERE ListID='$desiredList'.
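     A minimal sketch of that single-table approach, in the same old mysql_* style used elsewhere in this thread (the table name, the list parameter, and the escaping call are assumptions for illustration):

     <?php
     // All lists live in one table; ListID identifies which list a row belongs to
     $desiredList = mysql_real_escape_string($_GET['list']);

     $result = mysql_query("SELECT * FROM GiftListItems WHERE ListID = '$desiredList'");
     while ($row = mysql_fetch_array($result)) {
         echo $row['Description'] . "<br />";
     }
     ?>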
  5. It looks like you're reading the whole file into an array, then changing everything to Reserved. What is uniquely identifying which row gets changed to Reserved?
  6. For the $ctype in the switch, try MIME, not application/pdf. I have a similar script, and that is what I'm using.
  7. I know this is right in front of me, but I need to add an element to each row of an array. I have this array:

     Array
     (
         [0] => Array ( [Elements] => values )
         [1] => Array ( [Elements] => values )
     )

     Now I want to add an element to the end of each one that holds the file name of the originating data. The part of the code where this takes place is a class method that looks for duplicates already in the database. If it is a duplicate, we add the $fileData iteration to the $duplicates array, which gets returned to the calling function. It basically looks like this:

     Calling code:

     <?php
     while ($data = fgetcsv($handle)) {
         $leadDataLine = array_combine($headers, $data);

         // Some data formatting on $leadDataLine, not important for this question...

         // Add the line to the stack
         $leadData[] = $leadDataLine;
         //array_push($leadData, $leadDataLine);
         unset($leadDataLine);
     }

     $dup[] = $lead->process($leadData);
     ?>

     The lead class:

     <?php
     public function process(&$fileData)
     {
         $duplicates = array();

         // Process the information
         foreach ($fileData as $row) {
             // If not a duplicate, add to the database
             if (!$this->isDuplicate($row)) {
                 // Add the lead to the database.
                 $this->add($row);
             } else {
                 // Is a duplicate, add to $dup
                 $duplicates[] = array("Elements" => $row['Values']);

                 /*
                  * Here is where I want to add the file name to the end of $duplicates.
                  * This has to be here because this class handles different sources of data;
                  * not all data will have a FileName key.
                  */
                 if (array_key_exists("FileName", $row))
                     $duplicates["FileName"] = $row["FileName"];
                     // array_push($duplicates, $row["FileName"]);
             }
         }

         print_r($duplicates);
         return $duplicates;
     }
     ?>

     What happens with the code I have, or using array_push:

     Array
     (
         [0] => Array ( [Elements] => values )
         [FileName] => correct_file.csv
         [1] => Array ( [Elements] => Values )
     )

     Notice it is not on element 1. What am I doing wrong here?
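     For clarity, the shape I'm after would come from building each duplicate row first and then appending it, something like this (an untested sketch, not what the code above does):

     <?php
     $entry = array("Elements" => $row['Values']);

     // Put the file name inside the row itself when the source provides one
     if (array_key_exists("FileName", $row)) {
         $entry["FileName"] = $row["FileName"];
     }

     $duplicates[] = $entry;
     ?>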
  8. No problem... What TeNDoLLA is suggesting should be fine. There are some tutorials about sanitizing the data so nobody can do anything malicious... It's a never-ending battle, and everybody has their own way. That should be a good start for you.
  9. That is a dirty example, but I would recommend not allowing GET data to go straight into a SQL statement. You will need to "sanitize" the data. What I put there was just a quick way to illustrate what your OP was about.
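     As a quick illustration of sanitizing that GET value before it reaches the query (same table and id parameter as my earlier example, and assuming num_funcionario is numeric):

     <?php
     // Cast the incoming id to an integer so it can't carry SQL along with it
     $id = (int) $_GET['id'];

     $result = mysql_query("SELECT * FROM `infopessoal` WHERE `num_funcionario` = $id LIMIT 1");
     $employee = mysql_fetch_array($result);
     ?>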
  10. You just need to link to a file that will handle the request, and you identify it with GET data. I added how to link to that file:

      <?php
      require('config.php');

      $sql = mysql_query("SELECT * FROM `infopessoal` ORDER BY `nome` ASC LIMIT 0, 30");

      while ($row = mysql_fetch_array($sql)) {
          echo "<tr>";
          echo "<td>" . $row['nome'] . "</td>";
          echo "<td>" . $row['apelido'] . "</td>";
          echo "<td>" . $row['num_funcionario'] . "</td>";
          echo "<td><a href='editEmployee.php?id={$row['num_funcionario']}'><img src='images/insert.gif'></a></td>";
          echo "<td><img src='images/lupa.gif'></td>";
          echo "<td><input type='image' src='images/perfil.gif'></td>";
          echo "</tr>";
      }
      ?>

      Then in editEmployee.php:

      <?php
      $result = mysql_query("SELECT * FROM `infopessoal` WHERE `num_funcionario` = {$_GET['id']} LIMIT 1");
      $employee = mysql_fetch_array($result);

      // Build the table to display the data
      echo "<input type='text' value='{$employee['nome']}' />";
      // etc...
      ?>
  11. I'm suffering from Monday-impaired thinking... need a little help. Basically, I am taking an array of CSV files and going through each of them. The problem is that the code I have written is designed around different header names... For example, the old way has First Name and Last Name fields. The new one puts them together into one field called Full Name. In order to use my old code, I just need to take Full Name and explode the values at the first space " ". For some reason this doesn't work and I can't figure it out. It will push First Name onto the end of the first element, with no value... All the echos are there to help debug.

      foreach ($fileList as $f) {
          if (($handle = fopen(UPLOAD_PATH.$f, "r")) != FALSE) {
              $headers = fgetcsv($handle);
              echo $handle;
              echo count($headers);

              // Read the rest of the file
              while ($data = fgetcsv($handle)) {
                  echo strrpos($data['FULL NAME'], " ") . "<br />";
                  $leadData[] = array_combine($headers, $data);
                  $leadData['First Name'] = substr($leadData['FULL NAME'], 0, strrpos($leadData['FULL NAME'], " "));
              }

              echo $leadData['First Name'] . "<br />";
              print_r($leadData);
          }
      }

      Output:

      Any help would be much appreciated. Thanks