
T-Bird

Members
  • Posts

    53
  • Joined

  • Last visited

    Never

Profile Information

  • Gender
    Not Telling

T-Bird's Achievements

Member

Member (2/5)

0

Reputation

  1. An HTML form can send as many pieces of information as you want. For each field in the form you want to get data out of, add a 'name' attribute. The PHP script that receives the data can then access that information through the $_GET (or $_POST, or $_REQUEST) superglobal array. More information about forms: http://w3schools.com/html/html_forms.asp More information about $_GET: http://w3schools.com/php/php_get.asp I hope I understood your question correctly.
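     As an illustration, a minimal sketch (the field name 'username' and the file name 'process.php' are examples, not from the original question):

     ```php
     <?php
     // process.php -- a minimal sketch; assumes a form like:
     //   <form action="process.php" method="get">
     //     <input type="text" name="username">
     //     <input type="submit" value="Send">
     //   </form>
     // Every input with a name attribute shows up as a key in $_GET.
     $username = isset($_GET['username']) ? $_GET['username'] : '';
     echo 'Hello, ' . htmlspecialchars($username);
     ```

     With method="post" the same values would arrive in $_POST instead; $_REQUEST contains both.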
  2. I just realized that in such a scenario whenever a new member registers the php page would still need to use a statically defined user but this time the user would have privileges to alter my mysql user and grant tables. If anything this would be less secure. Guess I got in over my head on this one, sorry for wasting your time.
  3. Hello. Let me preface this by saying I'm not experienced in database architecture; I've only worked on a few small, simple, low-security (MySQL) databases for blogs/forums. The following is the result of some brainstorming I did. There is probably a better method, so I'm requesting feedback.

     I was recently trying to think of how I - in my limited experience - would go about making a database more secure than my typical method. Usually my database design consists of one user, 'Member', with read/write privileges on all tables for that database. I then have tables that hold all of a particular type of data for the community - e.g. a users table or a posts table. In PHP I hard-code into the pages (or into an included file) the username and password for 'Member', which I use to access the database. As an example, I would have a users table and a posts table that look something like:

     Users: ID, Username, PasswordHash, DoB, FirstName, LastName, JoinDate, etc.
     Posts: ID, UserID, Text, Date, etc.

     Of course I use PHP to limit the queries so that only information that belongs to a particular user is given to them. However, I got to thinking: if someone manages to find/crack the password for 'Member', then they can read/destroy the data for ALL other users, since they have access to the whole Users table. Now obviously there are methods of securing databases, but I haven't picked up any books on database architecture (if you know a good one, let me know), so this is what I got out of my personal brainstorming. Perhaps I'm spot on, perhaps I'm in left field (probably the latter). Is it acceptable to have a separate user and separate tables created for every new member of my website?

     For example, when a user registers as a member and picks the username 'JaneDoe' with the password 'Foo', I would create a user in my DBMS, 'JaneDoe', with the password 'Foo'. I would then create the tables 'JaneDoePersonalInfo', 'JaneDoeCreditCards', 'JaneDoeFriends', etc. Only user JaneDoe would have access to the tables prefixed by her name, and she could access only her tables. My PHP pages, rather than having a global 'Member' user coded into the page, would take her username/password input and use those to log into the database. My thinking was that now, in the worst-case scenario, an attacker can only access one client's information for every password they steal - rather than getting all user information with one password.

     But I know very little about databases and even less about making them efficient and stable. So my questions are:

     1) Does this method work, or is there some major oversight on my part?
     2) Do DBMSs (MySQL in particular) even allow you to have that many users/tables?
     3) Are there any security disadvantages this way?
     4) What are the performance (dis)advantages of this method in small, medium, or enterprise-sized applications - will it even operate at an acceptable rate?
     5) Is there a better (more accepted) way to do this?

     I wish I was a little more educated on the subject, but I figured the best way to learn is to ask. Please give me any feedback (or good book recommendations). Thank you.
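     For concreteness, the per-member scheme described in this post would mean the registration script running something like the following for each new member (all identifiers - 'JaneDoe', 'mydb', the column names - are hypothetical examples; this is only a sketch of the idea being asked about, not a recommendation):

     ```sql
     -- Executed once per new member by the registration script.
     CREATE USER 'JaneDoe'@'localhost' IDENTIFIED BY 'Foo';

     CREATE TABLE JaneDoePersonalInfo (
         ID        INT PRIMARY KEY,
         FirstName VARCHAR(50),
         LastName  VARCHAR(50)
     );

     -- Only JaneDoe may touch her own tables.
     GRANT SELECT, INSERT, UPDATE, DELETE
         ON mydb.JaneDoePersonalInfo TO 'JaneDoe'@'localhost';
     ```

     Note that the account running these statements must itself hold CREATE USER and GRANT privileges - the very risk flagged in the follow-up post above.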
  4. Awesome. Worked great. I feel a little dense though.
  5. I have files that need to be downloaded only if the user has the correct password, so I'm using the standard readfile()-based download method as seen in example 1 of php.net's readfile manual:

     ```php
     if (file_exists($file)) {
         header('Content-Description: File Transfer');
         header('Content-Type: application/octet-stream');
         header('Content-Disposition: attachment; filename=' . basename($file));
         header('Content-Transfer-Encoding: binary');
         header('Expires: 0');
         header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
         header('Pragma: public');
         header('Content-Length: ' . filesize($file));
         ob_clean();
         flush();
         readfile($file);
         exit;
     }
     ```

     The files I'm trying to transfer are on average 80MB zip files stored on the same box that runs the scripts. I have two problems which I think are related... but may not be.

     a) The user submits the password and it takes approximately a minute for the save dialogue box to appear. PHP logs a timeout error, but the file downloads fine.

     b) With files less than 60MB I can download several files at a time, no sweat. With files over 60MB I can only download one file at a time. Anyone who tries to download the file while a transfer is in progress gets an empty placeholder file.

     I tried switching to a chunked transfer method. I got a snippet out of the comments on the php.net site that seemed promising and tried it:

     ```php
     function readfile_chunked($filename, $retbytes = true) {
         $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
         $buffer = '';
         $cnt = 0;
         $handle = fopen($filename, 'rb');
         if ($handle === false) {
             return false;
         }
         while (!feof($handle)) {
             $buffer = fread($handle, $chunksize);
             echo $buffer;
             ob_flush();
             flush();
             if ($retbytes) {
                 $cnt += strlen($buffer);
             }
         }
         $status = fclose($handle);
         if ($retbytes && $status) {
             return $cnt; // return number of bytes delivered, like readfile() does
         }
         return $status;
     }
     ```

     This solved problem (a), as data immediately started to transfer to the browser. I didn't get to test whether it solved problem (b), because after approximately 25-30 seconds PHP again (understandably) logged the timeout error and the transfer died.

     Any clue why PHP is refusing to allow multiple downloads with code A? Is there a way I can adjust PHP's timeout for only ONE page, so that I can use code B without giving all my pages an ugly one-hour timeout? Any other suggestions?
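     A sketch of how both symptoms are commonly addressed (suggestions to try, not a confirmed diagnosis of this poster's setup): set_time_limit(0) lifts the execution-time limit for just the one download script, leaving php.ini's global max_execution_time alone, and session_write_close() releases PHP's per-session file lock, which is one common reason a second download from the same browser stalls until the first finishes. File names and paths below are hypothetical examples.

     ```php
     <?php
     // download.php -- hypothetical standalone sketch of a chunked download page.
     set_time_limit(0);     // lift the time limit for this one script only
     session_write_close(); // release the session file lock so a second request
                            // from the same browser isn't forced to wait

     function readfile_chunked($filename, $chunksize = 1048576) {
         $handle = fopen($filename, 'rb');
         if ($handle === false) {
             return false;
         }
         $sent = 0;
         while (!feof($handle)) {
             $buffer = fread($handle, $chunksize);
             echo $buffer;
             flush();               // push each chunk out to the client
             $sent += strlen($buffer);
         }
         fclose($handle);
         return $sent;              // bytes delivered, like readfile() reports
     }

     $file = '/path/to/archive.zip'; // example path; password check goes before this
     if (file_exists($file)) {
         header('Content-Type: application/octet-stream');
         header('Content-Disposition: attachment; filename=' . basename($file));
         header('Content-Length: ' . filesize($file));
         readfile_chunked($file);
         exit;
     }
     ```

     set_time_limit() has no effect when PHP runs in safe mode, and the chunked loop assumes no output buffering layer is swallowing the echoes.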