jwwceo's Achievements

  1. Your session variables are not stored in the browser's cookies; only a session ID is, which your server uses to look up the related session variables.
  2. Hello, I have a secure login script that uses sessions to maintain a user's logged-in status. I regenerate the session ID on each page reload for added security. Sometimes I will lose sessions, and I am having a hard time figuring out why. It seems to happen when I have a slow internet connection. I am wondering what could be causing this? I have a few ideas; do any of these seem plausible?
     - A max-execution timeout returning a fatal error.
     - An AJAX request gets sent before the previous request returns, so the session IDs don't match and I am kicked out.
     Any other ideas would be helpful. Best, James
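One plausible mitigation for the overlapping-AJAX case described above is to throttle regeneration rather than doing it on every request: regenerate the session ID at most once per interval, so near-simultaneous requests still carry a valid ID. A minimal sketch, assuming a `last_regen` timestamp kept in the session (the names and the interval are illustrative, not from the original post):

```php
<?php
// Sketch: throttle session-ID regeneration so overlapping AJAX requests
// made within a short grace window still carry a valid ID.
// Assumes a 'last_regen' timestamp is kept in $_SESSION.

function shouldRegenerateId(array $session, int $now, int $interval = 300): bool
{
    // Regenerate only if we never have, or the interval has elapsed.
    return !isset($session['last_regen'])
        || ($now - $session['last_regen']) >= $interval;
}

// Inside the request handler (requires an active session):
// if (shouldRegenerateId($_SESSION, time())) {
//     session_regenerate_id(true);   // true = discard the old session
//     $_SESSION['last_regen'] = time();
// }
```

When you do regenerate, `session_regenerate_id(false)` leaves the old session data readable under the old ID for a while, which also softens the race for requests already in flight.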
  3. I am building an app which requires a very secure environment. Because of this, I am using a form key on every POST, both for regular forms and for AJAX calls. This key is stored as a session variable and is reset whenever a POST occurs. A validation script checks the submitted key against the server's key and kicks the user out if they don't match. This ensures POSTs are coming from a trusted source. It works great, and I think it has made the site very hard to attack. That said, it has broken the browser's back button: if I hit back, the old form key is posted, the keys don't match, and the user is kicked out. I don't want a broken back button, but I'm not sure what to do. Any ideas? James
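One way to keep the protection without breaking the back button is to stop rotating the key on every POST and use a single per-session token instead, validated with a constant-time comparison; a resubmitted (back-button) form then still carries a valid key. A minimal sketch, with illustrative names (this is one option, not the thread's accepted answer):

```php
<?php
// Sketch: a per-session CSRF form key instead of a per-POST one.
// The key stays stable for the whole session, so a re-submitted
// (back-button) form still validates.

function getFormKey(array &$session): string
{
    if (!isset($session['form_key'])) {
        // 32 random bytes -> 64 hex characters
        $session['form_key'] = bin2hex(random_bytes(32));
    }
    return $session['form_key'];
}

function isValidFormKey(array $session, string $submitted): bool
{
    // hash_equals() avoids leaking information through timing.
    return isset($session['form_key'])
        && hash_equals($session['form_key'], $submitted);
}
```

In the real handlers you would pass `$_SESSION` for `$session` and embed `getFormKey($_SESSION)` in a hidden input on each form.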
  4. @kicken.. we've gotten rid of GET by using AJAX to query data without a page refresh. The UI is slick and doesn't appear to reload much. Because of this interface, users won't see GET data in the URL, so anything typed up there will be treated as illegal. Aside from a small memory overhead, are there any other issues with using POST all the time? James
  5. Well, not everyone gets kicked out. Just the person whose POST names don't match.
  6. I am building an enterprise-level, cloud-based PHP application that is by far the most secure thing I have ever had to write. In fact, after learning about all the common weaknesses, my older work looks insecure and I'm a little embarrassed by it. ha! I am using 100% prepared MySQLi statements, rigorous session verification, salted SHA-512 password hashes, and a form key on all form submissions (good for one page submit). I am also scrubbing all POST values prior to adding them to the database, and am even logging people out if GET is detected at all, since it's not used on the site and the only reason for it would be shenanigans. I am also debating whether to compare all POST field names against an approved array called $trustedPostNames or something. I got the idea from X-cart, an old shopping cart which added this layer of security after getting a poor audit a few years ago. This would prevent someone from using POST to try to send something unexpected over to my page in an XSS attempt. I realize the prepared statements will prevent an injection attack, but unexpected input could still let scripts run, which could then be used to find other flaws, or hit some odd way Apache handles a reserved word. Under my proposed system, if a strange field name is POSTed over, the script will just log the user out and die immediately. Something like this:
     foreach ($_POST as $k => $v) {
         // compare the field name ($k) against the whitelist
         if (!in_array($k, $trustedPostNames)) {
             die('invalid request');
         }
     }
     Overall, where is the line drawn between hyper-redundancy and losing speed/resources to a clunkier interface? Best, James
  7. Hello all! I am developing an app for a customer that is quite a bit more complex than anything I've made before. Basically, I am creating a cloud-based software app that businesses will pay a monthly fee to access. Each user will have their own set of data, and the data set could get large as time goes by. My question is: do most applications like this create a new database for each user, or do users share tables within the same database? I can see advantages and disadvantages of each. I want the application to be fast and scalable, so as users grow we can seamlessly add resources to keep the service snappy. Best, James
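For reference, the shared-tables option described above usually means stamping every row with the tenant's account ID and scoping every query to it. A minimal sketch of that schema, with hypothetical table and column names:

```sql
-- Sketch of the shared-schema (multi-tenant) option: every row carries
-- an account_id, and every query filters on it. Names are illustrative.
CREATE TABLE accounts (
    account_id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    name       VARCHAR(255) NOT NULL
);

CREATE TABLE records (
    record_id  INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    account_id INT UNSIGNED NOT NULL,
    created_at DATETIME NOT NULL,
    INDEX idx_account (account_id),
    FOREIGN KEY (account_id) REFERENCES accounts (account_id)
);

-- Every query is scoped to one tenant:
-- SELECT ... FROM records WHERE account_id = ?;
```

The per-tenant-database option trades this query discipline for more operational overhead (migrations and backups multiplied by the number of tenants).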
  8. UPDATE: there are about 18 million codes, all 4 characters each, if that helps convey the extent of the size issue.
  9. I am developing a database app for a client who needs to import hundreds of thousands of codes into the DB to check against. The codes are in 4 text files of about 30MB each, three codes per line. I've written a script to parse out the line breaks, turn the data into an array, then loop over that array and insert into the DB. The problem is these scripts take minutes to run using file_get_contents, and by the time the data is ready the MySQL connection is gone. Even then, the files only work after I've cut each one into roughly 1MB chunks, so each file becomes about 30 smaller ones. Is there a way to just put the text file on the server and have PHP search it with a grep-like function that won't be such a burden to work with? Any advice helps. James
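MySQL's LOAD DATA INFILE is usually the fastest way to bulk-load a file like this. If the import has to stay in PHP, streaming the file with fgets() instead of file_get_contents() and batching the inserts avoids both the memory blow-up and the idle connection. A minimal sketch; the line-splitting helper is pure, and `insertBatch` is a hypothetical name for whatever does the batched INSERT:

```php
<?php
// Sketch: stream the file line by line and insert in batches, rather
// than loading 30MB into memory before touching the database.

function codesFromLine(string $line): array
{
    // Each line holds whitespace-separated codes; blank lines yield [].
    return preg_split('/\s+/', trim($line), -1, PREG_SPLIT_NO_EMPTY);
}

// Hypothetical import loop (requires a DB connection $pdo and an
// insertBatch() helper that runs one multi-row prepared INSERT):
// $fh = fopen('codes.txt', 'r');
// $batch = [];
// while (($line = fgets($fh)) !== false) {
//     foreach (codesFromLine($line) as $code) {
//         $batch[] = $code;
//         if (count($batch) >= 1000) { insertBatch($pdo, $batch); $batch = []; }
//     }
// }
// if ($batch) { insertBatch($pdo, $batch); }
// fclose($fh);
```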
  10. That works. I was just off by one parenthesis. Thanks a million! James
  11. Hmm... getting closer, but the query is still not adding up the number of leads per month. This is what I am getting now:
     total jan feb mar apr may jun first_name last_name investors_id
     1 0 0 0 0 0 0 xxx xxx 5
     3 0 0 0 0 0 0 0 Billy Bob 112
  12. Hello, I am trying to write a report script that will take all my users and tell me how many leads each has generated per month in a given year, as well as the total per year. The ideal outcome would be a table with users as the rows and 13 columns across the top: one for each month and one for the total. The leads table has an investor ID field which matches the ID in the investors table, so the query needs to count the occurrences of that ID when the date falls in a given month. The dates are Unix timestamps. Here is what I have so far:
     SELECT COUNT( leads.investor ) AS total,
         MONTHNAME( FROM_UNIXTIME( leads.date ) = 'January' ) AS jan,
         MONTHNAME( FROM_UNIXTIME( leads.date ) = 'February' ) AS feb,
         MONTHNAME( FROM_UNIXTIME( leads.date ) = 'March' ) AS mar,
         MONTHNAME( FROM_UNIXTIME( leads.date ) = 'April' ) AS apr,
         MONTHNAME( FROM_UNIXTIME( leads.date ) = 'May' ) AS may,
         MONTHNAME( FROM_UNIXTIME( leads.date ) = 'June' ) AS jun,
         investors.first_name, investors.last_name, investors.investors_id
     FROM leads, investors
     WHERE YEAR( FROM_UNIXTIME( leads.date ) ) = '2010'
         AND leads.investor = investors.investors_id
     GROUP BY investors.investors_id
     LIMIT 0, 30
     I've only added 6 months while testing. The total part is working fine for the whole year, but I am having a hard time getting the query to tally the leads by month. What am I missing here? Thanks in advance! James
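For what it's worth, `MONTHNAME( FROM_UNIXTIME( leads.date ) = 'January' )` evaluates the comparison first and then takes MONTHNAME() of the 0/1 result, which is why the month columns never tally. Since MySQL treats a boolean as 0/1, a common fix is to SUM a per-row condition for each month; a sketch using the same table and column names as the query above:

```sql
-- Sketch: SUM(condition) counts the rows where the condition is true,
-- giving a per-month tally in each column.
SELECT
    COUNT(leads.investor)                     AS total,
    SUM(MONTH(FROM_UNIXTIME(leads.date)) = 1) AS jan,
    SUM(MONTH(FROM_UNIXTIME(leads.date)) = 2) AS feb,
    SUM(MONTH(FROM_UNIXTIME(leads.date)) = 3) AS mar,
    SUM(MONTH(FROM_UNIXTIME(leads.date)) = 4) AS apr,
    SUM(MONTH(FROM_UNIXTIME(leads.date)) = 5) AS may,
    SUM(MONTH(FROM_UNIXTIME(leads.date)) = 6) AS jun,
    investors.first_name, investors.last_name, investors.investors_id
FROM leads
JOIN investors ON leads.investor = investors.investors_id
WHERE YEAR(FROM_UNIXTIME(leads.date)) = 2010
GROUP BY investors.investors_id
LIMIT 0, 30;
```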
  13. So how would I use MySQL to do this work? Would I query the DB in every iteration of the loop? I'm not sure what I would even be querying. Here's the basic structure I have in place now:
     $start_date = strtotime($row['int_accrual_date']);
     $end_date = strtotime($row['maturity_date']);
     while ($start_date <= $end_date) {
         #### DO SOME STUFF ####
         $start_date = strtotime('+1 month', $start_date);
     }
  14. Hello, I am making a website for a bond company which sells 30-year bonds. This puts the call dates beyond 2038, and none of my strtotime functions are working. Any ideas how to work around this? I think I am just storing the time as a varchar string in the DB. Could I just manually compute what the strtotime value would be, using something like 30 × 31,556,926 seconds? That doesn't account for leap years, etc., I don't think. Any ideas? James
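The 2038 limit comes from 32-bit Unix timestamps; PHP's DateTime classes do their own calendar math and are not bound by it (and on 64-bit builds, strtotime() itself works past 2038). A minimal sketch, assuming the dates are stored as 'Y-m-d' strings; the sample dates are illustrative:

```php
<?php
// Sketch: DateTimeImmutable is not limited to the 32-bit timestamp
// range, so 30-year maturities past 2038 are handled correctly,
// including leap years.

$issued   = new DateTimeImmutable('2015-06-01');
$maturity = $issued->add(new DateInterval('P30Y')); // 30-year bond

echo $maturity->format('Y-m-d'); // 2045-06-01
```

Storing the dates as DATE columns in MySQL (rather than varchar timestamps) would also let the database do this arithmetic directly.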
  15. Hello, Google is indexing a nonsense page on my site with a blank GET value, like this: home.php?category= where category is NULL. It just shows my home page without doing anything. I want to redirect this to my home page, to give the home page the most SEO power, so I wrote the following script:
     if ($category == '0' || $category == NULL) {
         unset($category);
         header("HTTP/1.1 301 Moved Permanently");
         header("Location: home.php");
     }
     But this just sets my home page into an infinite loop, where the page trying to be displayed is home.php?category= Any ideas why that GET value is being preserved? James
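The loop happens because home.php with no query string also satisfies `$category == NULL`, so the page keeps redirecting to itself, and without an `exit` the rest of the script still runs after the header() calls. A minimal sketch of a guard that fires only when an empty `category=` actually arrived in the request (the function name is illustrative):

```php
<?php
// Sketch: redirect only when the request really came in with an empty
// or zero ?category= parameter; plain home.php is left alone.

function needsCanonicalRedirect(array $get): bool
{
    // True only for home.php?category= or home.php?category=0
    return array_key_exists('category', $get)
        && ($get['category'] === '' || $get['category'] === '0');
}

// if (needsCanonicalRedirect($_GET)) {
//     header('HTTP/1.1 301 Moved Permanently');
//     header('Location: home.php');
//     exit; // without this, the rest of the page still executes
// }
```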