jwwceo Posted July 8, 2013

I am building an enterprise-level, cloud-based PHP application that is by far the most secure thing I have ever had to write. In fact, after learning about all the common weaknesses, my older work looks insecure and I'm a little embarrassed by it. ha!

I am using 100% prepared MySQLi statements, rigorous session verification, salted SHA-512 password hashes, and a form key on every form submission (good for one page submit). I am also scrubbing all POST values before they go anywhere near the database, and I even log people out if GET is detected at all, since GET isn't used on the site and the only reason for it would be shenanigans.

I am also debating whether to compare all POST field names against an approved array called $trustedPostNames or something. I got the idea from X-Cart, an old shopping cart that added this layer of security after a shitty audit a few years ago. This would prevent someone from attempting an XSS attack by POSTing unexpected fields over to my page. I realize prepared statements will prevent an injection attack, but scripts could still run, which could then be used to find other flaws, or some weird way Apache handles a reserved word or something. Under my proposed system, if a strange field name is POSTed over, the script just logs them out and dies immediately. Something like this:

// compare each POSTed field *name* against the approved list
foreach ($_POST as $k => $v) {
    if (!in_array($k, $trustedPostNames, true)) {
        die('eat shit fucker');
    }
}

Overall, where is the line drawn between hyper-redundancy and losing speed/resources to a clunkier interface?

Best, James
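P.S. For anyone wondering what I mean by the form key, this is roughly the shape of it: a minimal sketch with made-up function names, and a plain === compare where a constant-time compare would be better.

<?php
// A minimal sketch of the one-page form key described above.
// Function names are made up, not from any framework.
session_start();

// When rendering the form: mint a random key, remember it in the
// session, and drop it into a hidden input.
function issue_form_key() {
    $key = bin2hex(openssl_random_pseudo_bytes(16));
    $_SESSION['form_key'] = $key;
    return $key;
}

// When handling the submit: the key must match the one we issued,
// and it is discarded either way, so it is good for one submit only.
function check_form_key($submitted) {
    $valid = isset($_SESSION['form_key']) && $_SESSION['form_key'] === $submitted;
    unset($_SESSION['form_key']);
    return $valid;
}

// In the form template, echo issue_form_key() into a hidden input named
// form_key, and call check_form_key($_POST['form_key']) on the submit.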
captbeagle Posted July 8, 2013

In an enterprise app, I assume there would be a number of users logged in at any given time. Security is great, but users will get all sorts of ticked off if they have to log in 5 times over a few minutes because some search bot is hitting your domain. I'd check whether the request actually has any parameters in $_GET before logging people out over something benign.
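Something as simple as this would cover it (a rough sketch, assuming your GET check runs on every request):

// Only treat the request as suspicious when the query string actually
// carries parameters; a plain crawler hit on the URL leaves $_GET empty.
if (!empty($_GET)) {
    // log it here, then decide whether this session really needs to die
}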
kicken Posted July 8, 2013

I'd probably classify both your _GET ban and the trusted input list as a bit on the overkill side. If things are coded properly, extra GET or POST fields won't have any negative effect on your code, other than using a bit more memory (which your validations won't prevent anyway). Another downside to such code is that whenever you need to expand things, you'll have to update the trusted fields list, or start allowing GET if you decide you need it.

Your claim of not using GET variables at all also makes me think you are probably using POST where you should be using GET. Things like search forms or listings should use GET, not POST. Essentially, any request that merely queries for data, but does not alter it, should be done with a GET request.
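To make that concrete, a search page done the GET way looks something like this. Just a sketch: the products table and the $mysqli connection are invented for the example.

<?php
// search.php?q=widgets : the query lives in the URL, so the results
// page can be bookmarked, shared, and safely retried, because nothing
// on the server changes.
$term = isset($_GET['q']) ? trim($_GET['q']) : '';
if ($term !== '') {
    $stmt = $mysqli->prepare('SELECT id, name FROM products WHERE name LIKE ?');
    $like = '%' . $term . '%';
    $stmt->bind_param('s', $like);
    $stmt->execute();
    // ... fetch and display the results ...
}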
jwwceo Posted July 8, 2013

Well... not everyone gets kicked out. Just the person whose POST field names don't match.
jwwceo Posted July 8, 2013

@kicken: we've gotten rid of GET by using Ajax to query data without a page refresh. The UI is slick and doesn't appear to reload much. Because of this interface, users won't see GET data in the URL, so anything typed up there is illegal. Aside from a small memory hit, are there any other issues with using POST all the time?

James