Alex_ Posted November 13, 2014

Hey. Got a slight issue that originally occurred in the NodeJS version of the application, but it also exists in the PHP version. (Yes, I have two versions, I know it's stupid.)

The issue at hand is that I've got a script that calls an API for data around 5 times each time the script is executed. So imagine http://Mywebsite/mypath/generate:

    public function doGenerate()
    {
        $restRequestHelper = new RestRequestHelper(...);

        $data1 = $restRequestHelper->fetch(...);
        ...
        $data5 = $restRequestHelper->fetch(...);
    }

As you can imagine, this hits the API's CPU pretty hard, since there are a lot of concurrent requests involving database operations. I know I'll inevitably have to optimize the API as well, but first I want to solve the client side of things, namely the mass-refresh issue.

If a client has opened /generate/ and starts refreshing the page over and over again (or just keeps sending GET requests), the script runs once for every refresh, and eventually the API will crash. You can think of it as a sort of DoS attack.

In the PHP version it's not nearly as harsh, but still harsh: it keeps going X times and eventually shows the response of the last request in the loop. The NodeJS version is way worse. It's much faster, which results in more concurrent requests hitting the API, making it crash sooner. Same cause, the spam.

Anyone got a good way of blocking this kind of client behavior? PHP or NodeJS solutions will both help. Think of the problem like this:

    $requestCount = 0;
    while ($requestCount < 20) {
        $data1 = $restRequestHelper->fetch(...);
        $data2 = $restRequestHelper->fetch(...);
        $data3 = $restRequestHelper->fetch(...);
        $data4 = $restRequestHelper->fetch(...);
        $data5 = $restRequestHelper->fetch(...);
        $requestCount++;
    }
QuickOldCar Posted November 13, 2014

Maybe you can set a session flag per client before you call the ajax script. If the flag exists, don't call the script again, and remove it when the ajax script completes.

http://php.net/manual/en/book.session.php
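A minimal sketch of that idea in PHP. doGenerate() here is a stand-in for the expensive work, not code from the thread, and note that finally needs PHP 5.5+:

    session_start();

    // Bail out early if this client already has a generate request in flight.
    if (!empty($_SESSION['isGenerating'])) {
        http_response_code(429);
        exit('A request is already in progress, please wait.');
    }

    $_SESSION['isGenerating'] = true;
    session_write_close(); // release the session file lock so parallel requests can see the flag

    try {
        $result = doGenerate(); // stand-in for the five fetch() calls
    } finally {
        // Clear the flag whether the generation succeeded or threw.
        session_start();
        unset($_SESSION['isGenerating']);
        session_write_close();
    }

    echo $result;

One wrinkle: with PHP's default file-based sessions, session_start() itself serializes requests from the same client until the lock is released, which is why the flag is written and the session closed before the slow work begins.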
Alex_ Posted November 13, 2014

Yeah, that's what I've tried, but it doesn't seem to work. I set a session variable isRequesting = true and check it on each request, but since the last request in the "spam loop" is the one the browser actually renders, the page ends up blank. The data was fetched by the first request in the "spam loop", where isRequesting = true was set, but it's gone by the end of the loop, since every new request is a fresh instance of the script.
QuickOldCar Posted November 13, 2014

Can you cache the responses and set expiry times? At least that would take the load off the CPU.
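A rough sketch of file-based response caching with an expiry time; cachedFetch() and fetchFromApi() are made-up names for illustration, not anything from this thread:

    // Serve cached JSON while it is fresher than $ttl seconds, otherwise refetch.
    function cachedFetch($url, $ttl = 60)
    {
        $cacheFile = sys_get_temp_dir() . '/api_' . md5($url) . '.json';

        if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
            return json_decode(file_get_contents($cacheFile), true);
        }

        $data = fetchFromApi($url); // placeholder for the real REST call
        file_put_contents($cacheFile, json_encode($data), LOCK_EX);

        return $data;
    }

With something like that in front of the five fetch() calls, twenty refreshes would cost one real API round trip per URL per cache window instead of a hundred.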
Alex_ Posted November 13, 2014

That could work. Not entirely ideal, but it could work. I'll give it a go.
QuickOldCar Posted November 13, 2014

Is this some sort of attempt to limit requests, like a queue?

    $requestCount = 0;
    while ($requestCount < 20) {
        $data1 = $restRequestHelper->fetch(...);
        $data2 = $restRequestHelper->fetch(...);
        $data3 = $restRequestHelper->fetch(...);
        $data4 = $restRequestHelper->fetch(...);
        $data5 = $restRequestHelper->fetch(...);
        $requestCount++;
    }
Alex_ Posted November 13, 2014

No, that was just my example of what is technically happening when a user is spamming the Refresh button. In essence it's just like looping over the function X times.
QuickOldCar Posted November 13, 2014

Another idea I had was using the client's remote IP in a session; at least that could limit them to a single request.

    $remote_ip = $_SERVER['REMOTE_ADDR'];
    if (strstr($remote_ip, ', ')) {
        $ips = explode(', ', $remote_ip);
        $remote_ip = $ips[0];
    }
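Building on that, a crude per-IP throttle could look like the following; the interval and file path are arbitrary choices for illustration, not anything from this thread:

    $remote_ip = $_SERVER['REMOTE_ADDR'];
    $stampFile = sys_get_temp_dir() . '/throttle_' . md5($remote_ip);
    $minInterval = 10; // seconds a client must wait between /generate requests

    // Reject the request if this IP hit us less than $minInterval seconds ago.
    if (is_file($stampFile) && (time() - filemtime($stampFile)) < $minInterval) {
        http_response_code(429);
        exit('Too many requests, slow down.');
    }

    touch($stampFile); // record this hit before doing the expensive work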
Alex_ Posted November 13, 2014

Yeah, IP plus unique session ID is what I was trying, to prevent further requests until the first one has finished. But every way of limiting it ended up not working, because then I'd be blocking the "most recent" request, which is the one the browser renders, so the user gets no response at all.
QuickOldCar Posted November 13, 2014

Understood.

You don't really want to cache the API URL itself, but the data within it. You don't want to give people something for nothing, so still log each request.

I've been making a universal "front door", so to speak, for all my APIs. It does the key check and the remote IP check, and goes through the client's allowed domains. Then, if the request is allowed, it calls on the particular API with its URL parameters to process the request.

The cached results could live in a folder outside www and be included, rather than being publicly viewable. You could even save the data as JSON or XML and parse it on view, versus something like an HTML cache.
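A skeleton of that front-door idea might look like this. Every function and field name here is hypothetical, sketched purely from the description above, and the allowed-domain check via the Referer header is my assumption:

    // Front controller that every API request passes through first.
    $key      = isset($_GET['key']) ? $_GET['key'] : '';
    $endpoint = isset($_GET['endpoint']) ? $_GET['endpoint'] : '';
    $ip       = $_SERVER['REMOTE_ADDR'];
    $referer  = isset($_SERVER['HTTP_REFERER'])
        ? parse_url($_SERVER['HTTP_REFERER'], PHP_URL_HOST)
        : '';

    $client = lookupClientByKey($key);   // hypothetical: fetch client record by API key
    logRequest($key, $ip, $endpoint);    // hypothetical: log every request, allowed or not

    if ($client === null
        || !in_array($ip, $client['allowed_ips'])
        || !in_array($referer, $client['allowed_domains'])) {
        http_response_code(403);
        exit('Request not allowed.');
    }

    // Only now hand the request to the real API handler.
    echo dispatchApi($client, $endpoint, $_GET); // hypothetical dispatcher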