
Hey.

 

Got a slight issue. It originally occurred in the NodeJS version of the application, but it also exists in the PHP version. (Yes, I have two versions, I know it's stupid.)

The issue at hand is that I've got a script that calls an API for data around 5 times each time it's executed. So imagine http://Mywebsite/mypath/generate

public function doGenerate() {
    $restRequestHelper = new RestRequestHelper(...);
    $data1 = $restRequestHelper->fetch(...);
    ...
    ...
    $data5 = $restRequestHelper->fetch(...);
}

As you can imagine, this hits the API's CPU fairly hard, since there are a lot of concurrent requests involving database operations. I know I'll inevitably have to optimize the API as well, but first I want to solve the client side of things, namely the mass-refresh issue.

 

If a client has entered /generate/ and starts refreshing the page over and over again (or just keeps sending GET requests), the script keeps running for however many times they've refreshed, and eventually the API will crash. You can think of it as a sort of DoS attack.

 

In the PHP version it's not nearly as harsh, but still harsh: it will keep doing this X number of times and eventually show the response from the last request in the loop.

In the NodeJS version it's way worse. It's much faster, resulting in more concurrent requests to the API and making it crash sooner. Same reason here: the spam.

 

Anyone got a good way of blocking this kind of client behavior? PHP or NodeJS solutions would both help.

 

Think of the problem like this:

$requestCount = 0;
while ($requestCount < 20) {
    $data1 = $restRequestHelper->fetch(...);
    $data2 = $restRequestHelper->fetch(...);
    $data3 = $restRequestHelper->fetch(...);
    $data4 = $restRequestHelper->fetch(...);
    $data5 = $restRequestHelper->fetch(...);
    $requestCount++;
}

Maybe you can set a session variable per client before you call the ajax script.

If the session variable exists, don't call the script again.

Remove the session variable when the ajax script completes.

http://php.net/manual/en/book.session.php
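
In PHP terms, that suggestion might look roughly like the sketch below. It's only a sketch: the isGenerating flag name, the 429 response, and the placeholder arguments are assumptions, and RestRequestHelper is the class from the original post.

<?php
// Minimal sketch of the session-lock idea, run at the top of /generate.
session_start();

if (!empty($_SESSION['isGenerating'])) {
    // A run is already in progress for this client: refuse instead of
    // hitting the API again.
    http_response_code(429);
    exit('Generation already in progress, please wait.');
}

$_SESSION['isGenerating'] = true;
// Release the session file lock so the flag is visible to later requests
// while the slow API calls run.
session_write_close();

$restRequestHelper = new RestRequestHelper(/* ... */);
$data1 = $restRequestHelper->fetch(/* ... */);
// ... the remaining fetches ...
$data5 = $restRequestHelper->fetch(/* ... */);

// Clear the flag once the work is done. (Caveat: if the script dies
// mid-way the flag stays set; storing a timestamp and expiring it would
// be more robust.)
session_start();
unset($_SESSION['isGenerating']);
session_write_close();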


Yeah, this I've tried, but it doesn't seem to work. I set a session variable isRequesting = true and check it on each request, but since the last request in the "spam loop" is the one the browser actually renders, the page ends up blank. The data was fetched in the first instance of the "spam loop", where isRequesting = true was set, but it's lost by the end of the loop (since each new request is a new instance of the script).
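
One way around that blank-page problem, sketched below under the assumption that the generated data can safely be reused for a short while: keep the last successful result in the session, so a request that finds the lock set renders the cached copy instead of nothing. The isRequesting/lastResult key names and the JSON output are just illustrative, not anything from the thread.

<?php
// Hedged sketch: combine the lock with a cached copy of the last result,
// so the final request in the "spam loop" still has data to render.
session_start();

if (!empty($_SESSION['isRequesting'])) {
    if (isset($_SESSION['lastResult'])) {
        // Serve the cached copy instead of a blank page.
        header('Content-Type: application/json');
        echo json_encode($_SESSION['lastResult']);
    } else {
        http_response_code(429);
        echo 'Still generating, please retry shortly.';
    }
    exit;
}

$_SESSION['isRequesting'] = true;
session_write_close(); // make the flag visible to concurrent requests

$restRequestHelper = new RestRequestHelper(/* ... */);
$result = array(
    'data1' => $restRequestHelper->fetch(/* ... */),
    // ... data2 to data4 ...
    'data5' => $restRequestHelper->fetch(/* ... */),
);

// Store the result and release the lock.
session_start();
$_SESSION['lastResult'] = $result;
unset($_SESSION['isRequesting']);
session_write_close();

header('Content-Type: application/json');
echo json_encode($result);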

Is this some sort of attempt to limit requests like a queue?

$requestCount = 0;
while ($requestCount < 20) {
    $data1 = $restRequestHelper->fetch(...);
    $data2 = $restRequestHelper->fetch(...);
    $data3 = $restRequestHelper->fetch(...);
    $data4 = $restRequestHelper->fetch(...);
    $data5 = $restRequestHelper->fetch(...);
    $requestCount++;
}

 


 

No, that was just my example of what is technically happening when a user is spamming the refresh button. In essence it's just like looping over the function X times.

Another idea I had was using the client's remote IP in a session; at least that would limit just them to a single request.

$remote_ip = $_SERVER['REMOTE_ADDR'];
if (strstr($remote_ip, ', ')) {
    $ips = explode(', ', $remote_ip);
    $remote_ip = $ips[0];
}
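
A rough sketch of that per-IP idea follows, assuming a flat file in the system temp directory as shared storage and a 10-second window; both are placeholders rather than anything from the thread.

<?php
// Rough sketch: allow at most one /generate run per IP within a short window.
$remote_ip = $_SERVER['REMOTE_ADDR'];
if (strstr($remote_ip, ', ')) {
    // Comma-separated lists usually come from proxy-style headers; keep the first entry.
    $ips = explode(', ', $remote_ip);
    $remote_ip = $ips[0];
}

$stampFile = sys_get_temp_dir() . '/generate_' . md5($remote_ip);

if (is_file($stampFile) && (time() - (int) file_get_contents($stampFile)) < 10) {
    // This IP hit /generate less than 10 seconds ago: refuse.
    http_response_code(429);
    exit('Too many requests, slow down.');
}

file_put_contents($stampFile, (string) time());

// ... proceed with the five fetch() calls ...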

Yeah, IP and a unique session ID is what I was trying, i.e. trying to prevent further requests until the first one finishes, but limiting it in any way ended up not working, because then I'd be stopping the "most recent" request, which in turn means the user gets no response at all.

Understood.

You don't really want to cache the API URL itself, but the data within.

You don't want to give people something for nothing, so still log each request.

I've been making a universal API front door, so to say, for all my APIs.

It does the key check and the remote IP check, and goes through the client's allowed domains.

Then, depending on whether the request is allowed, it calls the particular API with its URL parameters to process the request.

Those cached results could live in a folder outside www and be included, rather than being publicly viewable.

You could even save the data as JSON or XML and parse it upon view, versus something like an HTML cache.
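
As a rough illustration of that advice, assuming a cache folder and a log folder one level above the web root and a 60-second TTL (all placeholders), the /generate script could log the hit and serve the stored JSON while it's still fresh:

<?php
// Sketch of the caching idea: log every request, keep the generated JSON
// in a folder outside www, and only hit the API when the cache is stale.
$cacheFile = __DIR__ . '/../cache/generate.json'; // not publicly reachable
$logFile   = __DIR__ . '/../logs/generate.log';
$ttl       = 60; // seconds

// Still log each request, even when it's served from cache.
file_put_contents($logFile, date('c') . ' ' . $_SERVER['REMOTE_ADDR'] . PHP_EOL, FILE_APPEND);

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    // Fresh enough: parse the stored JSON instead of calling the API again.
    $data = json_decode(file_get_contents($cacheFile), true);
} else {
    $restRequestHelper = new RestRequestHelper(/* ... */);
    $data = array(
        'data1' => $restRequestHelper->fetch(/* ... */),
        // ... data2 to data4 ...
        'data5' => $restRequestHelper->fetch(/* ... */),
    );
    file_put_contents($cacheFile, json_encode($data));
}

header('Content-Type: application/json');
echo json_encode($data);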
