NotionCommotion

Dealing with large JSON responses


I am using Guzzle as an HTTP client, and the following script results in this error:

$response = $this->httpClient->request('GET', "http://$this->host:$this->port/query", ['query' => $data]);
$body = $response->getBody();
$rs = json_decode($body, true);
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 136956008 bytes) in /var/www/vendor/guzzlehttp/psr7/src/Stream.php on line 80

What are the workarounds?  Instead of trying to convert it into an array all at once, how can I do so in pieces?  I know the expected format, so it seems I will need to read just the appropriate number of bytes and then decode parts at a time.  Seems like a pain.  Are there any classes designed to do so, or can any of the following Guzzle built-in methods be used?

Thanks

Guzzle Response methods:

  1. __construct
  2. getStatusCode
  3. getReasonPhrase
  4. withStatus
  5. getProtocolVersion
  6. withProtocolVersion
  7. getHeaders
  8. hasHeader
  9. getHeader
  10. getHeaderLine
  11. withHeader
  12. withAddedHeader
  13. withoutHeader
  14. getBody
  15. withBody

Guzzle Body methods:

  1. __construct
  2. __destruct
  3. __toString
  4. getContents
  5. close
  6. detach
  7. getSize
  8. isReadable
  9. isWritable
  10. isSeekable
  11. eof
  12. tell
  13. rewind
  14. seek
  15. read
  16. write
  17. getMetadata
     

30 minutes ago, NotionCommotion said:

Instead of trying to convert it into an array all at once, how can I do so in pieces?

You can't, really.  JSON is more or less an all-or-nothing encoding; it can't easily be broken up into chunks.

If you don't need the full JSON conversion and can just parse through it yourself for the data you need, you could use the read method on the body to receive the response piece by piece and look for the information you need.  You'd have to spend a fair bit of time essentially writing your own JSON parser, which is less than ideal IMO.  Unless there's some reason not to, the best solution is to just increase your memory limit.
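A minimal sketch of both suggestions. The chunked read is demonstrated on a plain `php://temp` stream so it runs standalone; with Guzzle you'd call `$body->read(8192)` instead of `fread()`, since a PSR-7 body's `read()` behaves the same way. The `rows` payload shape and the 512M figure are made up for illustration.

```php
<?php
// Option 1 (simplest): raise the limit for this one script.
// 512M is an assumption -- size it to your actual response.
ini_set('memory_limit', '512M');

// Option 2: read the body piece by piece instead of all at once.
// Shown on php://temp; a Guzzle body's read() works the same way.
$stream = fopen('php://temp', 'r+');
fwrite($stream, '{"rows":[{"id":1},{"id":2}]}');
rewind($stream);

$buffer = '';
while (!feof($stream)) {
    $buffer .= fread($stream, 8192); // $body->read(8192) with Guzzle
    // ...in a real streaming parser you would scan $buffer for complete
    // objects, json_decode() each one, and trim what you've consumed...
}
fclose($stream);

$rs = json_decode($buffer, true);
```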

 


One option is to dump the response to a temporary file and use something like jq (which I think operates on a stream, not a full document) to extract pieces of the JSON.
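A sketch of that temp-file approach, assuming the jq binary is installed. Guzzle's `sink` request option streams the body straight to disk without buffering it in PHP, and `popen()` lets you consume jq's output one line at a time. The `.rows[]` path is hypothetical; substitute your real structure.

```php
<?php
// Stream the response directly to a temp file (never held in PHP memory).
$tmp = tempnam(sys_get_temp_dir(), 'json');
$client->request('GET', $url, ['sink' => $tmp]);

// Have jq emit one compact object per line, and process them one by one.
$proc = popen('jq -c ".rows[]" ' . escapeshellarg($tmp), 'r');
while (($line = fgets($proc)) !== false) {
    $row = json_decode($line, true); // one small object at a time
    // ...process $row...
}
pclose($proc);
unlink($tmp);
```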


Thanks kicken.  Not sure how much memory will be enough, but I'll give it a try.  It is for a one-time application to fix some data.

Thanks requnix.  Ever try doing it?  If so, did it work well, and was it relatively simple?

Looks like there might be a few other options.  I haven't tried them yet, but I will, and I'll let you know if they work.


https://github.com/pcrov/JsonReader/wiki/JsonReader-API#psr7streampsrhttpmessagestreaminterface-stream-void

Quote

psr7Stream(\Psr\Http\Message\StreamInterface $stream): void
Initializes the reader with the given PSR-7 stream.
Throws an IOException if the given stream is not readable.
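An untested sketch against the JsonReader API quoted above, following the iteration pattern from its wiki. The `rows` element name and payload shape are assumptions.

```php
<?php
use pcrov\JsonReader\JsonReader;

$reader = new JsonReader();
$reader->psr7Stream($response->getBody()); // per the wiki excerpt above

$reader->read('rows');       // position on the (assumed) "rows" array
$depth = $reader->depth();
$reader->read();             // step into its first element
do {
    $row = $reader->value(); // decodes only the current element
    // ...process $row...
} while ($reader->next() && $reader->depth() > $depth);
$reader->close();
```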


https://github.com/salsify/jsonstreamingparser with https://github.com/cerbero90/json-objects

Quote

 

$response = $guzzle->get('https://jsonplaceholder.typicode.com/users');

// Create a new instance by passing an implementation of MessageInterface
JsonObjects::from($response);

// Create a new instance by passing an implementation of StreamInterface
JsonObjects::from($response->getBody());
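If I'm reading the json-objects README right, `each()` hands your callback one decoded object at a time, so memory stays flat regardless of payload size. Treat this as an unverified sketch:

```php
<?php
use Cerbero\JsonObjects\JsonObjects;

JsonObjects::from($response->getBody())
    ->each(function (array $object) {
        // ...handle one record; only $object is in memory here...
    });
```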

 


https://github.com/halaxa/json-machine

Quote

GuzzleHttp
Guzzle uses its own streams, but they can be converted back to PHP streams by calling \GuzzleHttp\Psr7\StreamWrapper::getResource(). Pass the result of this function to the JsonMachine::fromStream function and you're set up. See the working GuzzleHttp example.
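Following the quoted instructions, something like this should work (the foreach shape matches json-machine's 0.x API; newer releases renamed the entry point to Items::fromStream):

```php
<?php
use JsonMachine\JsonMachine;
use GuzzleHttp\Psr7\StreamWrapper;

// Convert the Guzzle stream back to a plain PHP resource, then
// iterate one decoded top-level item at a time.
$phpStream = StreamWrapper::getResource($response->getBody());
foreach (JsonMachine::fromStream($phpStream) as $key => $item) {
    // ...only $item is held in memory, not the whole document...
}
```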

 

Another one:  https://github.com/MAXakaWIZARD/JsonCollectionParser.  It also uses https://github.com/salsify/jsonstreamingparser.

Edited by NotionCommotion

1 hour ago, NotionCommotion said:

Thanks requnix.  Ever try doing it?  If so, did it work well, and was it relatively simple?

I've used jq when dealing with JSON files from the command line, yes. Relatively simple.

