NotionCommotion Posted November 19, 2019

I am using Guzzle as an HTTP client, and the following script results in the following error:

```php
$response = $this->httpClient->request('GET', "http://$this->host:$this->port/query", ['query' => $data]);
$body = $response->getBody();
$rs = json_decode($body, true);
```

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 136956008 bytes) in /var/www/vendor/guzzlehttp/psr7/src/Stream.php on line 80

What are the workarounds? Instead of trying to convert it into an array all at once, how can I do so in pieces? I know the expected format, so it seems I will need to read just the appropriate number of bytes and then decode parts at a time. Seems like a pain. Are there any classes designed to do so, or can any of the following Guzzle built-in methods be used? Thanks.

Guzzle Response methods: __construct, getStatusCode, getReasonPhrase, withStatus, getProtocolVersion, withProtocolVersion, getHeaders, hasHeader, getHeader, getHeaderLine, withHeader, withAddedHeader, withoutHeader, getBody, withBody

Guzzle Body methods: __construct, __destruct, __toString, getContents, close, detach, getSize, isReadable, isWritable, isSeekable, eof, tell, rewind, seek, read, write, getMetadata

Link to comment: https://forums.phpfreaks.com/topic/309551-dealing-with-large-json-responses/
kicken Posted November 19, 2019

30 minutes ago, NotionCommotion said: "Instead of trying to convert it into an array all at once, how can I do so in pieces?"

You can't really. JSON is more or less an all-or-nothing encoding type; it can't easily be broken up into chunks. If you don't need the JSON conversion and can just parse through it yourself for the data you need, you could use the read method on the body to receive the response piece by piece and look for the information you need. You'd have to spend a fair bit of time essentially writing your own JSON parser, which is less than ideal IMO. Unless there's some reason not to, the best solution is to just increase your memory limit.
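For context, raising the limit can be scoped to the one script rather than changed globally in php.ini. A minimal sketch; the 512M value is an assumption, since the decoded PHP array typically needs several times the size of the raw JSON:

```php
<?php
// Raise the memory limit for this script only, rather than globally in
// php.ini. 512M is a guess; size it to comfortably fit the decoded array.
ini_set('memory_limit', '512M');

// For a one-off CLI fix-up script, -1 removes the limit entirely:
// ini_set('memory_limit', '-1');
```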
requinix Posted November 19, 2019

One option is to dump the response to a temporary file and use something like jq (which I think operates on a stream, not a full document) to extract pieces of the JSON.
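A rough PHP sketch of that approach, assuming jq is installed. The $payload string here stands in for the real response body, and the ".results[0]" filter is hypothetical; both depend on what the actual API returns:

```php
<?php
// Dump the JSON to a temporary file, then let jq (a streaming
// command-line JSON processor) extract pieces without decoding the
// whole document in PHP. $payload stands in for the real response
// body; with Guzzle you would copy $response->getBody() over in
// chunks rather than building one big string.
$payload = '{"results":[{"id":1},{"id":2}]}';

$tmp = tempnam(sys_get_temp_dir(), 'resp');
file_put_contents($tmp, $payload);

// ".results[0]" is a hypothetical filter; -c prints compact JSON.
$first = shell_exec('jq -c ".results[0]" ' . escapeshellarg($tmp));

unlink($tmp);
```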
NotionCommotion Posted November 19, 2019 (edited)

Thanks kicken. Not sure what will be enough memory, but I will give it a try. It is for a one-time-use application to fix some data.

Thanks requinix. Have you ever tried doing it? If so, did it work well, and was it relatively simple?

Looks like there might be a few other options. I haven't tried them, but I will, and will let you know if they work.

https://github.com/pcrov/JsonReader/wiki/JsonReader-API#psr7streampsrhttpmessagestreaminterface-stream-void

Quote: "psr7Stream(\Psr\Http\Message\StreamInterface $stream): void — Initializes the reader with the given PSR-7 stream. Throws an IOException if the given stream is not readable."

https://github.com/salsify/jsonstreamingparser with https://github.com/cerbero90/json-objects

Quote:

```php
$response = $guzzle->get('https://jsonplaceholder.typicode.com/users');
// Create a new instance by passing an implementation of MessageInterface
JsonObjects::from($response);
// Create a new instance by passing an implementation of StreamInterface
JsonObjects::from($response->getBody());
```

https://github.com/halaxa/json-machine

Quote: "GuzzleHttp: Guzzle uses its own streams, but they can be converted back to PHP streams by calling \GuzzleHttp\Psr7\StreamWrapper::getResource(). Pass the result of this function to JsonMachine::fromStream and you're set up. See working GuzzleHttp example."

Another one: https://github.com/MAXakaWIZARD/JsonCollectionParser. Also uses https://github.com/salsify/jsonstreamingparser

Edited November 19, 2019 by NotionCommotion
requinix Posted November 20, 2019

1 hour ago, NotionCommotion said: "Thanks requinix. Have you ever tried doing it? If so, did it work well, and was it relatively simple?"

I've used jq when dealing with JSON files from the command line, yes. Relatively simple.
NotionCommotion Posted November 20, 2020

I gave json-machine a try and it works great. You just pass it the stream and an optional JSON pointer and iterate over the elements.
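For anyone landing here later, a sketch of that usage based on the JsonMachine::fromStream() API quoted earlier in the thread. The $host, $port, $data variables and the "/results" pointer are placeholders, and the library's API may have changed since this was written:

```php
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Psr7\StreamWrapper;
use JsonMachine\JsonMachine;

// Placeholders: $host, $port, $data and the "/results" pointer all
// depend on the actual API being queried.
$client = new Client();
$response = $client->request('GET', "http://$host:$port/query", ['query' => $data]);

// Convert the Guzzle PSR-7 stream back to a plain PHP stream, as the
// json-machine docs describe.
$phpStream = StreamWrapper::getResource($response->getBody());

// Iterate the array under the "/results" key; only one element is
// decoded into memory at a time, so memory use stays flat.
foreach (JsonMachine::fromStream($phpStream, '/results') as $key => $item) {
    // $item is a plain PHP array; process it here.
}
```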