Asheeown — Posted January 28, 2007

I have a script that writes SQL query results to a file and then compresses it to "file.csv.gz". Excel's maximum row count is 65536, and the files I'm exporting are much larger. Is there a way to split the results into a new file every, say, 60000 rows, and put them all into the same compressed file "file.csv.gz"?
Asheeown (Author) — Posted January 28, 2007

So I guess it's impossible?
trq — Posted January 28, 2007

Passing the -r option to tar appends files into an archive. Note that tar can only append to an uncompressed archive, so build the .tar first and gzip it as the last step.
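A minimal sketch of that approach (the chunk and archive file names here are assumptions; substitute whatever your export script produces):

```shell
# Export each 60000-row chunk of the query to its own CSV
# (part_000.csv, part_001.csv, ...), then collect them in one tar archive.
tar -cf export.tar part_000.csv     # create the archive with the first chunk
tar -rf export.tar part_001.csv     # -r appends further files to it
tar -rf export.tar part_002.csv

# tar cannot append into a compressed archive, so gzip only once at the end.
gzip export.tar                     # produces export.tar.gz
```

Each CSV stays under Excel's row limit, and the recipient gets a single compressed download.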
matto — Posted January 28, 2007

Do the results have to be opened with Excel? I know that on Windows, Excel is the default handler for CSV files, but you could also import the results into MS Access.

The answer to your question is yes:
* Query the table and get the number of rows.
* Calculate how many files need to be generated (based on a maximum of 60000 lines per file).

You would then need to create a function that finds the offset for each block of 60000 records. Each time the function is called, you could store the row ID of the 60000th record in a session variable; this gives you a starting point for when the function is called again, and so on. :)
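If the full export already exists as one big CSV, the splitting step can also be done after the fact with the standard split utility instead of paginating the query (a sketch; results.csv is an assumed file name):

```shell
# Cut the big export into 60000-line pieces named chunk_aa, chunk_ab, ...
split -l 60000 results.csv chunk_

# Give each piece a .csv extension so Excel picks it up.
for f in chunk_*; do mv "$f" "$f.csv"; done

# Bundle every piece into one compressed archive. The name matches the
# original script's output, though the file is really a gzipped tar.
tar -czf file.csv.gz chunk_*.csv
```

The query-offset approach matto describes avoids holding the whole result set in one file, but split is handy when the export script already works and you only need to respect the row limit.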
Archived
This topic is now archived and is closed to further replies.