Everything posted by Octo

  1. No errors on the page itself, just typical PHP errors ("class not found on...."). When I fix the classes, it doesn't find my 'view' files, so I just get a blank page with no errors on it, even with E_ALL on. I should probably set it to write errors to an error.log, actually. Enable Parent Paths didn't work, I'm afraid, either on or off. I've not had a chance to look into FastCGI yet. How do I find out whether it's using it or not? Thanks.
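     As a quick check for the FastCGI question, a minimal sketch (the log path is a placeholder, not from the original post): PHP can report which SAPI it's running under, and error logging can be forced on from the script itself so blank pages leave a trace.

         <?php
         // "cgi-fcgi" indicates FastCGI; the IIS ISAPI module reports "isapi".
         echo php_sapi_name();

         // Write errors to a log file instead of relying on display_errors.
         error_reporting(E_ALL);
         ini_set('log_errors', '1');
         ini_set('error_log', 'C:/temp/php_error.log'); // placeholder path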
  2. Yeah, it works fine on IIS7 (my test server is 7.5; the live server for the client is IIS6). As for FastCGI - I'm not a server admin, so while I have an idea what it is, I'm not sure how to set it up or check that it's working right. However, I'll look into it since you mention it. (Note: the reason I'm doing the admin stuff is that they don't have an admin. I wrote the program, so I'm helping them set things up while I'm at it.)
  3. Using IIS 6, PHP is working in the virtual directory, except that relative paths fail for things like file_exists:

         if (file_exists('another/file.php')) { echo "success"; }
         require 'another/file.php';
         echo $required_variable; // success

     From the test file above, include 'another/file.php' and require 'another/file.php' type statements work fine in the v-dir, but is_readable('another/file.php'), file_exists('another/file.php') et al. do not. These non-working paths do work if made absolute (C:/some/folders/another/file.php), though. The relative paths work fine in the main directory (everything works as expected there). What's also unusual is that if I open a working mirror in the main directory (though in a different sub-directory), the broken one starts working, as it borrows everything from the working one, even though they're set up as site.com->test (v-dir) and site.com->live. doc_root is empty, setting include_path seems to make no difference, and putting './' at the front doesn't work. I've tried all the settings I can on the v-dir and checked the MS documentation on how to set it up. I'm kind of at a loss here, and any help would be appreciated.
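     One workaround sketch for the same setup: anchor the path to the script's own directory rather than the working directory, which sidesteps whatever CWD the v-dir hands PHP (file names as in the test snippet above).

         <?php
         // dirname(__FILE__) is always this script's directory,
         // regardless of what IIS uses as the working directory.
         $path = dirname(__FILE__) . '/another/file.php';

         if (file_exists($path)) { echo "success"; }
         require $path;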
  4. function createArr($depth, $key, $val) {
         $end = array($val);                         // so $end[0] = $val
         $end = array_pad($end, ($key + 1) * -1, 0); // pads zeros in front of $val so that $end[$key] = $val
         for ($i = 0; $i < $depth - 1; $i++) {
             $end = array($end);                     // nests the array inside another array, $depth - 1 times
         }
         return $end;
     }

     Something like that?
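     A quick sanity check of the sketch above (arbitrary values):

         <?php
         // Depth 2 wraps the padded array in one extra level,
         // and the value lands at index $key of the inner array.
         $arr = createArr(2, 3, 'x');
         var_dump($arr[0][3]); // string(1) "x"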
  5. You only need to go upwards, so: 1 with all pairs of (2,3,4,5), 2 with all pairs of (3,4,5), and 3 with all pairs of (4,5). This means you don't need to check for duplicates, as there won't be any. So, dunno, something like this should work with any array you put into it (this is just off the top of my head, not actually tested):

         function generate($numbers) {
             $sets = array();                           // container for the results
             while (count($numbers) > 2) {              // stop when only two numbers remain
                 $first = array_shift($numbers);        // take the first number off the front of the array
                 $remainder = $numbers;                 // copy so we can keep shifting in the inner loop
                 while (count($remainder) > 1) {        // need at least a pair left
                     $second = array_shift($remainder); // take the second number off the front of the remainder
                     foreach ($remainder as $third) {   // pair with each number still after $second
                         $sets[] = "$first, $second, $third"; // add the triple to the results
                     }
                 }
             }
             return $sets; // an array of the triples
         }

         $numbers = array(1,2,3,4,5);
         $alltriples = generate($numbers);
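     As a sanity check, 5 numbers should yield "5 choose 3" = 10 unique triples:

         <?php
         $alltriples = generate(array(1, 2, 3, 4, 5));
         echo count($alltriples); // 10
         echo $alltriples[0];     // "1, 2, 3"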
  6. Ack, sorry, I misread it as you not wanting to add another table for some reason. My apologies.
  7. I think you should consider laying it out like this:

     Table 1 "Employees"
     employeeID | employeeDept | deptEmployeeRef | name         | wage
     1          | A            | 2               | Dave Smith   | 2000
     2          | B            | 2               | Robert Brown | 2500

     Table 2 "Deductions"
     monthNo | employeeID | deduction
     32      | 0          | 300
     32      | 1          | 300
     32      | 2          | 300
     33      | 3          | 500

     employeeID should be auto-incrementing so it's unique - no two employees will ever share one. Having a ref number per dept. might be what you need, I don't know the situation, but having a unique identifier for each entry is advisable; the rest is changeable data. This would also make your JOIN statement easier - see the sketch below.
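     A minimal sketch of the kind of JOIN this layout allows, using the hypothetical table and column names above:

         SELECT e.name, e.wage, d.deduction
         FROM Employees AS e
         JOIN Deductions AS d ON d.employeeID = e.employeeID
         WHERE d.monthNo = 32;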
  8. Your query statements are wrong. This is not correct - it should be "DELETE FROM table_name WHERE some_column = some_value", whereas you have "DELETE FROM some_column WHERE code = some_value". DELETE is for deleting entire records at once, not individual entries in fields. Also, what is 'code'? And why must it equal the $key rather than the $value? There's a similar issue with the others: INSERT is for entering entirely new records (rows) into a table, not for updating fields - for that you should use UPDATE. And you INSERT INTO [tablename], not a column name. Lastly - you have display_errors off. If you had it on, then MySQL would probably tell you that you don't have any tables called 'er', 'egg' or 'hatch', which would help you a lot. Ninja'd on the DB details by the post above.
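     For reference, the general shapes of those three statements (table and column names here are placeholders, not the original poster's schema):

         DELETE FROM table_name WHERE some_column = 'some_value';
         UPDATE table_name SET some_column = 'new_value' WHERE id_column = 'some_id';
         INSERT INTO table_name (col_a, col_b) VALUES ('value_a', 'value_b');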
  9. Apologies for the double post, just thought I'd post the solution. It turns out the hosting company - after blaming it on my coding - had the server set up wrong, so it wasn't reading the php.ini properly; it was being overridden by another one, even though phpinfo showed my changes propagating. This meant that when I tried to set up error logging as suggested, I just couldn't get it to work for some reason. The exact same logging code worked fine on other servers and created a log file - just not on this one. So the php.ini was limiting it to two minutes, but when we fixed that, it then timed out at five minutes. That turned out to be the IIS server, which has a CGITimeout counter in Metabase.xml set to five minutes. So I changed that. The company's hosting package didn't really come with any real support, so we were left on our own. Thank god one of the support guys was nice enough to look into it and spotted that it was set up wrong (I think it had PHP in the MIME types twice, too), otherwise the client would've been screwed, through no real fault of their own. Anyway, thanks for the replies.
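     For anyone hitting the same thing, a quick way to see which php.ini actually won (available from PHP 5.2.4) - a sketch, not from the original thread:

         <?php
         // Shows the php.ini the running PHP actually loaded,
         // which can differ from the one you've been editing.
         echo php_ini_loaded_file();
         echo ini_get('max_execution_time'); // the effective timeout, in seconds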
  10. The test file is not corrupt (if only!). As mentioned, downloading via the URL directly works fine, and the same test file works fine on other sites. Regarding error reporting - the @ wasn't in there until recently (I know what it's for); I just forgot I'd put it on there. I'll try things again without it and with all error reporting on, but I didn't get any error reports before it sneaked its way in there.
  11. Thanks for the reply. No errors have displayed in any of the situations mentioned. I've tried fread in both a custom chunked function (from the php.net page) and, just now, non-chunked. Both downloads using this method halt at 51.4MB. Another thing: I've turned output buffering off using ini_set (can you tell I'm clutching at straws here?)
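      A belt-and-braces sketch of that last step - closing any buffers that are already open rather than relying on ini_set alone:

          <?php
          // Close every open output buffer so nothing sits between
          // fread()/readfile() and the client.
          while (ob_get_level() > 0) {
              ob_end_clean();
          }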
  12. I wrote a small application to force downloads of various large video files using readfile. This worked fine. However, the users have changed hosts and now it doesn't work. Small files are fine, but large files are cut off before finishing. The test file is 114MB, but it is always cut off at exactly 67.2MB or 51.4MB, depending on a couple of criteria. I've stripped the application down to the key part that's not working and run it on a test page that just immediately serves the file - no logins or any of that shenanigans, and no htaccess blocking either:

          $file = 'some/file/on/the/server.wmv'; // 114MB
          header('Content-Description: File Transfer');
          header('Content-Type: application/octet-stream');
          header('Content-Disposition: attachment; filename=' . basename($file));
          header('Content-Transfer-Encoding: binary');
          header('Expires: 0');
          header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
          header('Pragma: public');
          header('Accept-Ranges: bytes');
          header('Content-Length: ' . filesize($file));
          ob_clean();
          flush();
          @readfile($file);

      I've tried the following variations:

      - Transfer-Encoding: chunked (this results in a 51.4MB download rather than 67.2MB)
      - Content-Type: application/force-download
      - the default force-download code from the php.net manual
      - the chunking function variations on the php.net readfile page (see the sketch below)
      - without Accept-Ranges originally (only spotted that on this site)
      - set_time_limit raised to one hour
      - max memory set to 300MB (phpinfo has shown these changes did get accepted)
      - htaccess to disable gzip and deflate (this results in a 51.4MB download rather than 67.2MB), using:
        SetEnv no-gzip dont-vary
        or
        RewriteEngine On
        RewriteRule . - [E=no-gzip:1]
        or
        RemoveOutputFilter DEFLATE html txt xml css js php wmv

      None of these have solved the issue. The code worked fine on the old site, works fine on my site, and works fine on my test server. Direct downloads from the broken site also work fine, just not readfile downloads. The filesize reads correctly if I echo just that, and it shows properly on the progress bar when downloading through readfile. The hosting company are blaming my coding. The htaccess is the part I'm least sure of, and I find it hard to get solid info on it on the web. Is there anything wrong with my code that could be causing this? Am I missing something? What could be causing the problem? Many thanks.
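      For completeness, this is roughly the shape of the chunked variant referred to above - a sketch along the lines of the php.net readfile user notes, not the poster's exact code:

          <?php
          // Stream the file in 1MB pieces so no single buffer has to hold
          // all 114MB, flushing each piece straight out to the client.
          function readfile_chunked($filename, $chunk_size = 1048576) {
              $handle = fopen($filename, 'rb');
              if ($handle === false) {
                  return false;
              }
              $bytes_sent = 0;
              while (!feof($handle)) {
                  $buffer = fread($handle, $chunk_size);
                  echo $buffer;
                  flush();
                  $bytes_sent += strlen($buffer);
              }
              fclose($handle);
              return $bytes_sent;
          }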