
roopurt18

Staff Alumni
  • Posts

    3,746
  • Joined

  • Last visited

    Never

Everything posted by roopurt18

  1. I agree with you on the same-brand idea, cmg; I have a wired D-Link router, which I purchased after a nutty Linksys made my hair fall out. I forgot to mention in the first post that I'm more interested in solutions that let me use the devices without having other computers turned on. Nine times out of ten both desktops are running, but that one time out of ten where I have to wait for a computer to boot before I can print drives me crazy. Then again, maybe I'm overcomplicating the issue. All I ever need to do is print; my fiancée is the one who wants to fax and scan. I do have a print server shut away in the closet somewhere; I was just really curious whether similar devices exist for other USB peripherals.
  2. What are the options these days for going wireless with peripherals? I have a fax machine, scanner, and printer I'd like to share between two desktops and a laptop. I'm doing research on my own, but it's always nice to hear directly from someone with more experience in an area. Thanks!
  3. As long as you used it in conjunction with another function that stripped the HTML tags out of it, I don't see a problem. Sometimes when writing a tutorial the author wants to cover everything in depth, but certain subjects would shoot off into a huge tangent of information irrelevant to the focus of the tutorial. A slap on the wrist for the author, though, for not mentioning that it's up to the reader to research security on their own. A quick sketch of what I mean is below.
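     For illustration, a minimal sketch of that wrap-before-output idea; the function and field names here are hypothetical:

     <?php
     // Hypothetical wrapper: strip HTML tags and escape what's left
     // before echoing user-supplied input back out.
     function clean_for_output($raw){
         $no_tags = strip_tags($raw);
         return htmlspecialchars($no_tags, ENT_QUOTES);
     }

     echo clean_for_output($_POST['comment']);
     ?>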
  4. Why don't you write it so either will work? If the named keys cannot be found, assume a numerically indexed array. Or write two different functions. Something along these lines:
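     A rough sketch of the first approach; the key names are hypothetical:

     <?php
     // Fall back to numeric indexes when the named keys are missing.
     function get_parts($arr){
         if(array_key_exists('name', $arr) && array_key_exists('value', $arr)){
             return array($arr['name'], $arr['value']);
         }
         // No named keys found; assume a numerically indexed array.
         return array($arr[0], $arr[1]);
     }
     ?>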
  5. No. RewriteRule ^downloads/(.*)$ /path/to/script/downloads.php?file=$1 (note the parentheses: the pattern needs a capture group for $1 to contain the filename).
  6. Hah! So there was an error!

     <?php
     $divs = explode(";", $some_var);
     $ext = trim($divs[1]);
     ?>

     The assignment to $ext was giving an "index out of bounds" error when $some_var didn't have a semi-colon in it. A guarded version is sketched below.
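     For the record, a guard along these lines avoids the problem; defaulting to an empty string is my own assumption about the desired behavior:

     <?php
     $divs = explode(";", $some_var);
     // Only index $divs[1] when the delimiter was actually present.
     $ext = (count($divs) > 1) ? trim($divs[1]) : '';
     ?>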
  7. I've removed all of the @ symbols and placed this at the top of the function:

     <?php
     // %% set_error_handler to debug problem, restore_error_handler %% below
     $f = create_function('', 'echo "an error occurred"; exit();'); // %% temp error handler
     set_error_handler("{$f}"); // %% temp error handler
     ?>

     and this at the bottom:

     <?php
     fpassthru($fp);
     restore_error_handler(); // %% remove when we remove set_error_handler
     exit();
     ?>

     Hopefully this will help me determine whether it's a browser or server issue.
  8. CentOS, I believe. phpinfo() reports: Linux 2.6.9-023stab040.1-enterprise #1 SMP Tue Jan 16 01:09:22 MSK 2007 i686
  9. Our site allows clients to upload files to share with others. We have several clients who are sharing many files with many users. One client inconsistently receives a blank white screen while viewing a file. She is the only client who has reported this as an issue, so I'm not sure how widespread it is; I cannot duplicate the problem myself, but I can watch it happen to her using remote desktop software.

     She first reported it last week. I went into the function that serves the file, added two lines of code, removed those same two lines, saved the file, and the problem went away. She called again this morning reporting the same problem. This time I modified the code to pass along extra header information, and that seemed to clear the problem up. She called again about 2 or 3 hours later and the problem had started yet again, without any changes to the code.

     Here is the original function that serves the file:

     <?php
     // ShowUploadedDoc
     // $doc_id - the id of the document
     // Pass the document through to the viewer
     function ShowUploadedDoc($doc_id){
         $fp = @fopen(SUBVIEW_DATA_PATH . $doc_id, 'r');
         $conttype = CFileManagerDAO::_GetMime($doc_id, SUBVIEW_DATA_PATH);
         $fp = @fopen(SUBVIEW_DATA_PATH . $doc_id, 'r');
         header('Content-type: ' . $conttype['full']);
         header('Content-length: ' . (string)(@filesize(SUBVIEW_DATA_PATH . $doc_id)));
         @fpassthru($fp);
         return;
     }
     ?>

     Here is the new function:

     <?php
     // ShowUploadedDoc
     // $doc_id - the id of the document
     // Pass the document through to the viewer
     function ShowUploadedDoc($doc_id){
         $doc = @SubviewDocumentDAO::GetDocument($doc_id);
         $conttype = @CFileManagerDAO::_GetMime($doc_id, SUBVIEW_DATA_PATH);
         $fp = @fopen(SUBVIEW_DATA_PATH . $doc_id, 'r');
         $cname = $doc["doc_name"];
         $ctype = $conttype["full"];
         $clen = (string)(@filesize(SUBVIEW_DATA_PATH . $doc_id));
         @header("Content-disposition: attachment; filename=\"{$cname}\"");
         @header("Cache-Control: cache, must-revalidate");
         @header("Content-type: {$ctype}");
         @header("Content-Length: {$clen}");
         @header("Pragma: public");
         @header("Expires: 0");
         @fpassthru($fp);
         exit();
     }
     ?>

     Suggestions?
  10. I probably won't revisit this query for a while since I'm happy with the current results. The whole page used to take 7 or 8 seconds to load and now it's down to about 1.5s. I'll keep this in the back of my mind in case I do have to come back to it or face a similar situation elsewhere. Again, thanks to everyone for all their help.
  11. I've now removed the multi-column index I created earlier on wv_user and restructured the query:

     SELECT
         u.user_wvid AS UserID,
         LOWER(u.login) AS Login,
         LOWER(u.password) AS `Password`,
         u.subname AS SubName,
         s.full_name AS FullName,
         SUM(IF(t.onsite=1, 1, 0)) AS HasOnSites,
         SUM(IF(t.onsite=0, 1, 0)) AS HasOffSites,
         CASE
             WHEN SUM(IF(t.onsite=1, 1, 0)) > 0 AND SUM(IF(t.onsite=0, 1, 0)) > 0 THEN 'both'
             WHEN SUM(IF(t.onsite=1, 1, 0)) > 0 AND SUM(IF(t.onsite=0, 1, 0)) = 0 THEN 'onsite'
             WHEN SUM(IF(t.onsite=0, 1, 0)) > 0 AND SUM(IF(t.onsite=1, 1, 0)) = 0 THEN 'offsite'
         END AS SubType,
         IFNULL(
             ( SELECT a.TimeRecord AS LastActiveStamp
               FROM UserActivity a
               WHERE a.UserLogin=u.login
               ORDER BY a.TimeRecord DESC
               LIMIT 1 ),
             '-') AS LastActiveStamp,
         IFNULL(
             DATE_FORMAT(
                 ( SELECT a.TimeRecord AS LastActiveStamp
                   FROM UserActivity a
                   WHERE a.UserLogin=u.login
                   ORDER BY a.TimeRecord DESC
                   LIMIT 1 ),
                 '%a. %b. %D, %Y' ),
             '-') AS LastActiveDisp
     FROM wv_user u
     CROSS JOIN wssubc s ON u.subname=s.subname
     CROSS JOIN SubProjectAccess pa ON u.subname=pa.SubName
     CROSS JOIN wstrades t ON pa.Trade=t.tcode
     WHERE u.role_wvid=2
     GROUP BY u.login
     ORDER BY _xyz_
     LIMIT 0,25

     I've structured the query this way because an external parameter determines the ORDER BY, which can be any of the fields returned by the query (UserID, Login, LastActiveStamp, etc.). Using any of the fields with ORDER BY will cause "using temporary" to show up in EXPLAIN, but it's consistently faster than 1s, so I'm happy with the results for now. Also, I had forgotten about using sub-queries since our last host was on MySQL 4.0.x; thanks for reminding me!
  12. I placed a multi-column index (role_wvid, login) on wv_user and the "using temporary; using filesort" has been eliminated. Would someone be willing to offer a brief explanation, or a link to one, of why the multi-column index made a difference in this case? I'm still a bit of a noob when it comes to DB optimization. Also, if the LIMIT is left off the query, or if the second part of the LIMIT extends beyond the actual number of records (LIMIT 100,25 on a table with 102 records), I notice that "using temporary; using filesort" makes its way back into EXPLAIN. What would be the cause of this? Would it be more efficient to precede this query with another that counts the number of records in wv_user and stops the LIMIT from extending beyond the actual number of records?
  13. I'm not counting my chickens before they hatch on this one.
  14. I'm not at work anymore so I'll have to try your suggestions tomorrow, but...

     MAX(a.TimeRecord) AS LastActiveStamp,
     IFNULL(DATE_FORMAT(a.TimeRecord, '%a. %b. %D, %Y'), '-') AS LastActiveDisp

     I can't be sure without the code in front of me, but I think the program itself needs the timestamp as well, which is why I'm returning two separate fields there. As for the variables, my original query wasn't using them; I had added them thinking it would probably be more optimal. Is there any reason why that's not the case? I'll report back on the other stuff some time tomorrow. Thanks for the responses.
  15. I've got this query that reports details on a group of users within our system that I'd like to optimize. On average, the query takes 3.5 - 4.25s to run.

     SELECT
         you.user_wvid AS UserID,
         LOWER(you.login) AS Login,
         LOWER(you.password) AS `Password`,
         you.subname AS SubName,
         s.full_name AS FullName,
         @onsites:=SUM(IF(t.onsite=1, 1, 0)) AS HasOnSites,
         @offsites:=SUM(IF(t.onsite=0, 1, 0)) AS HasOffSites,
         CASE
             WHEN @onsites > 0 AND @offsites > 0 THEN 'both'
             WHEN @onsites > 0 AND @offsites = 0 THEN 'onsite'
             WHEN @offsites > 0 AND @onsites = 0 THEN 'offsite'
         END AS SubType,
         MAX(a.TimeRecord) AS LastActiveStamp,
         IFNULL(DATE_FORMAT(a.TimeRecord, '%a. %b. %D, %Y'), '-') AS LastActiveDisp
     FROM wv_user you
     CROSS JOIN wssubc s ON you.subname=s.subname
     CROSS JOIN SubProjectAccess pa ON you.subname=pa.SubName
     CROSS JOIN wstrades t ON pa.Trade=t.tcode
     LEFT JOIN UserActivity a ON you.login=a.UserLogin
     WHERE you.role_wvid=2
     GROUP BY you.login
     ORDER BY you.login
     LIMIT 0,25

     EXPLAIN gives...

     I'm trying to get rid of the "using temporary" in the EXPLAIN results but not having much luck. Any suggestions?
  16. You can do what you want:

     <?php
     // These will create: $days0, $days1, $days2, $days3, $days4, ...
     for($i = 0; $i < 10; $i++){
         ${"days" . $i} = $i;
     }
     ?>

     However, I think you will find your life easier if you store each row of data as an element in an array:

     <?php
     $rows = Array();
     for($i = 0; $i < 10; $i++){
         $rows[] = Array(
             "date" => $date_val,
             "location" => $location_val,
             "hours" => $hours_val
         );
     }
     ?>

     This will allow you to use a foreach while building your table:

     <?php
     foreach($rows as $row){
         echo "
             <tr>
                 <td><input name=\"{$row['date']}\" type=\"text\" size=\"10\" /></td>
                 <td><input name=\"{$row['location']}\" type=\"text\" size=\"25\" /></td>
                 <td><input name=\"{$row['hours']}\" type=\"text\" size=\"5\" /></td>
             </tr>
         ";
     }
     ?>
  17. Use string concatenation.

     <?php
     // Long version
     $url = $_SERVER['PHP_SELF'];
     $url .= "?var=foo";
     $url .= "&var2=bar";

     // Short version
     $url = $_SERVER['PHP_SELF'] . "?var=foo&var2=bar";
     ?>
  18. I recently added a new feature to the product we sell; after programming for two weeks I was curious just how much code I'd written. I wrote a short script that opens all of the source files I had added to our project for this feature; as each file is opened, its contents are concatenated onto a string variable, and the final contents of that variable are dumped into another file. To give you an idea of how fast a web server processes files, I added a time check at the beginning and end of the script. Time required to open 30 files, concatenate their contents, and dump the 4600 lines of code back into a dump file:

     0.63623100 1179511579 (microtime - start)
     0.64138600 1179511579 (microtime - end)

     Total time required: ~5 one-thousandths of a second. Granted, this doesn't add the complexity of parsing the files' contents as PHP, but the bottom line is: if you have optimization concerns, worry about your DB and algorithms before you worry about the size of your source code.
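     The script itself was nothing fancy; a rough sketch of the idea (the file list and dump path here are made up, and I've used the PHP 5 microtime(true) float form rather than the two-part string above):

     <?php
     // Gather the new feature's source files; path is hypothetical.
     $files = glob('/path/to/project/new_feature/*.php');

     $start = microtime(true);

     $all = '';
     foreach($files as $file){
         $all .= file_get_contents($file); // concatenate each file's contents
     }
     file_put_contents('/tmp/feature_dump.txt', $all);

     $end = microtime(true);
     echo 'Elapsed: ' . ($end - $start) . " seconds\n";
     ?>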
  19. Horrible idea; don't do that. Organize the functions into logical units and then expose them through "namespaces"; I've quoted "namespaces" because PHP doesn't actually support them, but you can fake the behavior with classes. For example, let's say you had the following functions:

     <?php
     a(); // regexp function
     b(); // regexp function
     c(); // html helper function
     d(); // sanitizing function
     e(); // html helper
     // and so on
     ?>

     You basically have three categories of functions there: regexp, html helper, & sanitizing. For each category, you create a separate file where the filename starts with "N" followed by the category: NRegExp.php, NHTML.php, NSanitizer.php

     NRegExp.php
     <?php
     class NRegExp{
         function a(){
             // Function a body
         }
         function b(){
             // Function b body
         }
     }
     ?>

     I recommend storing each of these files within a "namespaces" directory and declaring a constant:

     define("DIR_NAMESPACE", "/path/to/dir");

     Now, to access your regexp library in a script:

     <?php
     require_once(DIR_NAMESPACE . "/NRegExp.php");
     // Some php
     NRegExp::a(); // Call func a
     // Some more php
     ?>

     I see it time and time again where people creating scripts are concerned about the number of files, or the size of included files, within their scripts. You should not concern yourself with this. You have a greater potential to bring your site to its knees with a bad database setup or bad algorithms in your code than by including a bunch of files.
  20. PHP runs on the server and doesn't output anything until the script is finished, so sticking a while loop in a PHP script will not help you with your goal. (That statement is 99% true, btw.) I would set the processing page up with a simple animated .gif so the user feels that something is happening. The page itself should invoke AJAX requests via a timer. The requests should return a simple 0 or 1 indicating whether the job has finished; your AJAX handler can redirect to the final confirmation page when it has received a 1 from the server. Of course, once you go this route, you have to make sure the user's browser supports AJAX. If it doesn't, you need to fall back to the old-fashioned method, so to speak.
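     Server-side, the polled page can be tiny; here's a bare-bones sketch, where tracking completion in a session flag is just my assumption about how you'd record it:

     <?php
     // status.php -- polled by the AJAX timer; prints 1 when the job
     // is done, 0 otherwise. How the flag gets set is up to the job code.
     session_start();
     echo empty($_SESSION['job_done']) ? '0' : '1';
     exit();
     ?>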
  21. The PHP5 executable resides somewhere on the host. If you are able to find it, you can use the direct path to execute your scripts. For instance, if the executable lives in /usr/bin/php5/, you could do:

     /usr/bin/php5/php my_script.php
  22. I highly recommend picking up a decent book or manual on Javascript. I was so-so with Javascript a year ago; since then I've read O'Reilly's JS reference and done quite a bit more programming in Javascript. When I go back and look at my previous code, it's absolutely horrendous. Not that my code now is always the greatest thing since sliced bread, but the difference that extra knowledge makes is worth the time invested. I only bring this up because you will have to use some Javascript to get AJAX working correctly, and it might as well be efficient. Personally, I use the XHR object to make requests to the server and return the responses as JSON. Using JSON is much more compact and easier than trying to do all of your communication in XML.
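     On the PHP side, emitting JSON is a one-liner with json_encode (assumes PHP 5.2+); the payload fields below are invented for illustration:

     <?php
     // Build a response array and emit it as JSON for the XHR handler.
     $response = array(
         'done'    => 1,
         'message' => 'Processing complete'
     );
     header('Content-type: application/json');
     echo json_encode($response);
     exit();
     ?>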
  23. I would ask your client for as many house descriptions as possible and look for the common trends. You will likely find certain features are common to all houses and certain features are common to certain types of houses. That will give you a better idea of how you should organize your data. You can't possibly develop a sufficient database if you don't know anything about the data you'll be storing.
  24. Our development server is a VPS. Something on the server recently caused it to run out of memory, although I'm not sure what. Our host had to restart the machine for us. I can run a command that shows me my current memory usage as well as the most recent maximum usage, but is there anything I can use to track down which process ate up all the memory?