kael.shipman Posted June 1, 2007

Hi all,

I've been wondering this for a while and have searched everywhere for it: I really like to break my code up into files that are less than a million lines long, so I've divided my site into "switch" files (so called because they redirect the path of execution like an old telephone switchboard). I figure this could win out in the end because PHP doesn't parse include files that aren't relevant to the current script path (i.e., it won't parse "this.php" in this case: if (1 == 0) { include 'this.php'; }), so it can save a lot of parsing. At the same time, I don't know how the include mechanism works internally, and I don't want to burden the server unnecessarily by putting every word in a separate file. What I'm looking for is a reasonable rule for the optimum balance between the number of includes and file length.

Here's my example (please ignore any violations of best practices, as this is a speedily-contrived example):

```php
//INDEX.PHP
//STATIC CONTENT FOR ENTIRE SITE
$s = isset($_GET['s']) ? $_GET['s'] : 1;

include 'includes/header.php';
if ($s == 1) {
    //Please ignore the fact that I could use an array with file names in it
    include 'includes/sections/sec1.php';
} elseif ($s == 2) {
    include 'includes/sections/sec2.php';
} elseif ($s == 3) {
    ....
}
include 'includes/footer.php';
```

```php
//INCLUDES/SECTIONS/SEC1.PHP
//STATIC CONTENT FOR SECTION
$p = isset($_GET['p']) ? $_GET['p'] : 1;

if ($p == 1) {
    //Include appropriate page file
    include 'includes/pgs/pg1.php';
} elseif ($p == 2) {
    include 'includes/pgs/pg2.php';
} elseif ($p == 3) {
    ....
} else {
    include 'includes/error.php';
}
```

```php
//INCLUDES/PGS/PG1.PHP
//STATIC CONTENT FOR PAGE
$subSec = isset($_GET['subSec']) ? $_GET['subSec'] : 0;

if ($subSec == 1) {
    //Include appropriate subsection file
    include 'includes/subSecs/ss1.php';
} elseif ($subSec == 2) {
    include 'includes/subSecs/ss2.php';
} elseif ($subSec == 3) {
    ....
} else {
    include 'includes/error.php';
}
```

In this case, I could have all of that set apart in if statements in one file, but then PHP would have to parse all of it while executing only about 15% of it. As it stands, each file is only about 10 lines long, but PHP has to dig deep into the include nest to find everything. To reiterate: I figured that using 95% of all parsed code in the form of an include nest would be better than using 15% of all parsed code in the form of one large file, but I was uncertain whether the overhead of including files would eventually (think really big here, like 20 includes deep and several hundred files lying around) overtake the efficiency gained.

Any suggestions?

Thanks,
Kael

Link to comment: https://forums.phpfreaks.com/topic/53871-efficiency-of-multiple-includes/
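For what it's worth, the array-with-file-names variant that the comment in the example waves away would look something like this. This is only a sketch: the `$sections` map and its paths are hypothetical stand-ins matching the example above, and the include cost is the same either way, since only the chosen file is parsed.

```php
<?php
// Dispatch-table variant of INDEX.PHP: map section numbers to files
// instead of an if/elseif chain. Paths are placeholders from the example.
$sections = array(
    1 => 'includes/sections/sec1.php',
    2 => 'includes/sections/sec2.php',
    3 => 'includes/sections/sec3.php',
);

$s = isset($_GET['s']) ? (int) $_GET['s'] : 1;

// Fall back to the error page for unknown section numbers.
$file = isset($sections[$s]) ? $sections[$s] : 'includes/error.php';

include 'includes/header.php';
include $file;
include 'includes/footer.php';
?>
```

The win here is maintainability rather than speed: adding a section is one array entry instead of another elseif branch.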
dustinnoe Posted June 1, 2007

Have you thought about using microtime()? Just grab the time at the beginning of the script and again when it completes. Try this with both schemes and see which is faster. You can even grab more times at key points in your script if you really want to see what's happening. I assume you are looking for improvement in execution speed. Hope that helps!
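A minimal harness along these lines might look like the following. The included file name is a hypothetical stand-in; you'd run it once pointing at the one-big-file version and once pointing at the include-nest version of the same page.

```php
<?php
// Timing sketch using microtime(): wrap the work you want to measure.
// 'page_under_test.php' is a placeholder for either scheme being compared.
$start = microtime(true);           // float seconds since the Unix epoch

include 'page_under_test.php';

$elapsed = microtime(true) - $start;
printf("Page built in %.4f seconds\n", $elapsed);
?>
```

For numbers this small, it helps to run the page many times and average, since a single request is dominated by noise from the filesystem cache.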
emehrkay Posted June 1, 2007

i like your approach to including files as needed. i, and many others, do the same thing. you are right to say that using 95% of the files included through this method is better than 15% of all files included. If I'm not mistaken, include/require is faster than include_once/require_once.
kael.shipman (Author) Posted June 1, 2007

Thanks for the quick reply! The include scheme certainly does make code cleaner. It's a huge headache trying to look through a jumbled 1500-line file for the right comment or code block. Then again, with too many includes, you start needing an integrated environment for even basic management.... Trade-offs abound! I think I'll stick with the includes, though.

With regard to dustinnoe: I haven't tried timing anything yet, mainly because I'd need practically a production-scale example for it. I think it's a good idea, but I'm going to have to wait until I have time to build a big example using both schemes. I was just wondering if there was something someone knew about include() that maybe I didn't (like whether it has a disproportionately high overhead).

Thanks again!
kael.shipman (Author) Posted June 2, 2007

Actually, while we're on it, I just thought up a better example: I had this idea for a completely modular function catalog (perhaps it's not an original idea, but I learned to code "on the street", so I don't know any of the cool little tips and tricks). I know that having a code library is useful, so I tried putting all my functions into a central location so I could reference them easily. I found, however, that I ended up essentially opening a library file, taking a code snippet from it, then making a new file with that. Instead of keeping my library organized that way, I was thinking maybe I could have, as I mentioned earlier, a modular library where EVERY function is contained in its own file in a central "functions" folder. Then when I build a page, instead of making a custom file from code snippets, all I'd have to do is lay down a list of includes for each function I wanted to pull into the page's function catalog. Example:

```php
<?php
//functions.php (included by index.php)
@include 'includes/functions/radStr.php';
@include 'includes/functions/mergeArrayFull.php';
@include 'includes/functions/sql_query.php';
@include 'includes/functions/lnkUrl.php';
.... etc.
?>
```

Do you suppose that might be going overboard with the includes concept, considering there might be as many as a hundred functions included? The burden of writing the file could be avoided by simply making a (local) script that lists the functions in your functions directory with checkboxes, so you could manage the file that way and not even have to worry about the extreme carpal tunnel you'd get from typing all that. That, too, might cut down on wasted parsing, since a lot of the time I find I'm including my 400-line functions file to gain the use of TWO functions. I do it anyway so that when I change the functions, I don't have to do it in 20 places.
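One way to sketch that one-file-per-function loading without hand-writing the include list: drive it from a per-page array of function names. This assumes the convention from the example above (each file in includes/functions/ is named after the single function it defines); the `$needed` list is hypothetical.

```php
<?php
// Per-page function loader sketch: include only the function files
// this page actually uses. Assumes each file defines one function
// and is named after it, e.g. includes/functions/radStr.php.
$needed = array('radStr', 'sql_query');   // functions this page uses

foreach ($needed as $fn) {
    // function_exists() guards against double definition, giving the
    // safety of include_once without its extra bookkeeping.
    if (!function_exists($fn)) {
        include 'includes/functions/' . $fn . '.php';
    }
}
?>
```

The same loop could also power the checkbox management script: scandir() the functions folder to build the candidate list, then write out the chosen names.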
Thanks again,
Kael