mryno Posted February 13, 2009

I have an array of results from MySQL. I want to loop through them until one of the elements matches part of the URL requesting the page. My URL is: www.mypage.com/something/test1/index.php

For example:
$results[0]['id'] = 'test1'
$results[1]['id'] = 'test2'
$results[2]['id'] = 'test3'

When I'm going through the results, I want to stop as soon as I find the record that matches (in case there are a lot). Or if you know a better way, that would be cool, too. Right now I'm just foreach'ing through the results, and when I find a match using strpos, I set that as the one I want. I'd just rather not foreach through potentially thousands of records. Thanks for the help!

Ryan
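For reference, a minimal sketch of the approach described above. The URL and the shape of $results are assumed for illustration; the break is what stops the scan at the first match:

```php
<?php
// Sketch of the foreach/strpos approach described above.
// The URL and the shape of $results are assumed, not real data.
$url = 'www.mypage.com/something/test1/index.php';
$results = array(
    array('id' => 'test1'),
    array('id' => 'test2'),
    array('id' => 'test3'),
);

$match = null;
foreach ($results as $row) {
    if (strpos($url, $row['id']) !== false) {
        $match = $row['id'];
        break; // stop scanning as soon as the first match is found
    }
}
echo $match; // test1
```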
rhodesa Posted February 13, 2009

So, why don't you get the string 'test1' out of the URL first, then use it in the WHERE clause of your MySQL query?
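A minimal sketch of that suggestion, assuming the segment of interest is the last directory in the path. The table name 'sites' is hypothetical, and addslashes stands in here for mysql_real_escape_string, which needs an open connection:

```php
<?php
// Pull the segment out of the URL first, then filter in SQL.
// 'sites' is a hypothetical table name; in real code, escape with
// mysql_real_escape_string() once a connection is open.
$url  = 'http://www.mypage.com/something/test1/index.php';
$path = parse_url($url, PHP_URL_PATH);            // '/something/test1/index.php'
$dirs = explode('/', trim(dirname($path), '/'));  // array('something', 'test1')
$id   = addslashes(end($dirs));                   // 'test1'

$sql = "SELECT * FROM sites WHERE id = '$id'";
echo $sql; // SELECT * FROM sites WHERE id = 'test1'
```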
mryno Posted February 13, 2009 (Author)

The part that I'm looking for could appear in different places. For example:

www.mysite.com/something/test1/index.php
www.othersite.com/test1/index.php
...

The possibilities are pretty limitless based on the client and the way they want to set up their site. I'm just trying to make it as flexible as possible. Again, if you know a better way to do that, or if maybe it's better to parse the URL and search on those pieces, let me know. I'm not an expert on efficiency yet. :-)
allworknoplay Posted February 13, 2009

That's probably how I would do it too. What about the "strstr" function? It seems to provide the same kind of functionality. I don't know which one would be "faster," though.
mryno Posted February 13, 2009 (Author)

You would parse the URL? The only problem is I would have to foreach and then query, right?

// Option A: parse the URL to get the subdirectories into the array $urlpieces
foreach ($urlpieces as $piece) {
    // Query each piece to see if it matches any SiteIDs in the DB
}

// Option B: run one query to get all SiteIDs as $results (1 fast query)
foreach ($results as $result) {
    // Match against the URL as a string using strpos or something similar
}

Is it faster to query multiple times (probably a maximum of 5), or is it faster to query once (getting possibly hundreds of records) and have PHP go through them?
allworknoplay Posted February 13, 2009

I would have done the latter: get all the info out of MySQL and let PHP parse through it. Perhaps it's just my fear of overloading MySQL too much; I never like MySQL doing the heavy queries. In fact, you can even have MySQL do the querying and matching with its built-in LIKE operator, but that would put more load on the DB.
corbin Posted February 13, 2009

Uh, usually it's better to let MySQL do the grunt work. After all, MySQL is designed to do database stuff. (Not aimed at the OP at all:) Something taking longer than expected in MySQL is usually a sign of bad indexing or poor design.
mryno Posted February 13, 2009 (Author)

That is my thought also. I try to keep the workload on PHP. A query within a foreach just seems unnatural to me. I have used it once or twice, but there are definitely TODO comments right next to them.

Good point, corbin. If I could use LIKE or something like that, it would be worth looking into. I think in this specific case, since there are a bunch of unknowns and I won't have a lot of control over what people put in their URLs, I'd better put the query in the hands of the known variables. I agree, though: it's usually better and faster to let MySQL handle it if you can get it into a query. Thanks for the help!
allworknoplay Posted February 13, 2009

Well, like I said, it's just my fear of MySQL. I just feel comfortable having PHP do the looping and parsing. Perhaps once I become MySQL certified, my fear will go away the more I understand it.
corbin Posted February 13, 2009

I guess I just don't understand that fear. MySQL is made to manipulate databases, so I let it do what it's made to do.
allworknoplay Posted February 13, 2009

That would seem like it would kill the DB if you had a lot to loop through. I mean, I've had my share of hammering MySQL, so of course you can get away with small tables and putting queries in a loop. But it would seem to make more logical sense to get all the info out first, put it in an array, and loop through the data with PHP.
allworknoplay Posted February 13, 2009

Yeah, I just like to use MySQL for simple queries. For example, can you do math functions within MySQL? Of course. Would I want to? Eh, I don't know; I'd probably pull the data out and have PHP do the calculations. Has anyone really done a benchmark to see which has better performance? I'm sure someone has.
corbin Posted February 13, 2009

In that specific case, it would depend on how large the result set is. What's better: pulling 1,000,000 rows, or doing 100 queries that pull 1-3 rows apiece? Obvious in that case. I don't see why one would ever need to do a query in a loop; the only thing I can think of is some kind of hierarchical data.

Edit: Doing the math outside of MySQL would make sense if you plan on using all of those rows. If you need to decide which rows to pull by doing math on them, then I would do that in MySQL. For example, I would run this in MySQL:

SELECT * FROM posts WHERE post_time > DATE_SUB(NOW(), INTERVAL 30 DAY);
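If the cutoff were instead computed on the PHP side, it might look like the sketch below ('posts' and 'post_time' are just the example names from the query above). Either way, the filtering itself still happens in MySQL, which is the point:

```php
<?php
// Computing the 30-day cutoff in PHP and passing it into the query,
// instead of letting MySQL evaluate DATE_SUB(NOW(), INTERVAL 30 DAY).
// 'posts' and 'post_time' are example names, not a real schema.
$cutoff = date('Y-m-d H:i:s', strtotime('-30 days'));
$sql = "SELECT * FROM posts WHERE post_time > '$cutoff'";
echo $sql;
// The WHERE clause still runs in MySQL either way; what you want to
// avoid is pulling every row into PHP just to discard the old ones.
```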
mryno Posted February 13, 2009 (Author)

Absolutely true. I'm not even to beta yet, so it just has to work with a very small load now for demos, but I'm optimizing what I can so that when there is a load I don't have to worry about hitting the DB so often. Most of that is my lack of MySQL skills on those 4th and 5th joins...

The issue with doing it your way, corbin, is simply that who's to say a client's URL isn't going to be www.clientsite.com/something/test1/test2/index.php, or that one of their directory names won't contain the term I'm searching for? There are just too many uncontrollables on my part. But I do agree with you in general that MySQL should be used rather than PHP to go through records. There is always a way with MySQL! Some of the data I've had to deal with IS hierarchical, which doesn't help my lack of skills...
corbin Posted February 13, 2009

So what happens when you have 5000 rows being processed in PHP and you're getting 1000 concurrent requests at a time? That will slow Apache way down, depending on how the rows are being processed. Also, there is overhead in getting data from MySQL to PHP; it doesn't just magically jump there.
allworknoplay Posted February 13, 2009

I like that query; I would do that in MySQL as well. I also like to do timestamp comparisons in MySQL. I guess it really depends on the project scope.
allworknoplay Posted February 13, 2009

I don't know how big your DB will get, but make sure you design your tables properly as well, using correct column types, etc. Things like that can really add up when you have millions of records.
allworknoplay Posted February 13, 2009

Good point.

1) Use a load balancer! ;D
2) You can also split up your DB into reads and writes, so that one MySQL node is used for reading and another for writing, but that is going outside the scope of this post.
3) The real answer is probably super-complex joins and a mixture of clean PHP coding...
mryno Posted February 13, 2009 (Author)

I edited my previous post, but I didn't make it in time; the addition about client URLs is above.

In my experience with large-scale (hundreds of millions) record sets, it is beyond a doubt faster to use MySQL. It probably doesn't matter as much (isn't noticeably faster to the end user) with smaller sets (hundreds to thousands) of records. MySQL is still faster on the smaller sets, but what is a tenth of a second?
rhodesa Posted February 13, 2009

No, I would do it like so. It's all one statement to build the SQL, but I separated it onto multiple lines so I could comment each part. Remember to read it from the innermost call outward, as that is the order the parts are processed in. The only part you may want to tweak is step 2; I'm not sure if you wanted the filename or just the directories in the search:

$sql = sprintf("SELECT * FROM tableName WHERE id IN ('%s')", // 7. Puts the list into the SQL statement
  implode("','",                               // 6. Joins the parts back together with quotes and commas
    array_map('mysql_real_escape_string',      // 5. Runs everything through mysql_real_escape_string to be safe
      explode('/',                             // 4. Splits the path on the slashes
        substr(                                // 3. Removes the leading slash
          dirname(                             // 2. Strips off the file, so it's just the folder path
            parse_url($_SERVER['PHP_SELF'], PHP_URL_PATH) // 1. Gets the path of the script from the URL
          ), 1
        )
      )
    )
  )
);
echo $sql;

Edit: the URL http://www.mypage.com/something/test1/index.php would produce:

SELECT * FROM tableName WHERE id IN ('something','test1')
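For what it's worth, a sketch of the same idea using PDO placeholders instead of mysql_real_escape_string. Only the SQL-building part is shown runnable; the commented prepare/execute lines assume a hypothetical $pdo handle, and 'tableName' is the example name from the post above:

```php
<?php
// Same idea, but with one ? placeholder per path segment so the driver
// handles escaping. 'tableName' is the example name from the post above;
// the commented prepare/execute lines assume a hypothetical $pdo handle.
$path  = parse_url('http://www.mypage.com/something/test1/index.php', PHP_URL_PATH);
$parts = explode('/', substr(dirname($path), 1)); // array('something', 'test1')

// Build one ? per segment: IN (?,?)
$placeholders = implode(',', array_fill(0, count($parts), '?'));
$sql = "SELECT * FROM tableName WHERE id IN ($placeholders)";
// $stmt = $pdo->prepare($sql);
// $stmt->execute($parts);
echo $sql; // SELECT * FROM tableName WHERE id IN (?,?)
```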
mryno Posted February 13, 2009 (Author)

Consider my world rocked! Very simple and elegant. That would have taken me forever. Great code, rhodesa.
rhodesa Posted February 13, 2009

Normally, I tend to stay away from dynamic lists in queries, since a lot of values in a WHERE IN () clause can be slow. But since it's a URL, I assume it won't ever get over a handful of values.
corbin Posted February 13, 2009

Yeah, I think I mentioned an IN() clause earlier, but rhodesa covered it much better ;p.
mryno Posted February 13, 2009 (Author)

Props are definitely in order. I think I just wasn't on the same page when you mentioned it. Nailed it!
rhodesa Posted February 13, 2009

corbin... always trying to steal some of my thunder...